Update Modin* name (#2295)
Commit message (each change: Update Modin* name by Stefana Raileanu):

* Update README.md
* Update README.md
* Update sample.json
* Update README.md
* Update sample.json
raistefintel authored May 2, 2024
1 parent 9fa0a0f commit ccd57a1
Showing 5 changed files with 18 additions and 18 deletions.
@@ -1,11 +1,11 @@
-# Modin Get Started Sample
+# Modin* Get Started Sample
 
-The `Modin Getting Started` sample demonstrates how to use distributed Pandas using the Modin package.
+The `Modin* Getting Started` sample demonstrates how to use distributed Pandas using the Modin package.
 
 | Area | Description
 | :--- | :---
 | Category | Getting Started
-| What you will learn | Basic Modin programming model for Intel processors
+| What you will learn | Basic Modin* programming model for Intel processors
 | Time to complete | 5 to 8 minutes
 
 ## Purpose
@@ -52,7 +52,7 @@ This get started sample code is implemented for CPU using the Python language. T
 conda install ipykernel
 python -m ipykernel install --user --name usr_modin
 ```
-## Run the `Modin Get Started` Sample
+## Run the `Modin* Get Started` Sample
 
 You can run the Jupyter notebook with the sample code on your local server or download the sample code from the notebook as a Python file and run it locally.
 
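The hunk above is about the sample's name, but the point of the sample itself is that Modin* is a drop-in replacement for the pandas API. A minimal sketch of that idea, using stock pandas so it runs without Modin* installed (with Modin* available you would change only the import line; the data here is illustrative, not from the sample):

```python
# Modin* is designed as a drop-in pandas replacement: only the import changes.
# With Modin* installed you would write `import modin.pandas as pd` instead.
import pandas as pd

df = pd.DataFrame({"a": range(5), "b": range(5, 10)})

# Every subsequent call is unchanged pandas API.
total = df["a"].sum() + df["b"].mean()
print(total)  # sum(0..4) = 10, plus mean(5..9) = 7.0, giving 17.0
```

Because the API surface is the same, the notebook's code can be tested against either backend by flipping that one import.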
@@ -1,8 +1,8 @@
 {
 "guid": "AE280EFE-9EB1-406D-B32D-5991F707E195",
-"name": "Intel® Distribution of Modin* Getting Started",
+"name": "Modin* Getting Started",
 "categories": ["Toolkit/oneAPI AI And Analytics/Getting Started"],
-"description": "This sample illustrates how to use Modin accelerated Pandas functions and notes the performance gain when compared to standard Pandas functions",
+"description": "This sample illustrates how to use Modin* accelerated Pandas functions and notes the performance gain when compared to standard Pandas functions",
 "builder": ["cli"],
 "languages": [{"python":{}}],
 "os":["linux"],
@@ -19,7 +19,7 @@
 "conda activate intel-aikit-modin",
 "pip install -r requirements.txt # Installing notebook's dependencies",
 "pip install runipy # Installing 'runipy' for extended abilities to execute the notebook",
-"runipy Modin_GettingStarted.ipynb # Test 'Modin is faster than pandas' case",
+"runipy Modin_GettingStarted.ipynb # Test 'Modin* is faster than pandas' case",
 "MODIN_CPUS=1 runipy Modin_GettingStarted.ipynb # Test 'Modin is slower than pandas' case"
 ]
 }
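The `MODIN_CPUS=1` command in the hunk above drives the "Modin is slower than pandas" case by capping Modin* to a single CPU. `MODIN_CPUS` is a standard Modin* configuration environment variable; the sketch below shows setting it from Python rather than the shell (an illustrative alternative, not part of the sample), which must happen before Modin* is first imported to take effect:

```python
import os

# Cap Modin* at one CPU, reproducing the single-core comparison case above.
# This must be set before `import modin.pandas` for Modin* to pick it up.
os.environ["MODIN_CPUS"] = "1"

print(os.environ["MODIN_CPUS"])  # 1
```

Prefixing the variable on the command line, as the `sample.json` does, achieves the same thing for a whole notebook run.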
@@ -1,11 +1,11 @@
-# Modin Vs. Pandas Performance Sample
+# Modin* Vs. Pandas Performance Sample
 
-The `Modin Vs. Pandas Performance` code illustrates how to use Modin* to replace the Pandas API. The sample compares the performance of Modin and the performance of Pandas for specific dataframe operations.
+The `Modin* Vs. Pandas Performance` code illustrates how to use Modin* to replace the Pandas API. The sample compares the performance of Modin* and the performance of Pandas for specific dataframe operations.
 
 | Area | Description
 |:--- |:---
 | Category | Concepts and Functionality
-| What you will learn | How to accelerate the Pandas API using Modin.
+| What you will learn | How to accelerate the Pandas API using Modin*.
 | Time to complete | Less than 10 minutes
 
 ## Purpose
@@ -19,17 +19,17 @@ You can run the sample locally or in Google Colaboratory (Colab).
 |:--- |:---
 | OS | Ubuntu* 20.04 (or newer)
 | Hardware | Intel® Core™ Gen10 Processor <br> Intel® Xeon® Scalable Performance processors
-| Software | Intel® Distribution of Modin*
+| Software | Modin*
 
 ## Key Implementation Details
 
-This code sample is implemented for CPU using Python programming language. The sample requires NumPy, Pandas, Modin libraries, and the time module in Python.
+This code sample is implemented for CPU using Python programming language. The sample requires NumPy, Pandas, Modin* libraries, and the time module in Python.
 
 ## Environment Setup
 
 If you want to run the sample on a local system using a command-line interface (CLI), you must install the Modin in a new Conda* environment first.
 
-### Install Modin
+### Install Modin*
 
 1. Create a Conda environment.
 ```
@@ -65,7 +65,7 @@ If you want to run the sample on a local system using a command-line interface (
 ipython Modin_Vs_Pandas.ipynb
 ```
 
-## Run the `Modin Vs Pandas Performance` Sample in Google Colaboratory
+## Run the `Modin* Vs Pandas Performance` Sample in Google Colaboratory
 
 1. Change to the directory containing the `Modin_Vs_Pandas.ipynb` notebook file on your local system.
 
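The comparison this README describes boils down to timing the same dataframe operation with the `time` module under each backend. A sketch of that measurement pattern, using stock pandas so it runs without Modin* installed (in the sample, only the import line would differ; the data size and operation here are illustrative assumptions, not taken from the notebook):

```python
import time
import pandas as pd  # the sample swaps this for `import modin.pandas as pd`

df = pd.DataFrame({"x": range(100_000)})

start = time.perf_counter()
df["y"] = df["x"] * 2          # the dataframe operation being benchmarked
elapsed = time.perf_counter() - start

print(f"elementwise multiply took {elapsed:.4f}s")
print(df["y"].iloc[:3].tolist())  # [0, 2, 4]
```

Running the identical block under both imports and comparing the two `elapsed` values is the whole methodology of the performance sample.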
@@ -1,8 +1,8 @@
 {
 "guid": "FE479C5C-C7A0-4612-B8D0-F83D07155411",
-"name": "Intel® Modin Vs. Pandas Performance",
+"name": "Modin* Vs. Pandas Performance",
 "categories": ["Toolkit/oneAPI AI And Analytics/Getting Started"],
-"description": "This sample code illustrates how Intel® Modin accelerates the performance of Pandas for computational operations on a dataframe.",
+"description": "This sample code illustrates how Modin* accelerates the performance of Pandas for computational operations on a dataframe.",
 "builder": ["cli"],
 "languages": [{
 "python": {}
4 changes: 2 additions & 2 deletions AI-and-Analytics/Getting-Started-Samples/README.md
@@ -18,8 +18,8 @@ Third party program Licenses can be found here: [third-party-programs.txt](https
 |--------------------------| --------- | ------------------------------------------------ | -
 |Inference Optimization| Intel® Neural Compressor (INC) | [Intel® Neural Compressor (INC) Sample-for-PyTorch](INC-Quantization-Sample-for-PyTorch) | Performs INT8 quantization on a Hugging Face BERT model.
 |Inference Optimization| Intel® Neural Compressor (INC) | [Intel® Neural Compressor (INC) Sample-for-Tensorflow](INC-Sample-for-Tensorflow) | Quantizes a FP32 model into INT8 by Intel® Neural Compressor (INC) and compares the performance between FP32 and INT8.
-|Data Analytics <br/> Classical Machine Learning | Modin | [Modin_GettingStarted](Modin_GettingStarted) | Run Modin-accelerated Pandas functions and note the performance gain.
-|Data Analytics <br/> Classical Machine Learning | Modin |[Modin_Vs_Pandas](Modin_Vs_Pandas)| Compares the performance of Intel® Distribution of Modin* and the performance of Pandas.
+|Data Analytics <br/> Classical Machine Learning | Modin* | [Modin_GettingStarted](Modin_GettingStarted) | Run Modin*-accelerated Pandas functions and note the performance gain.
+|Data Analytics <br/> Classical Machine Learning | Modin* |[Modin_Vs_Pandas](Modin_Vs_Pandas)| Compares the performance of Intel® Distribution of Modin* and the performance of Pandas.
 |Classical Machine Learning| Intel® Optimization for XGBoost* | [IntelPython_XGBoost_GettingStarted](IntelPython_XGBoost_GettingStarted) | Set up and trains an XGBoost* model on datasets for prediction.
 |Classical Machine Learning| daal4py | [IntelPython_daal4py_GettingStarted](IntelPython_daal4py_GettingStarted) | Batch linear regression using the Python API package daal4py from oneAPI Data Analytics Library (oneDAL).
 |Deep Learning <br/> Inference Optimization| Intel® Optimization for TensorFlow* | [IntelTensorFlow_GettingStarted](IntelTensorFlow_GettingStarted) | A simple training example for TensorFlow.
