- Remove any previous Azure ML CLI extension installations
az extension remove -n ml
az extension remove -n azure-cli-ml
- Install the latest Azure ML CLI extension, which is in public preview, and then verify the installation
az extension add -n ml
az ml -h
- Let's set some defaults for all subsequent "az ml" CLI commands
az account set --subscription <subscription id>
az configure --defaults workspace=<azureml workspace name> group=<resource group>
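For example, with the placeholders filled in (the subscription ID, workspace, and resource group names below are illustrative only, substitute your own):
az account set --subscription "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
az configure --defaults workspace=my-aml-workspace group=my-resource-group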
- For this simple deployment flow, we have the following project directory structure:
simple-flow
|-- model
|   |-- conda.yml
|   |-- sklearn_mnist_model.pkl
|-- script
|   |-- score.py
|-- blue-deployment.yml
|-- endpoint.yml
|-- sample_request.json
As you can see above, the "model" directory contains the model and the Conda environment definition, and "score.py" sits under the "script" directory. At the top level, we have the endpoint and blue deployment YAML definitions and a sample request JSON file. In general, this is a typical project setup for Azure Arc enabled ML model deployment.
Now let's see the simple deployment flow in action!
- Clone the preview GitHub repo and switch to the simple-flow directory
git clone https://github.com/Azure/AML-Kubernetes.git
cd AML-Kubernetes/examples/inference/simple-flow
- Modify the endpoint YAML file to replace "<your compute target name>" with your own compute target name and "<your instance type>" with an instance type defined in your compute configuration. Then create an endpoint with a blue deployment using the CLI commands below; endpoint creation and deployment might take a few minutes.
Note that the resource requirements (CPU, memory, GPU) defined in the endpoint YAML should not exceed the resource limits of the specified instance type.
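If you prefer to script the substitution instead of editing the file by hand, a quick sed pass works. This is only a sketch: the compute target name and instance type below are hypothetical placeholders, and the commands use GNU sed's in-place syntax.
# Hypothetical values; replace with your own compute target and instance type (GNU sed syntax)
sed -i 's|<your compute target name>|my-k8s-compute|g' endpoint.yml
sed -i 's|<your instance type>|myinstancetype|g' endpoint.yml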
- Create endpoint
az ml online-endpoint create --name sklearn-mnist -f endpoint.yml
- Check status of endpoint
az ml online-endpoint show -n sklearn-mnist
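If you only want the provisioning state rather than the full JSON output, the show command accepts a JMESPath --query. The property name used here (provisioning_state) is based on typical v2 CLI output and may differ across extension versions:
az ml online-endpoint show -n sklearn-mnist --query provisioning_state -o tsv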
- Create blue deployment
az ml online-deployment create --name blue --endpoint sklearn-mnist -f blue-deployment.yml --all-traffic
- Check status of blue deployment
az ml online-deployment show --name blue --endpoint sklearn-mnist
- Test the endpoint with a scoring request
az ml online-endpoint invoke -n sklearn-mnist -r sample_request.json
You can also send a scoring request using cURL.
- Obtain a token/keys for the scoring endpoint
az ml online-endpoint get-credentials -n sklearn-mnist
- Obtain the scoring_uri of the endpoint
az ml online-endpoint show -n sklearn-mnist
- Score using the token/key obtained above
curl -v -i -X POST -H "Content-Type:application/json" -H "Authorization: Bearer <key_or_token>" -d '<sample_data>' <scoring_uri>
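Putting the cURL steps together, a minimal end-to-end sketch looks like the following. It assumes key-based authentication and that the output property names (primaryKey, scoring_uri) match your extension version; adjust the --query paths if your output differs.
# Assumes key auth; property names are based on typical CLI output and may vary
KEY=$(az ml online-endpoint get-credentials -n sklearn-mnist --query primaryKey -o tsv)
SCORING_URI=$(az ml online-endpoint show -n sklearn-mnist --query scoring_uri -o tsv)
curl -v -X POST -H "Content-Type: application/json" -H "Authorization: Bearer $KEY" -d @sample_request.json "$SCORING_URI"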
That is it! You have successfully deployed an image classification model and scored the model with a request.
- Get logs
az ml online-deployment get-logs --name blue --endpoint sklearn-mnist
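If the deployment fails before the scoring container starts, the storage initializer logs are often more informative. The --lines and --container parameters shown here are assumed to be available in your extension version:
az ml online-deployment get-logs --name blue --endpoint sklearn-mnist --lines 100 --container storage-initializer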
- Delete endpoint
az ml online-endpoint delete -n sklearn-mnist
- Deploy a model using a custom container with a built-in model or entry script. In this case, the model and the entry script are not stored in the cloud, but kept locally.
- To learn more about Azure ML endpoint and deployment concepts, please check Managed Online Endpoints.
- Additional Examples