Deploying an ML service to Azure Container Service (AKS)

ML API as a web service

In this setup we need to create two resource groups using the Machine Learning Workbench CLI and the Azure portal itself. At the end of the deployment we will end up with the resource structure given below. The first resource group, SGresourcegrpML (maintaining a naming convention is preferred, as there are many resource items), contains the account, model management and storage, which are named accordingly.

Resource Groups
ML components

The second resource group, sgresourcegrpml-azureml-82e58, contains the Kubernetes cluster, comprising 1 master and 3 agent nodes. These 4 master/agent nodes are basically Linux VMs with their respective storage, virtual networks and availability sets configured.

Kubernetes Resources

Step 1

Create a new project in ML Workbench or choose an existing one. In our case we have selected the default iris dataset. Open a command prompt from the Workbench by selecting File >> Open Command Prompt (with the Azure CLI logged in) and paste the following command:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

az ml env setup --cluster -n <ENVIRONMENT_NAME> -l <AZURE_REGION e.g. eastus2> [-g <RESOURCE_GROUP>]

Eg:

az ml env setup --cluster -n sgenvml -l eastus2 -g sgresourcegrpml

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Respond with NO to the question "Reuse storage and ACR (Y/n)?". This sets up a new AKS cluster with Kubernetes as the orchestrator, using the given environment name and resource group. The cluster environment setup command creates the following resources in our subscription.

**Note: This process of cluster formation is quite time consuming and can take up to 15-20 minutes. Meanwhile you can check the status of the ongoing cluster provisioning with the command

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

az ml env show -n <ENVIRONMENT_NAME> -g <RESOURCE_GROUP>

Eg:

az ml env show -n sgenvml -g sgresourcegrpml

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —
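If you do not want to keep re-running the show command by hand, below is a minimal Python sketch that polls it until the output reports "Succeeded" (the provisioning state checked in Step 2). The environment and resource group names are the example values from this post; replace them with your own.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

import subprocess
import time

# Example names from this post; replace with your own environment/resource group.
ENVIRONMENT_NAME = "sgenvml"
RESOURCE_GROUP = "sgresourcegrpml"

while True:
    # Re-run the same "az ml env show" command described above and capture its output.
    cmd = f"az ml env show -n {ENVIRONMENT_NAME} -g {RESOURCE_GROUP}"
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    output = result.stdout
    print(output)
    if "Succeeded" in output:
        print("Cluster provisioning finished.")
        break
    if "Failed" in output:
        raise RuntimeError("Cluster provisioning failed; check the Azure portal.")
    time.sleep(60)  # wait a minute before checking again

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —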

Step 2

Now we need to check whether the value "Provisioning State" has changed from "Creating" to "Succeeded". Once the status has changed, set the environment with

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

az ml env set -n <ENVIRONMENT_NAME> -g <RESOURCE_GROUP>

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Step 3

Set up a model management account with 1 instance.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

az ml account modelmanagement create -l <AZURE_REGION e.g. eastus2> -n <ACCOUNT_NAME> -g <RESOURCE_GROUP> --sku-instances <NUMBER_OF_INSTANCES, e.g. 1> --sku-name <PRICING_TIER for example S1>

Eg:

az ml account modelmanagement create -l eastus2 -n sgaccountml -g sgresourcegrpml --sku-instances 1 --sku-name S1

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

**Note: we have used the S1 pricing tier here. This can be scaled according to the requirement.

Step 4

Set the model management account we have just created as the active account:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

az ml account modelmanagement set -n <ACCOUNT_NAME> -g <RESOURCE_GROUP>

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

Step 5

The most important step is to deploy the saved model as a web service. To do so, we execute the command below:

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

az ml service create realtime --model-file <MODEL_FILE_RELATIVE_PATH> -f <SCORING_FILE e.g. score.py> -n <SERVICE_NAME> -s <SCHEMA_FILE e.g. service_schema.json> -r <DOCKER_RUNTIME e.g. spark-py or python>

Eg:

az ml service create realtime -f score_iris.py --model-file model.pkl -s service_schema.json -n sgprojectml -r python -c aml_config\conda_dependencies.yml

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

There are a couple of errors that can occur in this step while creating the service.

1. Unable to find the path of the model.pkl file and service_schema.json.

model.pkl is generated in the ./output folder of the Workbench. We need to copy the downloaded model.pkl file and paste it into the project root folder.

Similarly, running score_iris.py generates service_schema.json. Once both files are generated, copy them into the working folder and run the Step 5 command again.
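For reference, below is a minimal sketch of what such a scoring file can look like. The init/run structure and the sample dataframe mirror the iris example used throughout this post; the schema-generation imports come from the Workbench-era Azure ML SDK and their exact module paths may vary between SDK versions, so treat them as an assumption to verify against your own installation.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

# Minimal sketch of a scoring file (score_iris.py) for the Workbench-era realtime service.
import json
import pandas as pd
from sklearn.externals import joblib  # older scikit-learn; newer versions use "import joblib"

def init():
    # Called once when the web service container starts: load the saved model
    global model
    model = joblib.load('model.pkl')

def run(input_df):
    # Called per request: score the incoming dataframe and return the prediction as JSON
    pred = model.predict(input_df)
    return json.dumps(str(pred[0]))

def main():
    # Generate service_schema.json from a sample input dataframe.
    # These imports are from the azure-ml-api-sdk used by ML Workbench;
    # the package/module names are an assumption to check locally.
    from azureml.api.schema.dataTypes import DataTypes
    from azureml.api.schema.sampleDefinition import SampleDefinition
    from azureml.api.realtime.services import generate_schema

    df = pd.DataFrame(
        data=[[3.0, 3.6, 1.3, 0.25]],
        columns=['sepal length', 'sepal width', 'petal length', 'petal width'])
    inputs = {"input_df": SampleDefinition(DataTypes.PANDAS, df)}
    generate_schema(run_func=run, inputs=inputs, filepath='service_schema.json')

if __name__ == "__main__":
    main()

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —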

This Step 5 command successfully creates the manifest file and the image with a service ID. You can check the results from the Azure portal under the resource group.

Once everything is set up, you will see an additional usage message like this: Additional usage information: 'az ml service usage realtime -i sgprojectml.sgenvml-e1939343.eastus2'. Run that command to get the scoring URL.

Output

Scoring URL:

http://<ipaddress>/api/v1/service/sgprojectml/score

Headers:

Content-Type: application/json

Authorization: Bearer <key>

(<key> can be found by running 'az ml service keys realtime -i sgprojectml.sgenvml-e1939343.eastus2')

Swagger URL:

http://<ipaddress>/api/v1/service/sgprojectml/swagger.json

Sample CLI command:

Usage for cmd: az ml service run realtime -i sgprojectml.sgenvml-e1939343.eastus2 -d "{\"input_df\": [{\"sepal width\": 3.6, \"petal width\": 0.25, \"sepal length\": 3.0, \"petal length\": 1.3}]}"

Usage for PowerShell: az ml service run realtime -i sgprojectml.sgenvml-e1939343.eastus2 --% -d "{\"input_df\": [{\"sepal width\": 3.6, \"petal width\": 0.25, \"sepal length\": 3.0, \"petal length\": 1.3}]}"

Sample CURL call:

curl -X POST -H "Content-Type:application/json" -H "Authorization:Bearer <key>" --data "{\"input_df\": [{\"sepal width\": 3.6, \"petal width\": 0.25, \"sepal length\": 3.0, \"petal length\": 1.3}]}" http://52.179.212.90/api/v1/service/sgprojectml/score

Step 6

The URL will work from a Jupyter notebook once the primary and secondary keys are known. Run the following command to retrieve the keys. Once the keys are set in the Jupyter notebook, you can easily access the URL served by the Kubernetes cluster.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

az ml service keys realtime -i sgprojectml.sgenvml-e1939343.eastus2

Primary Key: U9OjgUiP****

Secondary Key: crkQQGQY***

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

**Note: sgprojectml.sgenvml refers to my project and environment names.
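Below is a minimal sketch of how the service can be called from a Jupyter notebook using Python's requests library. The scoring URL and key are placeholders; substitute the values returned by the usage and keys commands above. The payload is the same sample record used in the CLI and curl examples.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

import requests

# Placeholders: substitute the scoring URL and primary key returned by
# 'az ml service usage realtime ...' and 'az ml service keys realtime ...'
scoring_url = "http://<ipaddress>/api/v1/service/sgprojectml/score"
primary_key = "<key>"

headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + primary_key,
}

# Same sample record used in the CLI and curl examples above
payload = {
    "input_df": [
        {"sepal width": 3.6, "petal width": 0.25,
         "sepal length": 3.0, "petal length": 1.3}
    ]
}

response = requests.post(scoring_url, headers=headers, json=payload)
print(response.status_code)
print(response.text)  # prediction returned by the service

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —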

After successful execution of all these steps, you can now access the ML model as a web service through the AKS cluster.
