Add optional workaround for running without the mlflow component
mart-r committed Feb 21, 2025
1 parent 3a76ea0 commit 4203d4f
Showing 1 changed file, README.md, with 2 additions and 0 deletions.
@@ -88,6 +88,8 @@ To serve NLP models through a container, run the following commands:
 export MODEL_PACKAGE_FULL_PATH=<PATH/TO/MODEL_PACKAGE.zip>
 export CMS_UID=$(id -u $USER)
 export CMS_GID=$(id -g $USER)
+# NOTE: use this if you wish to save models locally (i.e. run without the mlflow component)
+# export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"
 docker compose -f docker-compose.yml up -d <model-service>
 ```
 Then the API docs will be accessible at localhost on the mapped port specified in `docker-compose.yml`. The container runs
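For context on what the added lines do: MLflow treats a `file://` tracking URI as a local file store, writing run metadata and artifacts under that directory instead of sending them to a tracking server. A minimal sketch of the workflow, assuming `docker-compose.yml` forwards `MLFLOW_TRACKING_URI` into the service container (the commented-out export in the diff suggests it does):

```bash
# Assumption: docker-compose.yml passes MLFLOW_TRACKING_URI through to the
# model-service container; the path then lives on the container's filesystem.
export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"

# Start the model service without the mlflow component.
docker compose -f docker-compose.yml up -d <model-service>

# MLflow's file store lays out run metadata and artifacts under the
# directory named in the URI, one subdirectory per experiment.
docker compose exec <model-service> ls /tmp/mlruns/
```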
