From 4203d4f33ca79dbb712759cd75a68268e530be1a Mon Sep 17 00:00:00 2001
From: mart-r
Date: Fri, 21 Feb 2025 12:27:56 +0000
Subject: [PATCH] Add optional workaround for running without mlflow component

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index c737392..fc6093a 100644
--- a/README.md
+++ b/README.md
@@ -88,6 +88,8 @@ To serve NLP models through a container, run the following commands:
 export MODEL_PACKAGE_FULL_PATH=
 export CMS_UID=$(id -u $USER)
 export CMS_GID=$(id -g $USER)
+# NOTE: use this if you wish to save models locally (i.e. run without the mlflow component)
+# export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"
 docker compose -f docker-compose.yml up -d
 ```
 Then the API docs will be accessible at localhost on the mapped port specified in `docker-compose.yml`. The container runs
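
For context, a sketch of how the patched README commands would read once the optional line is uncommented. The model package path below is a placeholder, not a value from the patch:

```bash
# Serve NLP models through a container (sketch of the patched command sequence).
export MODEL_PACKAGE_FULL_PATH=/path/to/model_package.zip  # placeholder path
export CMS_UID=$(id -u $USER)
export CMS_GID=$(id -g $USER)
# Optional: save models locally instead of using the mlflow component.
export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"
docker compose -f docker-compose.yml up -d
```

MLflow treats a `file:` tracking URI as a local filesystem store, so runs and model artifacts are written under `/tmp/mlruns/` rather than sent to a tracking server, which is why exporting this variable lets the stack run without the mlflow component.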