
Commit 113281d

Update path for finetuning (#1306)

Signed-off-by: Ye, Xinyu <xinyu.ye@intel.com>
Signed-off-by: chensuyue <suyue.chen@intel.com>

1 parent 370d692 · commit 113281d

File tree

10 files changed: +11 −11 lines


InstructionTuning/README.md
Lines changed: 1 addition & 1 deletion

@@ -38,7 +38,7 @@ curl http://${your_ip}:8015/v1/fine_tuning/jobs \
 }'
 ```

-The outputs of the finetune job (adapter_model.safetensors, adapter_config,json... ) are stored in `/home/user/comps/finetuning/output` and other execution logs are stored in `/home/user/ray_results`
+The outputs of the finetune job (adapter_model.safetensors, adapter_config,json... ) are stored in `/home/user/comps/finetuning/src/output` and other execution logs are stored in `/home/user/ray_results`

 ### 3. Manage fine-tuning job

InstructionTuning/docker_compose/intel/cpu/xeon/README.md
Lines changed: 1 addition & 1 deletion

@@ -14,7 +14,7 @@ Build docker image with below command:
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
 export HF_TOKEN=${your_huggingface_token}
-docker build -t opea/finetuning:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy --build-arg HF_TOKEN=$HF_TOKEN -f comps/finetuning/Dockerfile .
+docker build -t opea/finetuning:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy --build-arg HF_TOKEN=$HF_TOKEN -f comps/finetuning/src/Dockerfile .
 ```

 ### 2. Run Docker with CLI

InstructionTuning/docker_compose/intel/hpu/gaudi/README.md
Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ Build docker image with below command:
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/finetuning-gaudi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/finetuning/Dockerfile.intel_hpu .
+docker build -t opea/finetuning-gaudi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/finetuning/src/Dockerfile.intel_hpu .
 ```

 ### 2. Run Docker with CLI

InstructionTuning/docker_image_build/build.yaml
Lines changed: 1 addition & 1 deletion

@@ -9,5 +9,5 @@ services:
         https_proxy: ${https_proxy}
         no_proxy: ${no_proxy}
       context: GenAIComps
-      dockerfile: comps/finetuning/Dockerfile
+      dockerfile: comps/finetuning/src/Dockerfile
     image: ${REGISTRY:-opea}/finetuning:${TAG:-latest}
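The `image:` line in this hunk uses compose-style variable substitution with defaults. A minimal shell sketch of how the same `${var:-default}` expansion resolves (the registry and tag override values below are hypothetical, not from the commit):

```shell
# With REGISTRY and TAG unset, the defaults "opea" and "latest" apply.
unset REGISTRY TAG
echo "${REGISTRY:-opea}/finetuning:${TAG:-latest}"
# → opea/finetuning:latest

# Overriding both, e.g. to push to a private registry (example values):
REGISTRY=myregistry.example.com
TAG=v1.1
echo "${REGISTRY:-opea}/finetuning:${TAG:-latest}"
# → myregistry.example.com/finetuning:v1.1
```

Docker Compose applies the same fallback rule when it reads `build.yaml`, so exporting `REGISTRY` and `TAG` before `docker compose build` retags the image without editing the file.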

InstructionTuning/tests/test_compose_on_xeon.sh
Lines changed: 1 addition & 1 deletion

@@ -19,7 +19,7 @@ ray_port=8265
 function build_docker_images() {
     cd $WORKPATH/docker_image_build
     if [ ! -d "GenAIComps" ] ; then
-        git clone https://github.com/opea-project/GenAIComps.git
+        git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../
     fi
     docker compose -f build.yaml build --no-cache > ${LOG_PATH}/docker_image_build.log
 }
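The new line chains `git clone`, `cd`, `git checkout`, and a trailing `cd ../` to return to the build directory. A small sketch of the two idioms involved, using a local directory as a stand-in for the clone so no network is needed:

```shell
# Illustrative stand-in for the cloned repo directory (not a real clone).
mkdir -p GenAIComps
unset opea_branch

before=$PWD
# A subshell isolates the cd, so the caller's working directory is
# untouched and no trailing `cd ../` is required.
# ${opea_branch:-main} falls back to "main" when opea_branch is unset,
# mirroring the script's git checkout "${opea_branch:-"main"}".
( cd GenAIComps && echo "would check out: ${opea_branch:-main}" )
# → would check out: main

[ "$PWD" = "$before" ] && echo "working directory unchanged"
# → working directory unchanged
```

Exporting `opea_branch` before running the test therefore pins the GenAIComps checkout to a matching release branch instead of `main`.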

RerankFinetuning/docker_compose/intel/cpu/xeon/README.md
Lines changed: 1 addition & 1 deletion

@@ -14,7 +14,7 @@ Build docker image with below command:
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
 export HF_TOKEN=${your_huggingface_token}
-docker build -t opea/finetuning:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy --build-arg HF_TOKEN=$HF_TOKEN -f comps/finetuning/Dockerfile .
+docker build -t opea/finetuning:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy --build-arg HF_TOKEN=$HF_TOKEN -f comps/finetuning/src/Dockerfile .
 ```

 ### 2. Run Docker with CLI

RerankFinetuning/docker_compose/intel/hpu/gaudi/README.md
Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ Build docker image with below command:
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/finetuning-gaudi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/finetuning/Dockerfile.intel_hpu .
+docker build -t opea/finetuning-gaudi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/finetuning/src/Dockerfile.intel_hpu .
 ```

 ### 2. Run Docker with CLI

RerankFinetuning/docker_image_build/build.yaml
Lines changed: 1 addition & 1 deletion

@@ -9,5 +9,5 @@ services:
         https_proxy: ${https_proxy}
         no_proxy: ${no_proxy}
       context: GenAIComps
-      dockerfile: comps/finetuning/Dockerfile
+      dockerfile: comps/finetuning/src/Dockerfile
     image: ${REGISTRY:-opea}/finetuning:${TAG:-latest}

RerankFinetuning/tests/test_compose_on_xeon.sh
Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ ray_port=8265
 function build_docker_images() {
     cd $WORKPATH/docker_image_build
     if [ ! -d "GenAIComps" ] ; then
-        git clone https://github.com/opea-project/GenAIComps.git
+        git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../
     fi
     docker compose -f build.yaml build --no-cache > ${LOG_PATH}/docker_image_build.log
 }

docker_images_list.md
Lines changed: 2 additions & 2 deletions

@@ -61,8 +61,8 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
 | [opea/embedding-multimodal-bridgetower](https://hub.docker.com/r/opea/embedding-multimodal-bridgetower) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/embeddings/src/integrations/dependency/bridgetower/Dockerfile) | The docker image exposes OPEA multimodal embedded microservices based on bridgetower for use by GenAI applications |
 | [opea/embedding-multimodal-bridgetower-gaudi](https://hub.docker.com/r/opea/embedding-multimodal-bridgetower-gaudi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/embeddings/src/integrations/dependency/bridgetower/Dockerfile.intel_hpu) | The docker image exposes OPEA multimodal embedded microservices based on bridgetower for use by GenAI applications on the Gaudi |
 | [opea/feedbackmanagement](https://hub.docker.com/r/opea/feedbackmanagement) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/feedback_management/src/Dockerfile) | The docker image exposes that the OPEA feedback management microservice uses a MongoDB database for GenAI applications. |
-| [opea/finetuning](https://hub.docker.com/r/opea/finetuning) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/finetuning/Dockerfile) | The docker image exposed the OPEA Fine-tuning microservice for GenAI application use |
-| [opea/finetuning-gaudi](https://hub.docker.com/r/opea/finetuning-gaudi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/finetuning/Dockerfile.intel_hpu) | The docker image exposed the OPEA Fine-tuning microservice for GenAI application use on the Gaudi |
+| [opea/finetuning](https://hub.docker.com/r/opea/finetuning) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/finetuning/src/Dockerfile) | The docker image exposed the OPEA Fine-tuning microservice for GenAI application use |
+| [opea/finetuning-gaudi](https://hub.docker.com/r/opea/finetuning-gaudi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/finetuning/src/Dockerfile.intel_hpu) | The docker image exposed the OPEA Fine-tuning microservice for GenAI application use on the Gaudi |
 | [opea/gmcrouter](https://hub.docker.com/r/opea/gmcrouter) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.manager) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to route the traffic among the microservices defined in GMC |
 | [opea/gmcmanager](https://hub.docker.com/r/opea/gmcmanager) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.router) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to be controller manager to handle GMC CRD |
 | [opea/guardrails]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/guardrails/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
