Commit ed2b8ed

Exclude dockerfile under tests and exclude check Dockerfile under tests. (#1354)
Signed-off-by: ZePan110 <ze.pan@intel.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent a6e702e commit ed2b8ed

File tree

35 files changed: +106 -107 lines changed


.github/workflows/pr-dockerfile-path-and-build-yaml-scan.yml

Lines changed: 1 addition & 1 deletion
@@ -60,7 +60,7 @@ jobs:
           shopt -s globstar
           no_add="FALSE"
           cd ${{github.workspace}}
-          Dockerfiles=$(realpath $(find ./ -name '*Dockerfile*'))
+          Dockerfiles=$(realpath $(find ./ -name '*Dockerfile*' ! -path './tests/*'))
           if [ -n "$Dockerfiles" ]; then
             for dockerfile in $Dockerfiles; do
               service=$(echo "$dockerfile" | awk -F '/GenAIExamples/' '{print $2}' | awk -F '/' '{print $2}')
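The effect of the new `! -path './tests/*'` predicate can be checked locally. A minimal sketch against a throwaway directory tree (the paths below are illustrative, not taken from the repository):

```shell
# Toy tree: one service Dockerfile plus two under tests/ that the
# CI scan should now skip.
tmp=$(mktemp -d)
mkdir -p "$tmp/comps" "$tmp/tests"
touch "$tmp/comps/Dockerfile" "$tmp/tests/Dockerfile" "$tmp/tests/check.Dockerfile"
cd "$tmp"

# Old predicate: matches all three files, including those under tests/.
find ./ -name '*Dockerfile*'

# New predicate: anything under ./tests/ is excluded, so only
# ./comps/Dockerfile is left for the scan.
find ./ -name '*Dockerfile*' ! -path './tests/*'
```

Note that `./tests/*` only excludes a top-level `tests/` directory relative to the working directory the job `cd`s into; excluding nested `tests/` directories would need a pattern like `*/tests/*`.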

AgentQnA/tests/step1_build_images.sh

Lines changed: 1 addition & 1 deletion
@@ -21,7 +21,7 @@ function build_docker_images_for_retrieval_tool(){
     # git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../
     get_genai_comps
     echo "Build all the images with --no-cache..."
-    service_list="doc-index-retriever dataprep-redis embedding-tei retriever-redis reranking-tei"
+    service_list="doc-index-retriever dataprep-redis embedding retriever-redis reranking-tei"
     docker compose -f build.yaml build ${service_list} --no-cache
     docker pull ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
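One detail worth noting in the script above: `${service_list}` is expanded unquoted on the `docker compose build` line, so the shell word-splits it into one argument per service. A quick illustration:

```shell
# Unquoted expansion word-splits the space-separated list into separate
# arguments -- the behavior that
# `docker compose -f build.yaml build ${service_list}` depends on.
service_list="doc-index-retriever dataprep-redis embedding retriever-redis reranking-tei"
printf '%s\n' ${service_list}   # one service name per line
```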

ChatQnA/docker_compose/intel/hpu/gaudi/how_to_validate_service.md

Lines changed: 1 addition & 1 deletion
@@ -43,7 +43,7 @@ Here is the output:
 CONTAINER ID   IMAGE                         COMMAND                  CREATED         STATUS         PORTS                                       NAMES
 28d9a5570246   opea/chatqna-ui:latest        "docker-entrypoint.s…"   2 minutes ago   Up 2 minutes   0.0.0.0:5173->5173/tcp, :::5173->5173/tcp   chatqna-gaudi-ui-server
 bee1132464cd   opea/chatqna:latest           "python chatqna.py"      2 minutes ago   Up 2 minutes   0.0.0.0:8888->8888/tcp, :::8888->8888/tcp   chatqna-gaudi-backend-server
-f810f3b4d329   opea/embedding-tei:latest     "python embedding_te…"   2 minutes ago   Up 2 minutes   0.0.0.0:6000->6000/tcp, :::6000->6000/tcp   embedding-tei-server
+f810f3b4d329   opea/embedding:latest         "python embedding_te…"   2 minutes ago   Up 2 minutes   0.0.0.0:6000->6000/tcp, :::6000->6000/tcp   embedding-server
 325236a01f9b   opea/llm-textgen:latest       "python llm.py"          2 minutes ago   Up 2 minutes   0.0.0.0:9000->9000/tcp, :::9000->9000/tcp   llm-textgen-gaudi-server
 2fa17d84605f   opea/dataprep-redis:latest    "python prepare_doc_…"   2 minutes ago   Up 2 minutes   0.0.0.0:6007->6007/tcp, :::6007->6007/tcp   dataprep-redis-server
 69e1fb59e92c   opea/retriever-redis:latest   "/home/user/comps/re…"   2 minutes ago   Up 2 minutes   0.0.0.0:7000->7000/tcp, :::7000->7000/tcp   retriever-redis-server
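After this rename, `docker ps` should list `embedding-server` rather than `embedding-tei-server`. A hedged sketch of an automated presence check — the helper name is made up here, and sample text stands in for output from a live Docker daemon:

```shell
# Hypothetical helper: confirm expected container names appear in
# `docker ps --format '{{.Names}}'` output, supplied as the first argument.
check_containers() {
  local ps_names=$1; shift
  local name
  for name in "$@"; do
    if ! printf '%s\n' "$ps_names" | grep -qx "$name"; then
      echo "missing: $name"
      return 1
    fi
  done
  echo "all expected containers running"
}

# Sample names in place of: docker ps --format '{{.Names}}'
sample_names='chatqna-gaudi-ui-server
chatqna-gaudi-backend-server
embedding-server
retriever-redis-server'

check_containers "$sample_names" embedding-server retriever-redis-server
# prints "all expected containers running"
```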

ChatQnA/docker_image_build/build.yaml

Lines changed: 2 additions & 2 deletions
@@ -41,12 +41,12 @@ services:
       dockerfile: ./docker/Dockerfile.react
     extends: chatqna
     image: ${REGISTRY:-opea}/chatqna-conversation-ui:${TAG:-latest}
-  embedding-tei:
+  embedding:
     build:
       context: GenAIComps
       dockerfile: comps/embeddings/src/Dockerfile
     extends: chatqna
-    image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
+    image: ${REGISTRY:-opea}/embedding:${TAG:-latest}
   retriever-redis:
     build:
       context: GenAIComps
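A rename like `embedding-tei` → `embedding` touches compose files, build YAMLs, shell scripts, and docs at once, so references are easy to miss. One way to audit for stragglers (a sketch against a throwaway checkout; the file names are illustrative):

```shell
# Stand-in repo: one file already renamed, one still holding the old name.
repo=$(mktemp -d)
printf 'image: ${REGISTRY:-opea}/embedding:${TAG:-latest}\n' > "$repo/build.yaml"
printf 'docker logs embedding-tei-server\n' > "$repo/old_test.sh"

# Every hit is a file the rename missed.
grep -rl 'embedding-tei' "$repo"
```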

DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md

Lines changed: 2 additions & 2 deletions
@@ -9,7 +9,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
+docker build -t opea/embedding:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 - Retriever Vector store Image
@@ -125,7 +125,7 @@ curl http://${host_ip}:8889/v1/retrievaltool -X POST -H "Content-Type: applicati
   -X POST \
   -d '{"text":"Explain the OPEA project"}' \
   -H 'Content-Type: application/json' > query
-docker container logs embedding-tei-server
+docker container logs embedding-server

 # if you used tei-gaudi
 docker container logs tei-embedding-gaudi-server

DocIndexRetriever/docker_compose/intel/cpu/xeon/compose.yaml

Lines changed: 2 additions & 2 deletions
@@ -50,8 +50,8 @@ services:
       timeout: 10s
       retries: 60
   embedding:
-    image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
-    container_name: embedding-tei-server
+    image: ${REGISTRY:-opea}/embedding:${TAG:-latest}
+    container_name: embedding-server
     ports:
       - "6000:6000"
     ipc: host

DocIndexRetriever/docker_compose/intel/cpu/xeon/compose_without_rerank.yaml

Lines changed: 2 additions & 2 deletions
@@ -50,8 +50,8 @@ services:
       timeout: 10s
       retries: 60
   embedding:
-    image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
-    container_name: embedding-tei-server
+    image: ${REGISTRY:-opea}/embedding:${TAG:-latest}
+    container_name: embedding-server
     ports:
       - "6000:6000"
     ipc: host

DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 2 additions & 2 deletions
@@ -9,7 +9,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
+docker build -t opea/embedding:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 - Retriever Vector store Image
@@ -115,7 +115,7 @@ curl http://${host_ip}:8889/v1/retrievaltool -X POST -H "Content-Type: applicati
   -X POST \
   -d '{"text":"Explain the OPEA project"}' \
   -H 'Content-Type: application/json' > query
-docker container logs embedding-tei-server
+docker container logs embedding-server

 # if you used tei-gaudi
 docker container logs tei-embedding-gaudi-server

DocIndexRetriever/docker_compose/intel/hpu/gaudi/compose.yaml

Lines changed: 2 additions & 2 deletions
@@ -55,8 +55,8 @@ services:
       timeout: 10s
       retries: 60
   embedding:
-    image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
-    container_name: embedding-tei-server
+    image: ${REGISTRY:-opea}/embedding:${TAG:-latest}
+    container_name: embedding-server
     ports:
       - "6000:6000"
     ipc: host

DocIndexRetriever/docker_image_build/build.yaml

Lines changed: 2 additions & 2 deletions
@@ -11,12 +11,12 @@ services:
       context: ../
       dockerfile: ./Dockerfile
     image: ${REGISTRY:-opea}/doc-index-retriever:${TAG:-latest}
-  embedding-tei:
+  embedding:
     build:
       context: GenAIComps
       dockerfile: comps/embeddings/src/Dockerfile
     extends: doc-index-retriever
-    image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
+    image: ${REGISTRY:-opea}/embedding:${TAG:-latest}
   retriever-redis:
     build:
       context: GenAIComps

DocIndexRetriever/tests/test_compose_on_xeon.sh

Lines changed: 2 additions & 2 deletions
@@ -21,7 +21,7 @@ function build_docker_images() {
         echo "Cloning GenAIComps repository"
         git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../
     fi
-    service_list="dataprep-redis embedding-tei retriever-redis reranking-tei doc-index-retriever"
+    service_list="dataprep-redis embedding retriever-redis reranking-tei doc-index-retriever"
     docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log

     docker pull ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
@@ -98,7 +98,7 @@ function validate_megaservice() {
     echo "return value is $EXIT_CODE"
     if [ "$EXIT_CODE" == "1" ]; then
         echo "=============Embedding container log=================="
-        docker logs embedding-tei-server | tee -a ${LOG_PATH}/doc-index-retriever-service-xeon.log
+        docker logs embedding-server | tee -a ${LOG_PATH}/doc-index-retriever-service-xeon.log
         echo "=============Retriever container log=================="
         docker logs retriever-redis-server | tee -a ${LOG_PATH}/doc-index-retriever-service-xeon.log
         echo "=============TEI Reranking log=================="
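The banner-plus-`docker logs` pattern in `validate_megaservice` can be factored into a small helper. A generic sketch, not the script's actual code (the function name is made up; the example call assumes a running Docker daemon):

```shell
# Sketch: dump each named container's logs under a banner, appending
# everything to a single log file.
collect_logs() {
  local log_file=$1; shift
  local c
  for c in "$@"; do
    echo "=============${c} log=================="
    docker logs "$c" 2>&1
  done | tee -a "$log_file"
}

# Example usage (assumes Docker is running and the containers exist):
# collect_logs "${LOG_PATH}/doc-index-retriever-service-xeon.log" \
#     embedding-server retriever-redis-server
```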

DocIndexRetriever/tests/test_compose_without_rerank_on_xeon.sh

Lines changed: 2 additions & 2 deletions
@@ -21,7 +21,7 @@ function build_docker_images() {
         echo "Cloning GenAIComps repository"
         git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../
     fi
-    service_list="dataprep-redis embedding-tei retriever-redis doc-index-retriever"
+    service_list="dataprep-redis embedding retriever-redis doc-index-retriever"
     docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log

     docker pull ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
@@ -92,7 +92,7 @@ function validate_megaservice() {
     echo "return value is $EXIT_CODE"
     if [ "$EXIT_CODE" == "1" ]; then
         echo "=============Embedding container log=================="
-        docker logs embedding-tei-server | tee -a ${LOG_PATH}/doc-index-retriever-service-xeon.log
+        docker logs embedding-server | tee -a ${LOG_PATH}/doc-index-retriever-service-xeon.log
         echo "=============Retriever container log=================="
         docker logs retriever-redis-server | tee -a ${LOG_PATH}/doc-index-retriever-service-xeon.log
         echo "=============Doc-index-retriever container log=================="

MultimodalQnA/README.md

Lines changed: 6 additions & 6 deletions
@@ -100,12 +100,12 @@ In the below, we provide a table that describes for each microservice component

 By default, the embedding and LVM models are set to a default value as listed below:

-| Service       | HW    | Model                                     |
-| ------------- | ----- | ----------------------------------------- |
-| embedding-tei | Xeon  | BridgeTower/bridgetower-large-itm-mlm-itc |
-| LVM           | Xeon  | llava-hf/llava-1.5-7b-hf                  |
-| embedding-tei | Gaudi | BridgeTower/bridgetower-large-itm-mlm-itc |
-| LVM           | Gaudi | llava-hf/llava-v1.6-vicuna-13b-hf         |
+| Service   | HW    | Model                                     |
+| --------- | ----- | ----------------------------------------- |
+| embedding | Xeon  | BridgeTower/bridgetower-large-itm-mlm-itc |
+| LVM       | Xeon  | llava-hf/llava-1.5-7b-hf                  |
+| embedding | Gaudi | BridgeTower/bridgetower-large-itm-mlm-itc |
+| LVM       | Gaudi | llava-hf/llava-v1.6-vicuna-13b-hf         |

 You can choose other LVM models, such as `llava-hf/llava-1.5-7b-hf` and `llava-hf/llava-1.5-13b-hf`, as needed.
111111

MultimodalQnA/docker_compose/amd/gpu/rocm/README.md

Lines changed: 9 additions & 9 deletions
@@ -28,10 +28,10 @@ cd GenAIComps
 docker build --no-cache -t opea/embedding-multimodal-bridgetower:latest --build-arg EMBEDDER_PORT=$EMBEDDER_PORT --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/integrations/dependency/bridgetower/Dockerfile .
 ```

-Build embedding-tei microservice image
+Build embedding microservice image

 ```bash
-docker build --no-cache -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
+docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 ### 2. Build LVM Images
@@ -87,7 +87,7 @@ Then run the command `docker images`, you will have the following 8 Docker Image
 2. `ghcr.io/huggingface/text-generation-inference:2.4.1-rocm`
 3. `opea/lvm-tgi:latest`
 4. `opea/retriever-multimodal-redis:latest`
-5. `opea/embedding-tei:latest`
+5. `opea/embedding:latest`
 6. `opea/embedding-multimodal-bridgetower:latest`
 7. `opea/multimodalqna:latest`
 8. `opea/multimodalqna-ui:latest`
@@ -98,11 +98,11 @@ Then run the command `docker images`, you will have the following 8 Docker Image

 By default, the multimodal-embedding and LVM models are set to a default value as listed below:

-| Service       | Model                                       |
-| ------------- | ------------------------------------------- |
-| embedding-tei | BridgeTower/bridgetower-large-itm-mlm-gaudi |
-| LVM           | llava-hf/llava-1.5-7b-hf                    |
-| LVM           | Xkev/Llama-3.2V-11B-cot                     |
+| Service   | Model                                       |
+| --------- | ------------------------------------------- |
+| embedding | BridgeTower/bridgetower-large-itm-mlm-gaudi |
+| LVM       | llava-hf/llava-1.5-7b-hf                    |
+| LVM       | Xkev/Llama-3.2V-11B-cot                     |

 Note:

@@ -158,7 +158,7 @@ curl http://${host_ip}:${EMBEDDER_PORT}/v1/encode \
   -d '{"text":"This is example", "img_b64_str": "iVBORw0KGgoAAAANSUhEUgAAAAoAAAAKCAYAAACNMs+9AAAAFUlEQVR42mP8/5+hnoEIwDiqkL4KAcT9GO0U4BxoAAAAAElFTkSuQmCC"}'
 ```

-2. embedding-tei
+2. embedding

 ```bash
 curl http://${host_ip}:$MM_EMBEDDING_PORT_MICROSERVICE/v1/embeddings \

MultimodalQnA/docker_compose/amd/gpu/rocm/compose.yaml

Lines changed: 4 additions & 4 deletions
@@ -55,9 +55,9 @@ services:
       start_period: 30s
     entrypoint: ["python", "bridgetower_server.py", "--device", "cpu", "--model_name_or_path", $EMBEDDING_MODEL_ID]
     restart: unless-stopped
-  embedding-tei:
-    image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
-    container_name: embedding-tei
+  embedding:
+    image: ${REGISTRY:-opea}/embedding:${TAG:-latest}
+    container_name: embedding
     depends_on:
       embedding-multimodal-bridgetower:
         condition: service_healthy
@@ -138,7 +138,7 @@ services:
     depends_on:
       - redis-vector-db
       - dataprep-multimodal-redis
-      - embedding-tei
+      - embedding
       - retriever-redis
       - lvm-tgi
     ports:

MultimodalQnA/docker_compose/intel/cpu/xeon/README.md

Lines changed: 9 additions & 9 deletions
@@ -24,7 +24,7 @@ embedding-multimodal-bridgetower
 =====================
 Port 6006 - Open to 0.0.0.0/0

-embedding-tei
+embedding
 =========
 Port 6000 - Open to 0.0.0.0/0

@@ -115,10 +115,10 @@ cd GenAIComps
 docker build --no-cache -t opea/embedding-multimodal-bridgetower:latest --build-arg EMBEDDER_PORT=$EMBEDDER_PORT --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/integrations/dependency/bridgetower/Dockerfile .
 ```

-Build embedding-tei microservice image
+Build embedding microservice image

 ```bash
-docker build --no-cache -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
+docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 ### 2. Build retriever-multimodal-redis Image
@@ -184,7 +184,7 @@ Then run the command `docker images`, you will have the following 11 Docker Imag
 4. `opea/retriever-multimodal-redis:latest`
 5. `opea/whisper:latest`
 6. `opea/redis-vector-db`
-7. `opea/embedding-tei:latest`
+7. `opea/embedding:latest`
 8. `opea/embedding-multimodal-bridgetower:latest`
 9. `opea/multimodalqna:latest`
 10. `opea/multimodalqna-ui:latest`
@@ -195,10 +195,10 @@ Then run the command `docker images`, you will have the following 11 Docker Imag

 By default, the multimodal-embedding and LVM models are set to a default value as listed below:

-| Service       | Model                                       |
-| ------------- | ------------------------------------------- |
-| embedding-tei | BridgeTower/bridgetower-large-itm-mlm-gaudi |
-| LVM           | llava-hf/llava-1.5-7b-hf                    |
+| Service   | Model                                       |
+| --------- | ------------------------------------------- |
+| embedding | BridgeTower/bridgetower-large-itm-mlm-gaudi |
+| LVM       | llava-hf/llava-1.5-7b-hf                    |

 ### Start all the services Docker Containers

@@ -227,7 +227,7 @@ curl http://${host_ip}:${EMBEDDER_PORT}/v1/encode \
   -d '{"text":"This is example", "img_b64_str": "iVBORw0KGgoAAAAKCAYAAACNMs+9AAAAFUlEQVR42mP8/5+hnoEIwDiqkL4KAcT9GO0U4BxoAAAAAElFTkSuQmCC"}'
 ```

-2. embedding-tei
+2. embedding

 ```bash
 curl http://${host_ip}:$MM_EMBEDDING_PORT_MICROSERVICE/v1/embeddings \

MultimodalQnA/docker_compose/intel/cpu/xeon/compose.yaml

Lines changed: 4 additions & 4 deletions
@@ -55,9 +55,9 @@ services:
       start_period: 30s
     entrypoint: ["python", "bridgetower_server.py", "--device", "cpu", "--model_name_or_path", $EMBEDDING_MODEL_ID]
     restart: unless-stopped
-  embedding-tei:
-    image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
-    container_name: embedding-tei
+  embedding:
+    image: ${REGISTRY:-opea}/embedding:${TAG:-latest}
+    container_name: embedding
     depends_on:
       embedding-multimodal-bridgetower:
         condition: service_healthy
@@ -120,7 +120,7 @@ services:
     depends_on:
       - redis-vector-db
       - dataprep-multimodal-redis
-      - embedding-tei
+      - embedding
       - retriever-redis
       - lvm-llava-svc
     ports:

MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 8 additions & 8 deletions
@@ -66,10 +66,10 @@ cd GenAIComps
 docker build --no-cache -t opea/embedding-multimodal-bridgetower:latest --build-arg EMBEDDER_PORT=$EMBEDDER_PORT --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/integrations/dependency/bridgetower/Dockerfile .
 ```

-Build embedding-tei microservice image
+Build embedding microservice image

 ```bash
-docker build --no-cache -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
+docker build --no-cache -t opea/embedding:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 ### 2. Build retriever-multimodal-redis Image
@@ -133,7 +133,7 @@ Then run the command `docker images`, you will have the following 11 Docker Imag
 4. `opea/retriever-multimodal-redis:latest`
 5. `opea/whisper:latest`
 6. `opea/redis-vector-db`
-7. `opea/embedding-tei:latest`
+7. `opea/embedding:latest`
 8. `opea/embedding-multimodal-bridgetower:latest`
 9. `opea/multimodalqna:latest`
 10. `opea/multimodalqna-ui:latest`
@@ -144,10 +144,10 @@ Then run the command `docker images`, you will have the following 11 Docker Imag

 By default, the multimodal-embedding and LVM models are set to a default value as listed below:

-| Service       | Model                                       |
-| ------------- | ------------------------------------------- |
-| embedding-tei | BridgeTower/bridgetower-large-itm-mlm-gaudi |
-| LVM           | llava-hf/llava-v1.6-vicuna-13b-hf           |
+| Service   | Model                                       |
+| --------- | ------------------------------------------- |
+| embedding | BridgeTower/bridgetower-large-itm-mlm-gaudi |
+| LVM       | llava-hf/llava-v1.6-vicuna-13b-hf           |

 ### Start all the services Docker Containers

@@ -176,7 +176,7 @@ curl http://${host_ip}:${EMBEDDER_PORT}/v1/encode \
   -d '{"text":"This is example", "img_b64_str": "iVBORw0KGgoAAAANSUhEUgAAAAoAAAAKCAYAAACNMs+9AAAAFUlEQVR42mP8/5+hnoEIwDiqkL4KAcT9GO0U4BxoAAAAAElFTkSuQmCC"}'
 ```

-2. embedding-tei
+2. embedding

 ```bash
 curl http://${host_ip}:$MM_EMBEDDING_PORT_MICROSERVICE/v1/embeddings \

MultimodalQnA/docker_compose/intel/hpu/gaudi/compose.yaml

Lines changed: 4 additions & 4 deletions
@@ -55,9 +55,9 @@ services:
       start_period: 30s
     entrypoint: ["python", "bridgetower_server.py", "--device", "hpu", "--model_name_or_path", $EMBEDDING_MODEL_ID]
     restart: unless-stopped
-  embedding-tei:
-    image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
-    container_name: embedding-tei
+  embedding:
+    image: ${REGISTRY:-opea}/embedding:${TAG:-latest}
+    container_name: embedding
     depends_on:
       embedding-multimodal-bridgetower:
         condition: service_healthy
@@ -137,7 +137,7 @@ services:
     depends_on:
       - redis-vector-db
      - dataprep-multimodal-redis
-      - embedding-tei
+      - embedding
       - retriever-redis
       - lvm-tgi
     ports:
