
Commit 03f1aea

Update code and readme for GenAIComps Refactor
Signed-off-by: lvliang-intel <liang1.lv@intel.com>
1 parent e860a9a commit 03f1aea

42 files changed: +78 -80 lines changed

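Every hunk in this commit repoints an existing docker build command or compose build definition at a Dockerfile's new location in the refactored GenAIComps layout; build flags and image names are unchanged. As a minimal sketch of the resulting workflow (the clone step and image tag are taken from the README hunks below, not new to this commit):

```bash
# Build the TGI LLM microservice image from the Dockerfile's new location
# (comps/llms/src/text-generation/Dockerfile) in a fresh GenAIComps checkout.
git clone https://github.com/opea-project/GenAIComps.git
cd GenAIComps
docker build --no-cache -t opea/llm-tgi:latest \
  --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy \
  -f comps/llms/src/text-generation/Dockerfile .
```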

AudioQnA/docker_compose/amd/gpu/rocm/README.md

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ docker build -t opea/asr:latest --build-arg https_proxy=$https_proxy --build-arg
 ### 3. Build LLM Image

 ```bash
-docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 Note:

AudioQnA/docker_compose/intel/cpu/xeon/README.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ docker build -t opea/asr:latest --build-arg https_proxy=$https_proxy --build-arg
 ### 3. Build LLM Image

 ```bash
-docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 4. Build TTS Image

AudioQnA/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ docker build -t opea/asr:latest --build-arg https_proxy=$https_proxy --build-arg
 ### 3. Build LLM Image

 ```bash
-docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 4. Build TTS Image

AudioQnA/docker_image_build/build.yaml

Lines changed: 1 addition & 1 deletion
@@ -44,7 +44,7 @@ services:
   llm-tgi:
     build:
       context: GenAIComps
-      dockerfile: comps/llms/text-generation/tgi/Dockerfile
+      dockerfile: comps/llms/src/text-generation/Dockerfile
     extends: audioqna
     image: ${REGISTRY:-opea}/llm-tgi:${TAG:-latest}
   speecht5-gaudi:
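The build.yaml entries above are normally driven through docker compose rather than invoked as raw docker build commands; a minimal sketch, assuming GenAIComps has been cloned into AudioQnA/docker_image_build so that the `context: GenAIComps` path resolves:

```bash
# Rebuild only the llm-tgi service; the updated dockerfile path is resolved
# relative to the GenAIComps build context declared in build.yaml.
cd AudioQnA/docker_image_build
git clone https://github.com/opea-project/GenAIComps.git
docker compose -f build.yaml build --no-cache llm-tgi
```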

AvatarChatbot/docker_compose/intel/cpu/xeon/README.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ docker build -t opea/asr:latest --build-arg https_proxy=$https_proxy --build-arg
 ### 3. Build LLM Image

 ```bash
-docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 4. Build TTS Image

AvatarChatbot/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ docker build -t opea/asr:latest --build-arg https_proxy=$https_proxy --build-arg
 ### 3. Build LLM Image

 ```bash
-docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 4. Build TTS Image

AvatarChatbot/docker_image_build/build.yaml

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@ services:
   llm-tgi:
     build:
       context: GenAIComps
-      dockerfile: comps/llms/text-generation/tgi/Dockerfile
+      dockerfile: comps/llms/src/text-generation/Dockerfile
     extends: avatarchatbot
     image: ${REGISTRY:-opea}/llm-tgi:${TAG:-latest}
   speecht5-gaudi:

ChatQnA/docker_compose/amd/gpu/rocm/README.md

Lines changed: 1 addition & 1 deletion
@@ -138,7 +138,7 @@ cd ../../../..

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following 5 Docker Images:

ChatQnA/docker_compose/intel/cpu/aipc/README.md

Lines changed: 1 addition & 1 deletion
@@ -55,7 +55,7 @@ docker build --no-cache -t opea/chatqna-ui:latest --build-arg https_proxy=$https

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following 6 Docker Images:

ChatQnA/docker_compose/intel/cpu/xeon/README.md

Lines changed: 1 addition & 1 deletion
@@ -161,7 +161,7 @@ docker build --no-cache -t opea/chatqna-conversation-ui:latest --build-arg https

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following 5 Docker Images:

ChatQnA/docker_compose/intel/cpu/xeon/README_pinecone.md

Lines changed: 1 addition & 1 deletion
@@ -164,7 +164,7 @@ docker build --no-cache -t opea/chatqna-conversation-ui:latest --build-arg https

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following 5 Docker Images:

ChatQnA/docker_compose/intel/cpu/xeon/README_qdrant.md

Lines changed: 1 addition & 1 deletion
@@ -122,7 +122,7 @@ cd ../../../..

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following 5 Docker Images:

ChatQnA/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 1 addition & 1 deletion
@@ -151,7 +151,7 @@ docker build --no-cache -t opea/chatqna-conversation-ui:latest --build-arg https

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following 5 Docker Images:

ChatQnA/docker_compose/nvidia/gpu/README.md

Lines changed: 1 addition & 1 deletion
@@ -148,7 +148,7 @@ cd ../../..

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following 5 Docker Images:

ChatQnA/docker_image_build/build.yaml

Lines changed: 6 additions & 6 deletions
@@ -44,7 +44,7 @@ services:
   embedding-tei:
     build:
       context: GenAIComps
-      dockerfile: comps/embeddings/tei/langchain/Dockerfile
+      dockerfile: comps/embeddings/src/Dockerfile
     extends: chatqna
     image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
   retriever-redis:
@@ -68,25 +68,25 @@ services:
   reranking-tei:
     build:
       context: GenAIComps
-      dockerfile: comps/reranks/tei/Dockerfile
+      dockerfile: comps/reranks/src/Dockerfile
     extends: chatqna
     image: ${REGISTRY:-opea}/reranking-tei:${TAG:-latest}
   llm-tgi:
     build:
       context: GenAIComps
-      dockerfile: comps/llms/text-generation/tgi/Dockerfile
+      dockerfile: comps/llms/src/text-generation/Dockerfile
     extends: chatqna
     image: ${REGISTRY:-opea}/llm-tgi:${TAG:-latest}
   llm-ollama:
     build:
       context: GenAIComps
-      dockerfile: comps/llms/text-generation/ollama/langchain/Dockerfile
+      dockerfile: comps/llms/src/text-generation/Dockerfile
     extends: chatqna
     image: ${REGISTRY:-opea}/llm-ollama:${TAG:-latest}
   llm-vllm:
     build:
       context: GenAIComps
-      dockerfile: comps/llms/text-generation/vllm/langchain/Dockerfile
+      dockerfile: comps/llms/src/text-generation/Dockerfile
     extends: chatqna
     image: ${REGISTRY:-opea}/llm-vllm:${TAG:-latest}
   dataprep-redis:
@@ -128,6 +128,6 @@ services:
   nginx:
     build:
       context: GenAIComps
-      dockerfile: comps/nginx/Dockerfile
+      dockerfile: comps/3rd_parties/nginx/deployment/docker/Dockerfile
     extends: chatqna
     image: ${REGISTRY:-opea}/nginx:${TAG:-latest}
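Since several renamed Dockerfiles in this build.yaml belong to services that are usually rebuilt together, here is a minimal sketch of rebuilding just the affected ChatQnA services (same cloning assumption as in the AudioQnA example above):

```bash
# Rebuild only the services whose Dockerfile paths changed in this commit.
cd ChatQnA/docker_image_build
git clone https://github.com/opea-project/GenAIComps.git
docker compose -f build.yaml build --no-cache embedding-tei reranking-tei llm-tgi llm-ollama llm-vllm nginx
```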

CodeGen/docker_compose/amd/gpu/rocm/README.md

Lines changed: 1 addition & 1 deletion
@@ -10,7 +10,7 @@ git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps

 ### Build Docker image
-docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### Build the MegaService Docker Image

CodeGen/docker_compose/intel/cpu/xeon/README.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ Should the Docker image you seek not yet be available on Docker Hub, you can bui
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 2. Build the MegaService Docker Image

CodeGen/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ First of all, you need to build the Docker images locally. This step can be igno
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 2. Build the MegaService Docker Image

CodeGen/docker_image_build/build.yaml

Lines changed: 1 addition & 1 deletion
@@ -26,6 +26,6 @@ services:
   llm-tgi:
     build:
       context: GenAIComps
-      dockerfile: comps/llms/text-generation/tgi/Dockerfile
+      dockerfile: comps/llms/src/text-generation/Dockerfile
     extends: codegen
     image: ${REGISTRY:-opea}/llm-tgi:${TAG:-latest}

CodeTrans/docker_compose/amd/gpu/rocm/README.md

Lines changed: 1 addition & 1 deletion
@@ -10,7 +10,7 @@ git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps

 ### Build Docker image
-docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### Build the MegaService Docker Image

CodeTrans/docker_compose/intel/cpu/xeon/README.md

Lines changed: 2 additions & 2 deletions
@@ -19,7 +19,7 @@ First of all, you need to build Docker Images locally and install the python pac
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 2. Build MegaService Docker Image
@@ -41,7 +41,7 @@ docker build -t opea/codetrans-ui:latest --build-arg https_proxy=$https_proxy --

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following Docker Images:

CodeTrans/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 2 additions & 2 deletions
@@ -11,7 +11,7 @@ First of all, you need to build Docker Images locally and install the python pac
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/llm-tgi:latest --no-cache --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build -t opea/llm-tgi:latest --no-cache --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 2. Build MegaService Docker Image
@@ -33,7 +33,7 @@ docker build -t opea/codetrans-ui:latest --build-arg https_proxy=$https_proxy --

 ```bash
 cd GenAIComps
-docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/nginx/Dockerfile .
+docker build -t opea/nginx:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/3rd_parties/nginx/deployment/docker/Dockerfile .
 ```

 Then run the command `docker images`, you will have the following Docker Images:

CodeTrans/docker_image_build/build.yaml

Lines changed: 2 additions & 2 deletions
@@ -20,12 +20,12 @@ services:
   llm-tgi:
     build:
       context: GenAIComps
-      dockerfile: comps/llms/text-generation/tgi/Dockerfile
+      dockerfile: comps/llms/src/text-generation/Dockerfile
     extends: codetrans
     image: ${REGISTRY:-opea}/llm-tgi:${TAG:-latest}
   nginx:
     build:
       context: GenAIComps
-      dockerfile: comps/nginx/Dockerfile
+      dockerfile: comps/3rd_parties/nginx/deployment/docker/Dockerfile
     extends: codetrans
     image: ${REGISTRY:-opea}/nginx:${TAG:-latest}

DocIndexRetriever/docker_compose/intel/cpu/xeon/README.md

Lines changed: 2 additions & 2 deletions
@@ -9,7 +9,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/tei/langchain/Dockerfile .
+docker build -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 - Retriever Vector store Image
@@ -21,7 +21,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m
 - Rerank TEI Image

 ```bash
-docker build -t opea/reranking-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/reranks/tei/Dockerfile .
+docker build -t opea/reranking-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/reranks/src/Dockerfile .
 ```

 - Dataprep Image

DocIndexRetriever/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 2 additions & 2 deletions
@@ -9,7 +9,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/tei/langchain/Dockerfile .
+docker build -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 - Retriever Vector store Image
@@ -21,7 +21,7 @@ DocRetriever are the most widely adopted use case for leveraging the different m
 - Rerank TEI Image

 ```bash
-docker build -t opea/reranking-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/reranks/tei/Dockerfile .
+docker build -t opea/reranking-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/reranks/src/Dockerfile .
 ```

 - Dataprep Image

DocIndexRetriever/docker_image_build/build.yaml

Lines changed: 2 additions & 2 deletions
@@ -14,7 +14,7 @@ services:
   embedding-tei:
     build:
       context: GenAIComps
-      dockerfile: comps/embeddings/tei/langchain/Dockerfile
+      dockerfile: comps/embeddings/src/Dockerfile
     extends: doc-index-retriever
     image: ${REGISTRY:-opea}/embedding-tei:${TAG:-latest}
   retriever-redis:
@@ -26,7 +26,7 @@ services:
   reranking-tei:
     build:
       context: GenAIComps
-      dockerfile: comps/reranks/tei/Dockerfile
+      dockerfile: comps/reranks/src/Dockerfile
     extends: doc-index-retriever
     image: ${REGISTRY:-opea}/reranking-tei:${TAG:-latest}
   dataprep-redis:

FaqGen/docker_compose/amd/gpu/rocm/README.md

Lines changed: 1 addition & 1 deletion
@@ -10,7 +10,7 @@ git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps

 ### Build Docker image
-docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ## 🚀 Start Microservices and MegaService

GraphRAG/docker_image_build/build.yaml

Lines changed: 1 addition & 1 deletion
@@ -36,7 +36,7 @@ services:
         https_proxy: ${https_proxy}
         no_proxy: ${no_proxy}
       context: GenAIComps
-      dockerfile: comps/nginx/Dockerfile
+      dockerfile: comps/3rd_parties/nginx/deployment/docker/Dockerfile
     image: ${REGISTRY:-opea}/nginx:${TAG:-latest}
   graphrag-ui:
     build:

MultimodalQnA/docker_compose/amd/gpu/rocm/README.md

Lines changed: 2 additions & 2 deletions
@@ -25,13 +25,13 @@ Build embedding-multimodal-bridgetower docker image
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build --no-cache -t opea/embedding-multimodal-bridgetower:latest --build-arg EMBEDDER_PORT=$EMBEDDER_PORT --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/multimodal/bridgetower/Dockerfile .
+docker build --no-cache -t opea/embedding-multimodal-bridgetower:latest --build-arg EMBEDDER_PORT=$EMBEDDER_PORT --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/integrations/dependency/bridgetower/Dockerfile .
 ```

 Build embedding-multimodal microservice image

 ```bash
-docker build --no-cache -t opea/embedding-multimodal:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/multimodal/multimodal_langchain/Dockerfile .
+docker build --no-cache -t opea/embedding-multimodal:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 ### 2. Build LVM Images

ProductivitySuite/docker_compose/intel/cpu/xeon/README.md

Lines changed: 3 additions & 3 deletions
@@ -13,7 +13,7 @@ First of all, you need to build Docker Images locally and install the python pac
 ```bash
 git clone https://github.com/opea-project/GenAIComps.git
 cd GenAIComps
-docker build --no-cache -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/tei/langchain/Dockerfile .
+docker build --no-cache -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/src/Dockerfile .
 ```

 ### 2. Build Retriever Image
@@ -25,15 +25,15 @@ docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$
 ### 3. Build Rerank Image

 ```bash
-docker build --no-cache -t opea/reranking-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/reranks/tei/Dockerfile .
+docker build --no-cache -t opea/reranking-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/reranks/src/Dockerfile .
 ```

 ### 4. Build LLM Image

 #### Use TGI as backend

 ```bash
-docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/text-generation/tgi/Dockerfile .
+docker build --no-cache -t opea/llm-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
 ```

 ### 5. Build Dataprep Image
