Then run `docker images`; you should see the following Docker images:
- `opea/codegen-react-ui:latest` (Optional)

### Start the Docker Containers for All Services

#### Deploy CodeGen on Gaudi

Find the corresponding [compose.yaml](./docker_compose/intel/hpu/gaudi/compose.yaml). You can start CodeGen with either the TGI or vLLM service:

```bash
cd GenAIExamples/CodeGen/docker_compose/intel/hpu/gaudi
```

TGI service:

```bash
docker compose --profile codegen-gaudi-tgi up -d
```

vLLM service:

```bash
docker compose --profile codegen-gaudi-vllm up -d
```
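
The two commands above differ only in the compose profile name. If you script the deployment, a small sketch like the following can derive the profile from an environment variable. This is illustrative only: the `BACKEND` variable and its default are assumptions, not part of this repository; the profile names match those shown above.

```shell
# Illustrative helper: derive the compose profile from a BACKEND variable.
# BACKEND is an assumed name; valid values here are "tgi" or "vllm".
BACKEND="${BACKEND:-tgi}"
PROFILE="codegen-gaudi-${BACKEND}"
echo "starting profile: ${PROFILE}"
# docker compose --profile "${PROFILE}" up -d   # uncomment to deploy
```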

Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build Docker images from source.

#### Deploy CodeGen on Xeon

Find the corresponding [compose.yaml](./docker_compose/intel/cpu/xeon/compose.yaml). You can start CodeGen with either the TGI or vLLM service:

```bash
cd GenAIExamples/CodeGen/docker_compose/intel/cpu/xeon
```

TGI service:

```bash
docker compose --profile codegen-xeon-tgi up -d
```

vLLM service:

```bash
docker compose --profile codegen-xeon-vllm up -d
```
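
Note that `docker compose ... up -d` returns before the services finish loading their models. A hedged sketch of a readiness poll is shown below; the host, port, and timeout are placeholders (the real service ports are defined in the compose.yaml you are using), and the `/dev/tcp` redirection assumes a bash-like shell.

```shell
# Poll a TCP port until it accepts connections or a timeout expires.
# Host, port, and timeout are illustrative; read real values from compose.yaml.
wait_for_port() {
  host="$1"; port="$2"; timeout="${3:-60}"; waited=0
  # The subshell attempts a TCP connect via bash's /dev/tcp pseudo-device;
  # a failed open (service not up yet) keeps the loop going.
  until (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; do
    if [ "$waited" -ge "$timeout" ]; then
      echo "timed out waiting for ${host}:${port}" >&2
      return 1
    fi
    sleep 1
    waited=$((waited + 1))
  done
  echo "${host}:${port} is up"
}
```

Example use: `wait_for_port 127.0.0.1 <service-port> 120` after starting the containers.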
### Validate the MicroServices and MegaService
1. LLM Service (for TGI, vLLM)