
Commit 35d790a

[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
Parent: 0a47d2f

5 files changed (+19, −21 lines)
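pre-commit.ci applies the repository's configured hooks and commits the resulting fixes automatically. The hook configuration itself is not part of this commit, so the following is only a hedged sketch of how the same auto-fixes can be reproduced locally with the standard pre-commit CLI:

```bash
# Hedged sketch: run whatever hooks the repo's .pre-commit-config.yaml defines
# (typically whitespace/EOF/markdown fixers for a whitespace-only diff like this one).
pip install pre-commit       # or: pipx install pre-commit
pre-commit run --all-files   # apply every configured hook across the tree
git diff                     # review the resulting fixes before committing
```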

CodeGen/docker_compose/intel/cpu/xeon/README.md (7 additions, 8 deletions)

@@ -155,8 +155,8 @@ Then run the command `docker images`, you will have the following Docker images:
 - `redis/redis-stack`
 - `opea/vllm`
 
-
 ### Building the Docker image locally
+
 Should the Docker image you seek not yet be available on Docker Hub, you can build the Docker image locally.
 In order to build the Docker image locally follow the instrustion provided below.
 
@@ -170,7 +170,7 @@ cd GenAIExamples/CodeGen
 docker build -t opea/codegen:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
 ```
 
-#### Build the UI Gradio Image
+#### Build the UI Gradio Image
 
 Build the frontend Gradio image via the command below:
 
@@ -180,25 +180,24 @@ docker build -t opea/codegen-gradio-ui:latest --build-arg https_proxy=$https_pro
 ```
 
 #### Dataprep Microservice with Redis
-Follow the instrustion provided here: [opea/dataprep](https://github.com/MSCetin37/GenAIComps/blob/main/comps/dataprep/src/README_redis.md)
 
+Follow the instrustion provided here: [opea/dataprep](https://github.com/MSCetin37/GenAIComps/blob/main/comps/dataprep/src/README_redis.md)
 
 #### Embedding Microservice with TEI
-Follow the instrustion provided here: [opea/embedding](https://github.com/MSCetin37/GenAIComps/blob/main/comps/embeddings/src/README_tei.md)
 
+Follow the instrustion provided here: [opea/embedding](https://github.com/MSCetin37/GenAIComps/blob/main/comps/embeddings/src/README_tei.md)
 
 #### LLM text generation Microservice
-Follow the instrustion provided here: [opea/llm-textgen](https://github.com/MSCetin37/GenAIComps/tree/main/comps/llms/src/text-generation)
 
+Follow the instrustion provided here: [opea/llm-textgen](https://github.com/MSCetin37/GenAIComps/tree/main/comps/llms/src/text-generation)
 
 #### Retriever Microservice
-Follow the instrustion provided here: [opea/retriever](https://github.com/MSCetin37/GenAIComps/blob/main/comps/retrievers/src/README_redis.md)
 
+Follow the instrustion provided here: [opea/retriever](https://github.com/MSCetin37/GenAIComps/blob/main/comps/retrievers/src/README_redis.md)
 
 #### Start Redis server
-Follow the instrustion provided here: [redis/redis-stack](https://github.com/MSCetin37/GenAIComps/tree/main/comps/third_parties/redis/src)
-
 
+Follow the instrustion provided here: [redis/redis-stack](https://github.com/MSCetin37/GenAIComps/tree/main/comps/third_parties/redis/src)
 
 ### Validate the MicroServices and MegaService

CodeGen/docker_compose/intel/hpu/gaudi/README.md (7 additions, 8 deletions)

@@ -150,8 +150,8 @@ Then run the command `docker images`, you will have the following Docker images:
 
 Refer to the [Gaudi Guide](./README.md) to build docker images from source.
 
-
 ### Building the Docker image locally
+
 Should the Docker image you seek not yet be available on Docker Hub, you can build the Docker image locally.
 In order to build the Docker image locally follow the instrustion provided below.
 
@@ -165,7 +165,7 @@ cd GenAIExamples/CodeGen
 docker build -t opea/codegen:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
 ```
 
-#### Build the UI Gradio Image
+#### Build the UI Gradio Image
 
 Build the frontend Gradio image via the command below:
 
@@ -175,25 +175,24 @@ docker build -t opea/codegen-gradio-ui:latest --build-arg https_proxy=$https_pro
 ```
 
 #### Dataprep Microservice with Redis
-Follow the instrustion provided here: [opea/dataprep](https://github.com/MSCetin37/GenAIComps/blob/main/comps/dataprep/src/README_redis.md)
 
+Follow the instrustion provided here: [opea/dataprep](https://github.com/MSCetin37/GenAIComps/blob/main/comps/dataprep/src/README_redis.md)
 
 #### Embedding Microservice with TEI
-Follow the instrustion provided here: [opea/embedding](https://github.com/MSCetin37/GenAIComps/blob/main/comps/embeddings/src/README_tei.md)
 
+Follow the instrustion provided here: [opea/embedding](https://github.com/MSCetin37/GenAIComps/blob/main/comps/embeddings/src/README_tei.md)
 
 #### LLM text generation Microservice
-Follow the instrustion provided here: [opea/llm-textgen](https://github.com/MSCetin37/GenAIComps/tree/main/comps/llms/src/text-generation)
 
+Follow the instrustion provided here: [opea/llm-textgen](https://github.com/MSCetin37/GenAIComps/tree/main/comps/llms/src/text-generation)
 
 #### Retriever Microservice
-Follow the instrustion provided here: [opea/retriever](https://github.com/MSCetin37/GenAIComps/blob/main/comps/retrievers/src/README_redis.md)
 
+Follow the instrustion provided here: [opea/retriever](https://github.com/MSCetin37/GenAIComps/blob/main/comps/retrievers/src/README_redis.md)
 
 #### Start Redis server
-Follow the instrustion provided here: [redis/redis-stack](https://github.com/MSCetin37/GenAIComps/tree/main/comps/third_parties/redis/src)
-
 
+Follow the instrustion provided here: [redis/redis-stack](https://github.com/MSCetin37/GenAIComps/tree/main/comps/third_parties/redis/src)
 
 ### Validate the MicroServices and MegaService

CodeGen/docker_image_build/build.yaml (2 additions, 2 deletions)

@@ -51,7 +51,7 @@ services:
       context: vllm-fork
       dockerfile: Dockerfile.hpu
     extends: codegen
-    image: ${REGISTRY:-opea}/vllm-gaudi:${TAG:-latest}
+    image: ${REGISTRY:-opea}/vllm-gaudi:${TAG:-latest}
   dataprep:
     build:
       context: GenAIComps
@@ -63,4 +63,4 @@ services:
       context: GenAIComps
      dockerfile: comps/retrievers/src/Dockerfile
     extends: codegen
-    image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
+    image: ${REGISTRY:-opea}/retriever:${TAG:-latest}
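The `image:` fields above rely on compose variable substitution: `${REGISTRY:-opea}` and `${TAG:-latest}` fall back to `opea` and `latest` when the variables are unset. A minimal usage sketch, assuming the checkout location implied by the file path and that the build contexts (e.g. GenAIComps) have been cloned next to build.yaml:

```bash
# Build with the defaults -> images are tagged under the opea/ prefix with :latest.
cd GenAIExamples/CodeGen/docker_image_build   # assumed checkout location of build.yaml
docker compose -f build.yaml build dataprep   # "dataprep" is one of the services shown above

# Override the defaults to target a private registry and a pinned tag.
REGISTRY=registry.example.com/opea TAG=1.2 docker compose -f build.yaml build dataprep
```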

CodeGen/tests/test_compose_on_gaudi.sh (2 additions, 2 deletions)

@@ -70,14 +70,14 @@ function start_services() {
 
 export REDIS_URL="redis://${host_ip}:${REDIS_DB_PORT}"
 export RETRIEVAL_SERVICE_HOST_IP=${host_ip}
-
+
 export EMBEDDING_MODEL_ID="BAAI/bge-base-en-v1.5"
 export TEI_EMBEDDING_HOST_IP=${host_ip}
 export TEI_EMBEDDING_ENDPOINT="http://${host_ip}:${TEI_EMBEDDER_PORT}"
 export DATAPREP_ENDPOINT="http://${host_ip}:${DATAPREP_REDIS_PORT}/v1/dataprep"
 
 export INDEX_NAME="CodeGen"
-
+
 # Start Docker Containers
 docker compose --profile ${compose_profile} up -d | tee ${LOG_PATH}/start_services_with_compose.log
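Both test scripts follow the same pattern: export the endpoint variables the compose file consumes, then bring the stack up under a compose profile. A self-contained sketch of that pattern follows; the port values and the profile name are illustrative assumptions, not values taken from this diff:

```bash
#!/usr/bin/env bash
# Illustrative stand-ins; the real test script derives these elsewhere.
host_ip=$(hostname -I | awk '{print $1}')   # assumption: first local IPv4 address
REDIS_DB_PORT=6379                          # assumption: default Redis port
TEI_EMBEDDER_PORT=8090                      # assumption: example TEI port
DATAPREP_REDIS_PORT=6007                    # assumption: example dataprep port

export REDIS_URL="redis://${host_ip}:${REDIS_DB_PORT}"
export RETRIEVAL_SERVICE_HOST_IP=${host_ip}
export EMBEDDING_MODEL_ID="BAAI/bge-base-en-v1.5"
export TEI_EMBEDDING_HOST_IP=${host_ip}
export TEI_EMBEDDING_ENDPOINT="http://${host_ip}:${TEI_EMBEDDER_PORT}"
export DATAPREP_ENDPOINT="http://${host_ip}:${DATAPREP_REDIS_PORT}/v1/dataprep"
export INDEX_NAME="CodeGen"

# Start the services selected by the profile and keep a startup log
# (run from the directory that holds the compose file).
compose_profile=codegen   # assumption: the real profile is chosen earlier in the test
docker compose --profile "${compose_profile}" up -d | tee start_services_with_compose.log
```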

CodeGen/tests/test_compose_on_xeon.sh (1 addition, 1 deletion)

@@ -71,7 +71,7 @@ function start_services() {
 
 export REDIS_URL="redis://${host_ip}:${REDIS_DB_PORT}"
 export RETRIEVAL_SERVICE_HOST_IP=${host_ip}
-
+
 export EMBEDDING_MODEL_ID="BAAI/bge-base-en-v1.5"
 export TEI_EMBEDDING_HOST_IP=${host_ip}
 export TEI_EMBEDDING_ENDPOINT="http://${host_ip}:${TEI_EMBEDDER_PORT}"
