Update Multimodal Docker File Path #1252

Merged: 6 commits, Dec 17, 2024
2 changes: 1 addition & 1 deletion MultimodalQnA/docker_compose/intel/cpu/xeon/README.md
@@ -124,7 +124,7 @@ docker build --no-cache -t opea/embedding-multimodal:latest --build-arg https_pr
### 2. Build retriever-multimodal-redis Image

```bash
-docker build --no-cache -t opea/retriever-multimodal-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/multimodal/redis/langchain/Dockerfile .
+docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile .
```

### 3. Build LVM Images
9 changes: 5 additions & 4 deletions MultimodalQnA/docker_compose/intel/cpu/xeon/compose.yaml
@@ -74,9 +74,9 @@ services:
MMEI_EMBEDDING_ENDPOINT: ${MMEI_EMBEDDING_ENDPOINT}
MM_EMBEDDING_PORT_MICROSERVICE: ${MM_EMBEDDING_PORT_MICROSERVICE}
restart: unless-stopped
-retriever-multimodal-redis:
-image: ${REGISTRY:-opea}/retriever-multimodal-redis:${TAG:-latest}
-container_name: retriever-multimodal-redis
+retriever-redis:
+image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+container_name: retriever-redis
depends_on:
- redis-vector-db
ports:
@@ -88,6 +88,7 @@
https_proxy: ${https_proxy}
REDIS_URL: ${REDIS_URL}
INDEX_NAME: ${INDEX_NAME}
+BRIDGE_TOWER_EMBEDDING: ${BRIDGE_TOWER_EMBEDDING}
restart: unless-stopped
lvm-llava:
image: ${REGISTRY:-opea}/lvm-llava:${TAG:-latest}
@@ -121,7 +122,7 @@ services:
- redis-vector-db
- dataprep-multimodal-redis
- embedding-multimodal
-- retriever-multimodal-redis
+- retriever-redis
- lvm-llava-svc
- asr
ports:
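The compose change above renames the service to retriever-redis and adds a BRIDGE_TOWER_EMBEDDING variable. As a rough sketch only, with values assumed from the setup_env functions in the test scripts later in this diff, the host environment would be prepared along these lines before bringing the stack up:

```bash
# Assumed values mirroring the test scripts in this PR; adjust for your deployment.
export host_ip="your-host-ip"            # placeholder: the machine's reachable IP
export REDIS_URL="redis://${host_ip}:6379"
export INDEX_NAME="mm-rag-redis"
export BRIDGE_TOWER_EMBEDDING=true       # new variable consumed by retriever-redis

cd MultimodalQnA/docker_compose/intel/cpu/xeon
docker compose -f compose.yaml up -d
```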
2 changes: 1 addition & 1 deletion MultimodalQnA/docker_compose/intel/hpu/gaudi/README.md
@@ -75,7 +75,7 @@ docker build --no-cache -t opea/embedding-multimodal:latest --build-arg https_pr
### 2. Build retriever-multimodal-redis Image

```bash
-docker build --no-cache -t opea/retriever-multimodal-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/multimodal/redis/langchain/Dockerfile .
+docker build --no-cache -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/redis/langchain/Dockerfile .
```

### 3. Build LVM Images
9 changes: 5 additions & 4 deletions MultimodalQnA/docker_compose/intel/hpu/gaudi/compose.yaml
@@ -74,9 +74,9 @@ services:
MMEI_EMBEDDING_ENDPOINT: ${MMEI_EMBEDDING_ENDPOINT}
MM_EMBEDDING_PORT_MICROSERVICE: ${MM_EMBEDDING_PORT_MICROSERVICE}
restart: unless-stopped
-retriever-multimodal-redis:
-image: ${REGISTRY:-opea}/retriever-multimodal-redis:${TAG:-latest}
-container_name: retriever-multimodal-redis
+retriever-redis:
+image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
+container_name: retriever-redis
depends_on:
- redis-vector-db
ports:
@@ -88,6 +88,7 @@
https_proxy: ${https_proxy}
REDIS_URL: ${REDIS_URL}
INDEX_NAME: ${INDEX_NAME}
+BRIDGE_TOWER_EMBEDDING: ${BRIDGE_TOWER_EMBEDDING}
restart: unless-stopped
tgi-gaudi:
image: ghcr.io/huggingface/tgi-gaudi:2.0.6
@@ -138,7 +139,7 @@ services:
- redis-vector-db
- dataprep-multimodal-redis
- embedding-multimodal
-- retriever-multimodal-redis
+- retriever-redis
- lvm-tgi
- asr
ports:
6 changes: 3 additions & 3 deletions MultimodalQnA/docker_image_build/build.yaml
@@ -29,12 +29,12 @@ services:
dockerfile: comps/embeddings/multimodal/multimodal_langchain/Dockerfile
extends: multimodalqna
image: ${REGISTRY:-opea}/embedding-multimodal:${TAG:-latest}
-retriever-multimodal-redis:
+retriever-redis:
build:
context: GenAIComps
-dockerfile: comps/retrievers/multimodal/redis/langchain/Dockerfile
+dockerfile: comps/retrievers/redis/langchain/Dockerfile
extends: multimodalqna
-image: ${REGISTRY:-opea}/retriever-multimodal-redis:${TAG:-latest}
+image: ${REGISTRY:-opea}/retriever-redis:${TAG:-latest}
lvm-llava:
build:
context: GenAIComps
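With build.yaml pointing at the new Dockerfile path, the renamed image can be rebuilt on its own through the same compose-based flow the test scripts use; a minimal sketch, assuming the GenAIComps checkout that build.yaml expects as its build context:

```bash
# Sketch only: mirrors the build_docker_images steps in the test scripts below.
cd MultimodalQnA/docker_image_build
git clone https://github.com/opea-project/GenAIComps.git   # provides the GenAIComps build context
docker compose -f build.yaml build retriever-redis --no-cache
```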
2 changes: 1 addition & 1 deletion MultimodalQnA/multimodalqna.py
@@ -54,7 +54,7 @@ def add_remote_service(self):
name="retriever",
host=MM_RETRIEVER_SERVICE_HOST_IP,
port=MM_RETRIEVER_SERVICE_PORT,
endpoint="/v1/multimodal_retrieval",
endpoint="/v1/retrieval",
use_remote_service=True,
service_type=ServiceType.RETRIEVER,
)
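The megaservice now targets /v1/retrieval rather than /v1/multimodal_retrieval. A hedged sketch of probing the renamed endpoint directly, with the payload shape, port 7000, and 512-dimension embedding all carried over from the validate_service calls in the test scripts below:

```bash
# Sketch: exercise the renamed retriever endpoint by hand.
your_embedding=$(python3 -c "import random; print([random.uniform(-1, 1) for _ in range(512)])")
curl -s "http://${host_ip}:7000/v1/retrieval" \
  -H "Content-Type: application/json" \
  -d "{\"text\":\"test\",\"embedding\":${your_embedding}}"
```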
11 changes: 6 additions & 5 deletions MultimodalQnA/tests/test_compose_on_gaudi.sh
@@ -22,7 +22,7 @@ function build_docker_images() {
cd $WORKPATH/docker_image_build
git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../
echo "Build all the images with --no-cache, check docker_image_build.log for details..."
service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding-multimodal retriever-multimodal-redis lvm-tgi dataprep-multimodal-redis whisper asr"
service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding-multimodal retriever-redis lvm-tgi dataprep-multimodal-redis whisper asr"
docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log

docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6
@@ -41,6 +41,7 @@ function setup_env() {
export REDIS_URL="redis://${host_ip}:6379"
export REDIS_HOST=${host_ip}
export INDEX_NAME="mm-rag-redis"
+export BRIDGE_TOWER_EMBEDDING=true
export LLAVA_SERVER_PORT=8399
export LVM_ENDPOINT="http://${host_ip}:8399"
export EMBEDDING_MODEL_ID="BridgeTower/bridgetower-large-itm-mlm-itc"
@@ -192,13 +193,13 @@ function validate_microservices() {
sleep 1m

# multimodal retrieval microservice
echo "Validating retriever-multimodal-redis"
echo "Validating retriever-redis"
your_embedding=$(python3 -c "import random; embedding = [random.uniform(-1, 1) for _ in range(512)]; print(embedding)")
validate_service \
"http://${host_ip}:7000/v1/multimodal_retrieval" \
"http://${host_ip}:7000/v1/retrieval" \
"retrieved_docs" \
"retriever-multimodal-redis" \
"retriever-multimodal-redis" \
"retriever-redis" \
"retriever-redis" \
"{\"text\":\"test\",\"embedding\":${your_embedding}}"

sleep 3m
11 changes: 6 additions & 5 deletions MultimodalQnA/tests/test_compose_on_xeon.sh
@@ -22,7 +22,7 @@ function build_docker_images() {
cd $WORKPATH/docker_image_build
git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../
echo "Build all the images with --no-cache, check docker_image_build.log for details..."
service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding-multimodal retriever-multimodal-redis lvm-llava lvm-llava-svc dataprep-multimodal-redis whisper asr"
service_list="multimodalqna multimodalqna-ui embedding-multimodal-bridgetower embedding-multimodal retriever-redis lvm-llava lvm-llava-svc dataprep-multimodal-redis whisper asr"
docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log

docker images && sleep 1m
@@ -39,6 +39,7 @@ function setup_env() {
export REDIS_URL="redis://${host_ip}:6379"
export REDIS_HOST=${host_ip}
export INDEX_NAME="mm-rag-redis"
+export BRIDGE_TOWER_EMBEDDING=true
export LLAVA_SERVER_PORT=8399
export LVM_ENDPOINT="http://${host_ip}:8399"
export LVM_MODEL_ID="llava-hf/llava-1.5-7b-hf"
@@ -190,13 +191,13 @@ function validate_microservices() {
sleep 1m

# multimodal retrieval microservice
echo "Validating retriever-multimodal-redis"
echo "Validating retriever-redis"
your_embedding=$(python3 -c "import random; embedding = [random.uniform(-1, 1) for _ in range(512)]; print(embedding)")
validate_service \
"http://${host_ip}:7000/v1/multimodal_retrieval" \
"http://${host_ip}:7000/v1/retrieval" \
"retrieved_docs" \
"retriever-multimodal-redis" \
"retriever-multimodal-redis" \
"retriever-redis" \
"retriever-redis" \
"{\"text\":\"test\",\"embedding\":${your_embedding}}"

sleep 3m
1 change: 0 additions & 1 deletion docker_images_list.md
@@ -91,7 +91,6 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
| [opea/reranking-langchain-mosec-endpoint](https://hub.docker.com/r/opea/reranking-langchain-mosec-endpoint) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/reranks/mosec/langchain/dependency/Dockerfile) | The docker image exposed the OPEA mosec reranking endpoint microservice base on Langchain framework for GenAI application use |
| [opea/reranking-tei](https://hub.docker.com/r/opea/reranking-tei) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/reranks/tei/Dockerfile) | The docker image exposed the OPEA reranking microservice based on tei docker image for GenAI application use |
| [opea/retriever-milvus](https://hub.docker.com/r/opea/retriever-milvus) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/milvus/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on milvus vectordb for GenAI application use |
-| [opea/retriever-multimodal-redis](https://hub.docker.com/r/opea/retriever-multimodal-redis) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/multimodal/redis/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on multimodal redis vectordb for GenAI application use |
| [opea/retriever-pathway](https://hub.docker.com/r/opea/retriever-pathway) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pathway/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice with pathway for GenAI application use |
| [opea/retriever-pgvector](https://hub.docker.com/r/opea/retriever-pgvector) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pgvector/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on pgvector vectordb for GenAI application use |
| [opea/retriever-pinecone](https://hub.docker.com/r/opea/retriever-pinecone) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pinecone/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on pinecone vectordb for GenAI application use |