New Productivity Suite React UI and Bug Fixes #1834

Merged
merged 36 commits into main from new-ps-react-ui
Apr 21, 2025
36 commits
86f70d9
New React UI for ProductivitySuite which contains ChatQnA, Odcument S…
sgurunat Apr 15, 2025
db5e764
updated docker compose, set_env and README of ProductivitySuite
sgurunat Apr 15, 2025
8adcee6
removed FaqGen in README of ProductivitySuite UI
sgurunat Apr 15, 2025
e39cd4f
Removed FaqGen in env.production and README of ProductivitySuite UI
sgurunat Apr 15, 2025
43e10a8
Fixed Docsum, CodeGen and Data Management issues. Removed material ui…
Apr 17, 2025
58c78f0
resolved merge conflict with main branch
sgurunat Apr 17, 2025
c0c9e16
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 17, 2025
77886ce
Fixed prettier issue
sgurunat Apr 17, 2025
bee944b
resolved merged conflict
sgurunat Apr 17, 2025
2c8d90f
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 17, 2025
7aa6790
fixed prettier issue
sgurunat Apr 17, 2025
ce734ff
fixed merge conflict
sgurunat Apr 17, 2025
6da1825
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 17, 2025
abc066c
Fixed copilot raised issue and codespell issue
sgurunat Apr 17, 2025
0a8296b
FIxed CI issue related to compose file
sgurunat Apr 17, 2025
f07315b
Merge branch 'main' into new-ps-react-ui
sgurunat Apr 17, 2025
044cb1d
Updated build.yaml in ProductivitySuite to include docsum and whisper…
sgurunat Apr 17, 2025
ade4bf6
updated build.yaml in ProductivitySuite
sgurunat Apr 17, 2025
b2d3370
Merge branch 'main' into new-ps-react-ui
sgurunat Apr 17, 2025
f2e7696
Removed Whisper section in build.yaml of ProductivitySuite
sgurunat Apr 17, 2025
682184c
Merge branch 'main' into new-ps-react-ui
sgurunat Apr 17, 2025
0b35f95
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 17, 2025
6ff0d24
Fixed prettier issue in ProductivitySuite
sgurunat Apr 17, 2025
152b6dc
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Apr 17, 2025
ab140fe
Merge branch 'main' into new-ps-react-ui
sgurunat Apr 17, 2025
5ac0c1d
Fixed CI issue on docker compose file
sgurunat Apr 17, 2025
2193996
added whisper in build.yaml
sgurunat Apr 17, 2025
31df27a
Added docsum to the build.yaml in ProductivitySuite
sgurunat Apr 17, 2025
998fdd4
Fixing CI issue
sgurunat Apr 17, 2025
942c4c8
Merge branch 'main' into new-ps-react-ui
sgurunat Apr 17, 2025
04136ea
Merge branch 'main' into new-ps-react-ui
sgurunat Apr 21, 2025
d68b441
Updated test_compose_on_xeon.sh file in ProductivitySuite
sgurunat Apr 21, 2025
5828296
Merge branch 'main' into new-ps-react-ui
sgurunat Apr 21, 2025
20abdce
Updated test compose file, README and model_config file in Productivi…
sgurunat Apr 21, 2025
06a0c84
Added sample test file for ProductivitySuite UI
sgurunat Apr 21, 2025
2a4466c
Merge branch 'main' into new-ps-react-ui
sgurunat Apr 21, 2025
16 changes: 2 additions & 14 deletions ProductivitySuite/README.md
@@ -38,10 +38,7 @@ flowchart LR
direction LR
LLM_CG([LLM MicroService]):::blue
end
subgraph FaqGen-MegaService["FaqGen MegaService "]
direction LR
LLM_F([LLM MicroService]):::blue
end

subgraph UserInterface[" User Interface "]
direction LR
a([User Input Query]):::orchid
@@ -63,7 +60,7 @@ flowchart LR
LLM_gen_CG{{LLM Service <br>}}
GW_CG([CodeGen GateWay<br>]):::orange
LLM_gen_F{{LLM Service <br>}}
GW_F([FaqGen GateWay<br>]):::orange

PR([Prompt Registry MicroService]):::blue
CH([Chat History MicroService]):::blue
MDB{{Mongo DB<br><br>}}
@@ -118,11 +115,6 @@ flowchart LR
direction LR
LLM_CG <-.-> LLM_gen_CG

%% Questions interaction
direction LR
UI --> GW_F
GW_F <==> FaqGen-MegaService


%% Embedding service flow
direction LR
@@ -158,10 +150,6 @@ Engage in intelligent conversations with your documents using our advanced **Ret

Summarize lengthy documents or articles, enabling you to grasp key takeaways quickly. Save time and effort with our intelligent summarization feature!

### ❓ FAQ Generation

Effortlessly create comprehensive FAQs based on your documents. Ensure your users have access to the information they need with minimal effort!

### 💻 Code Generation

Boost your coding productivity by providing a description of the functionality you require. Our application generates corresponding code snippets, saving you valuable time and effort!
Binary file modified ProductivitySuite/assets/img/Login_page.png
Binary file modified ProductivitySuite/assets/img/chat_qna_init.png
Binary file modified ProductivitySuite/assets/img/chatqna_with_conversation.png
Binary file modified ProductivitySuite/assets/img/codegen.png
Binary file modified ProductivitySuite/assets/img/data_source.png
Binary file added ProductivitySuite/assets/img/doc_summary.png
160 changes: 46 additions & 114 deletions ProductivitySuite/docker_compose/intel/cpu/xeon/README.md
@@ -108,68 +108,46 @@ Since the `compose.yaml` will consume some environment variables, you need to se
export host_ip="External_Public_IP"
```

**Export the value of your Huggingface API token to the `your_hf_api_token` environment variable**
**Export the value of your Huggingface API token to the `HUGGINGFACEHUB_API_TOKEN` environment variable**

> Replace Your_Huggingface_API_Token below with your actual Huggingface API token value

```
export your_hf_api_token="Your_Huggingface_API_Token"
export HUGGINGFACEHUB_API_TOKEN="Your_Huggingface_API_Token"
```

**Append the value of the public IP address to the no_proxy list**

```
export your_no_proxy=${your_no_proxy},"External_Public_IP"
export no_proxy=${no_proxy},"External_Public_IP"
```

```bash
export MONGO_HOST=${host_ip}
export MONGO_PORT=27017
export DB_NAME="test"
export COLLECTION_NAME="Conversations"
export DB_NAME="opea"
export EMBEDDING_MODEL_ID="BAAI/bge-base-en-v1.5"
export RERANK_MODEL_ID="BAAI/bge-reranker-base"
export LLM_MODEL_ID="Intel/neural-chat-7b-v3-3"
export LLM_MODEL_ID_CODEGEN="meta-llama/CodeLlama-7b-hf"
export TEI_EMBEDDING_ENDPOINT="http://${host_ip}:6006"
export TEI_RERANKING_ENDPOINT="http://${host_ip}:8808"
export TGI_LLM_ENDPOINT="http://${host_ip}:9009"
export REDIS_URL="redis://${host_ip}:6379"
export INDEX_NAME="rag-redis"
export HUGGINGFACEHUB_API_TOKEN=${your_hf_api_token}
export MEGA_SERVICE_HOST_IP=${host_ip}
export EMBEDDING_SERVICE_HOST_IP=${host_ip}
export RETRIEVER_SERVICE_HOST_IP=${host_ip}
export RERANK_SERVICE_HOST_IP=${host_ip}
export LLM_SERVICE_HOST_IP=${host_ip}
export LLM_SERVICE_HOST_IP_DOCSUM=${host_ip}
export LLM_SERVICE_HOST_IP_FAQGEN=${host_ip}
export LLM_SERVICE_HOST_IP_CODEGEN=${host_ip}
export LLM_SERVICE_HOST_IP_CHATQNA=${host_ip}
export TGI_LLM_ENDPOINT_CHATQNA="http://${host_ip}:9009"
export TGI_LLM_ENDPOINT_CODEGEN="http://${host_ip}:8028"
export TGI_LLM_ENDPOINT_FAQGEN="http://${host_ip}:9009"
export TGI_LLM_ENDPOINT_DOCSUM="http://${host_ip}:9009"
export HUGGINGFACEHUB_API_TOKEN=${HUGGINGFACEHUB_API_TOKEN}
export BACKEND_SERVICE_ENDPOINT_CHATQNA="http://${host_ip}:8888/v1/chatqna"
export DATAPREP_DELETE_FILE_ENDPOINT="http://${host_ip}:5000/v1/dataprep/delete"
export DATAPREP_DELETE_FILE_ENDPOINT="http://${host_ip}:6007/v1/dataprep/delete"
export BACKEND_SERVICE_ENDPOINT_CODEGEN="http://${host_ip}:7778/v1/codegen"
export BACKEND_SERVICE_ENDPOINT_DOCSUM="http://${host_ip}:8890/v1/docsum"
export DATAPREP_SERVICE_ENDPOINT="http://${host_ip}:5000/v1/dataprep/ingest"
export DATAPREP_GET_FILE_ENDPOINT="http://${host_ip}:5000/v1/dataprep/get"
export DATAPREP_SERVICE_ENDPOINT="http://${host_ip}:6007/v1/dataprep/ingest"
export DATAPREP_GET_FILE_ENDPOINT="http://${host_ip}:6007/v1/dataprep/get"
export CHAT_HISTORY_CREATE_ENDPOINT="http://${host_ip}:6012/v1/chathistory/create"
export CHAT_HISTORY_DELETE_ENDPOINT="http://${host_ip}:6012/v1/chathistory/delete"
export CHAT_HISTORY_GET_ENDPOINT="http://${host_ip}:6012/v1/chathistory/get"
export PROMPT_SERVICE_GET_ENDPOINT="http://${host_ip}:6018/v1/prompt/get"
export PROMPT_SERVICE_CREATE_ENDPOINT="http://${host_ip}:6018/v1/prompt/create"
export PROMPT_SERVICE_DELETE_ENDPOINT="http://${host_ip}:6018/v1/prompt/delete"
export KEYCLOAK_SERVICE_ENDPOINT="http://${host_ip}:8080"
export LLM_SERVICE_HOST_PORT_FAQGEN=9002
export LLM_SERVICE_HOST_PORT_CODEGEN=9001
export LLM_SERVICE_HOST_PORT_DOCSUM=9003
export PROMPT_COLLECTION_NAME="prompt"
export RERANK_SERVER_PORT=8808
export EMBEDDING_SERVER_PORT=6006
export LLM_SERVER_PORT=9009
export DocSum_COMPONENT_NAME="OpeaDocSumTgi"

#Set no proxy
export no_proxy="$no_proxy,tgi_service_codegen,llm_codegen,tei-embedding-service,tei-reranking-service,chatqna-xeon-backend-server,retriever,tgi-service,redis-vector-db,whisper,llm-docsum-tgi,docsum-xeon-backend-server,mongo,codegen"
```

Note: Please replace `host_ip` with your external IP address; do not use localhost.
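
Once these variables are exported, the stack can be brought up from the compose directory. A minimal sketch, assuming the directory layout shown in this PR and a standard Docker Compose invocation:

```bash
# Sketch: start the ProductivitySuite services on Xeon after exporting the
# environment variables above. Adjust the path to match your checkout.
cd ProductivitySuite/docker_compose/intel/cpu/xeon
docker compose -f compose.yaml up -d
```
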
@@ -203,16 +181,7 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det
-H 'Content-Type: application/json'
```

2. Embedding Microservice

```bash
curl http://${host_ip}:6000/v1/embeddings\
-X POST \
-d '{"text":"hello"}' \
-H 'Content-Type: application/json'
```

3. Retriever Microservice
2. Retriever Microservice

To consume the retriever microservice, you need to generate a mock embedding vector with a Python script. The length of the embedding vector is determined by the embedding model.
@@ -222,13 +191,13 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det

```bash
export your_embedding=$(python3 -c "import random; embedding = [random.uniform(-1, 1) for _ in range(768)]; print(embedding)")
curl http://${host_ip}:7000/v1/retrieval \
curl http://${host_ip}:7001/v1/retrieval \
-X POST \
-d "{\"text\":\"test\",\"embedding\":${your_embedding}}" \
-H 'Content-Type: application/json'
```

4. TEI Reranking Service
3. TEI Reranking Service

```bash
curl http://${host_ip}:8808/rerank \
@@ -237,16 +206,7 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det
-H 'Content-Type: application/json'
```

5. Reranking Microservice

```bash
curl http://${host_ip}:8000/v1/reranking\
-X POST \
-d '{"initial_query":"What is Deep Learning?", "retrieved_docs": [{"text":"Deep Learning is not..."}, {"text":"Deep learning is..."}]}' \
-H 'Content-Type: application/json'
```

6. LLM backend Service (ChatQnA, DocSum, FAQGen)
4. LLM backend Service (ChatQnA, DocSum)

```bash
curl http://${host_ip}:9009/generate \
@@ -255,7 +215,7 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det
-H 'Content-Type: application/json'
```

7. LLM backend Service (CodeGen)
5. LLM backend Service (CodeGen)

```bash
curl http://${host_ip}:8028/generate \
@@ -264,67 +224,50 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det
-H 'Content-Type: application/json'
```

8. ChatQnA LLM Microservice
6. CodeGen LLM Microservice

```bash
curl http://${host_ip}:9000/v1/chat/completions\
curl http://${host_ip}:9001/v1/chat/completions\
-X POST \
-d '{"query":"What is Deep Learning?","max_tokens":17,"top_k":10,"top_p":0.95,"typical_p":0.95,"temperature":0.01,"repetition_penalty":1.03,"stream":true}' \
-d '{"query":"def print_hello_world():"}' \
-H 'Content-Type: application/json'
```

9. CodeGen LLM Microservice
7. DocSum LLM Microservice

```bash
curl http://${host_ip}:9001/v1/chat/completions\
curl http://${host_ip}:9003/v1/docsum\
-X POST \
-d '{"query":"def print_hello_world():"}' \
-d '{"messages":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5", "type": "text"}' \
-H 'Content-Type: application/json'
```

10. DocSum LLM Microservice

```bash
curl http://${host_ip}:9003/v1/docsum\
-X POST \
-d '{"query":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5"}' \
-H 'Content-Type: application/json'
```

11. FAQGen LLM Microservice
8. ChatQnA MegaService

```bash
curl http://${host_ip}:9002/v1/faqgen\
-X POST \
-d '{"query":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5"}' \
-H 'Content-Type: application/json'
```

12. ChatQnA MegaService

```bash
curl http://${host_ip}:8888/v1/chatqna -H "Content-Type: application/json" -d '{
"messages": "What is the revenue of Nike in 2023?"
}'
```

13. DocSum MegaService
9. DocSum MegaService

```bash
curl http://${host_ip}:8890/v1/docsum -H "Content-Type: application/json" -d '{
"messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."
}'
```
```bash
curl http://${host_ip}:8890/v1/docsum -H "Content-Type: application/json" -d '{
"messages": "Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5.",
"type": "text"
}'
```

14. CodeGen MegaService
10. CodeGen MegaService

```bash
curl http://${host_ip}:7778/v1/codegen -H "Content-Type: application/json" -d '{
"messages": "def print_hello_world():"
}'
```

15. Dataprep Microservice
11. Dataprep Microservice

If you want to update the default knowledge base, you can use the following commands:
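
For example, a minimal ingest sketch, assuming the `DATAPREP_SERVICE_ENDPOINT` exported earlier and a multipart form field named `files` (the file name below is hypothetical):

```bash
# Hypothetical ingest example: upload a local document to the dataprep service.
# DATAPREP_SERVICE_ENDPOINT is the ingest route exported in the environment setup above.
curl -X POST "${DATAPREP_SERVICE_ENDPOINT}" \
  -H "Content-Type: multipart/form-data" \
  -F "files=@./sample_document.pdf"
```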

@@ -374,13 +317,13 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det
-H "Content-Type: application/json"
```

16. Prompt Registry Microservice
12. Prompt Registry Microservice

If you want to update the default Prompts in the application for your user, you can use the following commands:

```bash
curl -X 'POST' \
http://{host_ip}:6018/v1/prompt/create \
"http://${host_ip}:6018/v1/prompt/create" \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
Expand All @@ -392,14 +335,14 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det

```bash
curl -X 'POST' \
http://{host_ip}:6018/v1/prompt/get \
"http://${host_ip}:6018/v1/prompt/get" \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"user": "test"}'

curl -X 'POST' \
http://{host_ip}:6018/v1/prompt/get \
"http://${host_ip}:6018/v1/prompt/get" \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
Expand All @@ -410,14 +353,14 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det

```bash
curl -X 'POST' \
http://{host_ip}:6018/v1/prompt/delete \
"http://${host_ip}:6018/v1/prompt/delete" \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"user": "test", "prompt_id":"{prompt_id to be deleted}"}'
```

17. Chat History Microservice
13. Chat History Microservice

To validate the Chat History Microservice, you can use the following commands.
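
For instance, a sketch of creating a chat history record, assuming the `CHAT_HISTORY_CREATE_ENDPOINT` exported earlier; the request body shown is an assumed schema:

```bash
# Hypothetical example: store a conversation for user "test" via the chat
# history service exported above (port 6012). The body fields are assumptions.
curl -X 'POST' \
  "http://${host_ip}:6012/v1/chathistory/create" \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "data": {
    "messages": "What is Deep Learning?",
    "user": "test"
  }
}'
```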

@@ -527,15 +470,4 @@ Here're some of the project's features:

#### Screenshots

![project-screenshot](../../../../assets/img/doc_summary_paste.png)
![project-screenshot](../../../../assets/img/doc_summary_file.png)

### ❓ FAQ Generator

- **Generate FAQs from Text via Pasting**: Paste the text to into the text box, then click 'Generate FAQ' to produce a condensed FAQ of the content, which will be displayed in the 'FAQ' box below.

- **Generate FAQs from Text via txt file Upload**: Upload the file in the Upload bar, then click 'Generate FAQ' to produce a condensed FAQ of the content, which will be displayed in the 'FAQ' box below.

#### Screenshots

![project-screenshot](../../../../assets/img/faq_generator.png)
![project-screenshot](../../../../assets/img/doc_summary.png)