Commit 05365b6

FaqGen param fix (#1277)
Signed-off-by: Xinyao Wang <xinyao.wang@intel.com>
1 parent fd706d1 commit 05365b6

4 files changed: 6 additions, 6 deletions
FaqGen/docker_compose/intel/cpu/xeon/README.md

Lines changed: 1 addition & 1 deletion

@@ -119,7 +119,7 @@ docker compose up -d
   -H "Content-Type: multipart/form-data" \
   -F "messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5." \
   -F "max_tokens=32" \
-  -F "stream=false"
+  -F "stream=False"
 ```
 
 Following the validation of all aforementioned microservices, we are now prepared to construct a mega-service.

FaqGen/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 1 addition & 1 deletion

@@ -120,7 +120,7 @@ docker compose up -d
   -H "Content-Type: multipart/form-data" \
   -F "messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5." \
   -F "max_tokens=32" \
-  -F "stream=false"
+  -F "stream=False"
 ```
 
 ## 🚀 Launch the UI
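Both README hunks change only the tail of the same validation command, so the start of the command (including the service URL) is not visible here. A minimal sketch of how the corrected fields fit into a complete request is below; `FAQGEN_ENDPOINT` is a placeholder rather than a value taken from the diff, and the host/port shown are only assumptions for illustration.

```bash
# Sketch of the full validation request with the corrected stream field.
# FAQGEN_ENDPOINT is a placeholder: the hunks above do not show the URL line,
# so substitute the endpoint from your README/deployment.
FAQGEN_ENDPOINT="http://${host_ip}:9000/v1/faqgen"   # assumed value for illustration

curl "$FAQGEN_ENDPOINT" \
  -X POST \
  -H "Content-Type: multipart/form-data" \
  -F "messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5." \
  -F "max_tokens=32" \
  -F "stream=False"
```
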

FaqGen/tests/test_compose_on_gaudi.sh

Lines changed: 2 additions & 2 deletions

@@ -107,11 +107,11 @@ function validate_megaservice() {
     local EXPECTED_RESULT="Embeddings"
     local INPUT_DATA="messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."
     local URL="${ip_address}:8888/v1/faqgen"
-    local HTTP_STATUS=$(curl -s -o /dev/null -w "%{http_code}" -X POST -F "$INPUT_DATA" -H 'Content-Type: multipart/form-data' "$URL")
+    local HTTP_STATUS=$(curl -s -o /dev/null -w "%{http_code}" -X POST -F "$INPUT_DATA" -F "max_tokens=32" -F "stream=False" -H 'Content-Type: multipart/form-data' "$URL")
     if [ "$HTTP_STATUS" -eq 200 ]; then
         echo "[ $SERVICE_NAME ] HTTP status is 200. Checking content..."
 
-        local CONTENT=$(curl -s -X POST -F "$INPUT_DATA" -H 'Content-Type: multipart/form-data' "$URL" | tee ${LOG_PATH}/${SERVICE_NAME}.log)
+        local CONTENT=$(curl -s -X POST -F "$INPUT_DATA" -F "max_tokens=32" -F "stream=False" -H 'Content-Type: multipart/form-data' "$URL" | tee ${LOG_PATH}/${SERVICE_NAME}.log)
 
         if echo "$CONTENT" | grep -q "$EXPECTED_RESULT"; then
             echo "[ $SERVICE_NAME ] Content is as expected."

FaqGen/tests/test_compose_on_xeon.sh

Lines changed: 2 additions & 2 deletions

@@ -107,11 +107,11 @@ function validate_megaservice() {
     local EXPECTED_RESULT="Embeddings"
     local INPUT_DATA="messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."
     local URL="${ip_address}:8888/v1/faqgen"
-    local HTTP_STATUS=$(curl -s -o /dev/null -w "%{http_code}" -X POST -F "$INPUT_DATA" -H 'Content-Type: multipart/form-data' "$URL")
+    local HTTP_STATUS=$(curl -s -o /dev/null -w "%{http_code}" -X POST -F "$INPUT_DATA" -F "max_tokens=32" -F "stream=False" -H 'Content-Type: multipart/form-data' "$URL")
     if [ "$HTTP_STATUS" -eq 200 ]; then
         echo "[ $SERVICE_NAME ] HTTP status is 200. Checking content..."
 
-        local CONTENT=$(curl -s -X POST -F "$INPUT_DATA" -H 'Content-Type: multipart/form-data' "$URL" | tee ${LOG_PATH}/${SERVICE_NAME}.log)
+        local CONTENT=$(curl -s -X POST -F "$INPUT_DATA" -F "max_tokens=32" -F "stream=False" -H 'Content-Type: multipart/form-data' "$URL" | tee ${LOG_PATH}/${SERVICE_NAME}.log)
 
         if echo "$CONTENT" | grep -q "$EXPECTED_RESULT"; then
             echo "[ $SERVICE_NAME ] Content is as expected."
