[DOCS] Adds explanation to the example.
szabosteve committed Oct 31, 2024
1 parent dae7544 commit 0bfc242
Showing 1 changed file with 7 additions and 3 deletions.
10 changes: 7 additions & 3 deletions docs/reference/ingest/processors/inference.asciidoc
@@ -771,8 +771,10 @@ PUT _ingest/pipeline/query_helper_pipeline
]
}
--------------------------------------------------
// NOTCONSOLE
// TEST[skip: An inference endpoint is required.]
<1> The `prompt` field contains the prompt for the completion task.
We use <<modules-scripting-painless,Painless>> to construct the prompt.
With `+ ctx.content`, we append the natural language input to the generated prompt.
<2> The ID of the {infer} endpoint created beforehand, which uses the <<infer-service-openai,`openai` service>> with the `completion` task type.
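
The diff shows only the tail of the `PUT` request. As a hedged sketch only, the prompt-building part of such a pipeline could pair a `script` processor that sets `ctx.prompt` with an {infer} processor that maps it through `input_output`. The instruction text, the endpoint ID `openai_chat_completions`, and the `query` output field below are illustrative assumptions, not the exact content of the documentation example:

[source,console]
--------------------------------------------------
PUT _ingest/pipeline/query_helper_pipeline
{
  "processors": [
    {
      "script": {
        "source": "ctx.prompt = 'Generate an Elasticsearch query for the following request: ' + ctx.content" <1>
      }
    },
    {
      "inference": {
        "model_id": "openai_chat_completions", <2>
        "input_output": {
          "input_field": "prompt",
          "output_field": "query" <3>
        }
      }
    }
  ]
}
--------------------------------------------------
// NOTCONSOLE
<1> A Painless script that builds the prompt and appends the natural language input from `ctx.content`. The instruction text is a placeholder.
<2> A hypothetical endpoint ID; substitute the ID of the `completion` {infer} endpoint created beforehand.
<3> The field that receives the completion result; the field name is an assumption for this sketch.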


@@ -783,13 +785,15 @@ POST _ingest/pipeline/query_helper_pipeline/_simulate
"docs": [
{
"_source": {
"content": "artificial intelligence in medicine articles published in the last 12 months"
"content": "artificial intelligence in medicine articles published in the last 12 months" <1>
}
}
]
}
--------------------------------------------------
// NOTCONSOLE
// TEST[skip: An inference processor with an inference endpoint is required.]
<1> The natural language query used to generate an {es} query. It is incorporated into the prompt scripted in the {infer} processor.
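
Beyond the `_simulate` preview, an ingest pipeline like this one is normally applied at indexing time by passing its name in the `pipeline` query parameter. A minimal usage sketch follows; the index name `my-index` is hypothetical and not part of the documentation example:

[source,console]
--------------------------------------------------
POST my-index/_doc?pipeline=query_helper_pipeline
{
  "content": "artificial intelligence in medicine articles published in the last 12 months"
}
--------------------------------------------------
// NOTCONSOLE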


[discrete]
[[infer-proc-readings]]
