Commit

Auto-generated API code (#2625)
elasticmachine authored Feb 24, 2025
1 parent 25e8e84 commit a003078
Showing 3 changed files with 21 additions and 5 deletions.
10 changes: 5 additions & 5 deletions docs/reference.asciidoc
@@ -4047,8 +4047,8 @@ client.ilm.start({ ... })
==== Arguments

* *Request (object):*
-** *`master_timeout` (Optional, string | -1 | 0)*
-** *`timeout` (Optional, string | -1 | 0)*
+** *`master_timeout` (Optional, string | -1 | 0)*: Explicit operation timeout for connection to master node
+** *`timeout` (Optional, string | -1 | 0)*: Explicit operation timeout

[discrete]
==== stop
@@ -4069,8 +4069,8 @@ client.ilm.stop({ ... })
==== Arguments

* *Request (object):*
-** *`master_timeout` (Optional, string | -1 | 0)*
-** *`timeout` (Optional, string | -1 | 0)*
+** *`master_timeout` (Optional, string | -1 | 0)*: Explicit operation timeout for connection to master node
+** *`timeout` (Optional, string | -1 | 0)*: Explicit operation timeout

[discrete]
=== indices
@@ -5765,7 +5765,7 @@ client.inference.put({ inference_id })
* *Request (object):*
** *`inference_id` (string)*: The inference Id
** *`task_type` (Optional, Enum("sparse_embedding" | "text_embedding" | "rerank" | "completion"))*: The task type
-** *`inference_config` (Optional, { service, service_settings, task_settings })*
+** *`inference_config` (Optional, { chunking_settings, service, service_settings, task_settings })*

[discrete]
==== stream_inference
8 changes: 8 additions & 0 deletions src/api/types.ts
@@ -12501,7 +12501,15 @@ export type InferenceDenseByteVector = byte[]

export type InferenceDenseVector = float[]

+export interface InferenceInferenceChunkingSettings extends InferenceInferenceEndpoint {
+  max_chunk_size?: integer
+  overlap?: integer
+  sentence_overlap?: integer
+  strategy?: string
+}
+
 export interface InferenceInferenceEndpoint {
+  chunking_settings?: InferenceInferenceChunkingSettings
   service: string
   service_settings: InferenceServiceSettings
   task_settings?: InferenceTaskSettings
8 changes: 8 additions & 0 deletions src/api/typesWithBodyKey.ts
@@ -12725,7 +12725,15 @@ export type InferenceDenseByteVector = byte[]

export type InferenceDenseVector = float[]

+export interface InferenceInferenceChunkingSettings extends InferenceInferenceEndpoint {
+  max_chunk_size?: integer
+  overlap?: integer
+  sentence_overlap?: integer
+  strategy?: string
+}
+
 export interface InferenceInferenceEndpoint {
+  chunking_settings?: InferenceInferenceChunkingSettings
   service: string
   service_settings: InferenceServiceSettings
   task_settings?: InferenceTaskSettings
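The change above exposes a new optional `chunking_settings` object on the inference endpoint configuration. A minimal sketch of how a caller might populate it when creating an inference endpoint follows; the interfaces are local stand-ins mirroring the generated types (the client's `integer` alias is widened to `number`), and the service name, model id, and chunking values are illustrative assumptions, not taken from this commit.

```typescript
// Local stand-ins for the generated types (sketch only, not the client's own declarations).
interface InferenceChunkingSettings {
  max_chunk_size?: number
  overlap?: number
  sentence_overlap?: number
  strategy?: string
}

interface InferenceEndpoint {
  chunking_settings?: InferenceChunkingSettings
  service: string
  service_settings: Record<string, unknown>
  task_settings?: Record<string, unknown>
}

// Hypothetical endpoint configuration; model id and values are assumptions.
const inferenceConfig: InferenceEndpoint = {
  service: 'elasticsearch',
  service_settings: { model_id: '.multilingual-e5-small' },
  chunking_settings: {
    strategy: 'sentence',   // chunk text on sentence boundaries
    max_chunk_size: 250,    // upper bound on chunk size
    sentence_overlap: 1     // carry one sentence over into the next chunk
  }
}

// With a real client instance this would be passed as the request body, e.g.:
// await client.inference.put({ inference_id: 'my-e5',
//   task_type: 'text_embedding', inference_config: inferenceConfig })
console.log(inferenceConfig.chunking_settings?.strategy)
```

Because every field of `chunking_settings` is optional, existing callers that omit it continue to compile unchanged.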
