README.md (+4 −4)
@@ -268,12 +268,12 @@ The options are as follows:
 |`llm.WithTopK(uint64)`| Yes | Yes | No | - | Reduces the probability of generating nonsense. A higher value (e.g. 100) will give more diverse answers, while a lower value (e.g. 10) will be more conservative. |
 |`llm.WithMaxTokens(uint64)`| No | Yes | Yes | - | The maximum number of tokens to generate in the response. |
 |`llm.WithStream(func(llm.Completion))`| Can be enabled when tools are not used | Yes | Yes | - | Stream the response to a function. |
-|`llm.WithToolChoice(string, string, ...)`| No |Yes| Use `auto`, `any`, `none`, `required` or a function name. Only the first argument is used. | - | The tool to use for the model. |
+|`llm.WithToolChoice(string, string, ...)`| No |Use `auto`, `any` or a function name. Only the first argument is used.| Use `auto`, `any`, `none`, `required` or a function name. Only the first argument is used. | - | The tool to use for the model. |
 |`llm.WithToolKit(llm.ToolKit)`| Cannot be combined with streaming | Yes | Yes | - | The set of tools to use. |
 |`llm.WithStopSequence(string, string, ...)`| Yes | Yes | Yes | - | Stop generation if one of these tokens is detected. |
 |`llm.WithSystemPrompt(string)`| No | Yes | Yes | - | Set the system prompt for the model. |
-|`llm.WithSeed(uint64)`| Yes |Yes| Yes | - | The seed to use for random sampling. If set, different calls will generate deterministic results. |
-|`llm.WithFormat(string)`| Use `json`|Yes| Use `json_format` or `text`| - | The format of the response. For Mistral, you must also instruct the model to produce JSON yourself with a system or a user message. |
+|`llm.WithSeed(uint64)`| Yes |No| Yes | - | The seed to use for random sampling. If set, different calls will generate deterministic results. |
+|`llm.WithFormat(string)`| Use `json`|No| Use `json_format` or `text`| - | The format of the response. For Mistral, you must also instruct the model to produce JSON yourself with a system or a user message. |
 |`llm.WithPresencePenalty(float64)`| Yes | No | Yes | - | Determines how much the model penalizes the repetition of words or phrases. A higher presence penalty encourages the model to use a wider variety of words and phrases, making the output more diverse and creative. |
 |`llm.WithFequencyPenalty(float64)`| Yes | No | Yes | - | Penalizes the repetition of words based on their frequency in the generated text. A higher frequency penalty discourages the model from repeating words that have already appeared frequently in the output, promoting diversity and reducing repetition. |
 |`mistral.WithPrediction(string)`| No | No | Yes | - | Enable users to specify expected results, optimizing response times by leveraging known or predictable content. This approach is especially effective for updating text documents or code files with minimal changes, reducing latency while maintaining high-quality results. |
@@ -282,7 +282,7 @@ The options are as follows:
 |`llm.WithAttachment(io.Reader)`| Yes | Yes | Yes | - | Attach a file to a user prompt. It is the responsibility of the caller to close the reader. |
 |`antropic.WithEphemeral()`| No | Yes | No | - | Attachments should be cached server-side |
 |`antropic.WithCitations()`| No | Yes | No | - | Attachments should be used in citations |
-|`antropic.WithUser(string)`| No | Yes | No | - | Indicate the user name for the request, for debugging |
+|`antropic.WithUser(string)`| No | Yes | No | - | Indicate the user for the request, for debugging |
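The options documented in this diff (`llm.WithSeed`, `llm.WithFormat`, and so on) follow Go's functional-options pattern. A minimal, self-contained sketch of how such options typically compose — the `opts` struct, `Opt` type, and `apply` helper below are illustrative assumptions, not the library's actual internals:

```go
package main

import "fmt"

// opts mirrors a small subset of the completion options from the table above.
type opts struct {
	topK      uint64
	maxTokens uint64
	seed      uint64
	format    string
}

// Opt is a functional option: a closure that mutates opts.
type Opt func(*opts)

func WithTopK(v uint64) Opt      { return func(o *opts) { o.topK = v } }
func WithMaxTokens(v uint64) Opt { return func(o *opts) { o.maxTokens = v } }
func WithSeed(v uint64) Opt      { return func(o *opts) { o.seed = v } }
func WithFormat(v string) Opt    { return func(o *opts) { o.format = v } }

// apply folds a variadic list of options into a single opts value,
// which is how a call like Completion(ctx, prompt, WithSeed(123), ...)
// would collect its configuration.
func apply(options ...Opt) opts {
	var o opts
	for _, fn := range options {
		fn(&o)
	}
	return o
}

func main() {
	o := apply(WithTopK(50), WithSeed(123), WithFormat("json"))
	fmt.Println(o.topK, o.seed, o.format) // prints: 50 123 json
}
```

The pattern explains the table's per-provider "Yes/No" columns: each option is only a request to set a field, so a provider backend can accept, reject, or reinterpret any option it does not support.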