-
Heya! It is important to note that, underlying everything, an LLM never actually calls a tool in any library or framework; it is always the developer (you or the library author) who implements the logic to do so. That said, Instructor and Atomic Agents make this fact much more explicit and take a more developer-first approach. Please have a look at some of the examples; I think this example will help you along the most: here you can see an agent generating a "tool call" schema for the search tool. Some more examples that can help you:
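To make the point above concrete, here is a minimal sketch of the pattern (this is *not* the actual Atomic Agents or Instructor API; the `SearchToolCall` schema and `run_search_tool` function are hypothetical stand-ins): the model only *produces* a structured "tool call" object, and it is developer-written code that inspects that object and actually executes the tool.

```python
from dataclasses import dataclass


@dataclass
class SearchToolCall:
    """Hypothetical schema an agent might emit for a search tool."""
    query: str


def run_search_tool(call: SearchToolCall) -> str:
    # Developer-written dispatch logic; the model never executes this itself.
    return f"results for: {call.query}"


# Pretend the model returned this structured output (in a real setup,
# a library like Instructor would parse the LLM response into the schema):
model_output = SearchToolCall(query="atomic agents tool calling")

# The actual "tool call" happens entirely in your code:
result = run_search_tool(model_output)
print(result)
```

The key design point is that the schema is just data; whether and how it gets executed is always up to the surrounding application code.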
-
It appears that Instructor is overwriting the tools array, replacing it with BaseAgentOutputSchema only. How are you passing tool functions to (in this case) the OpenAI API so that they don't get overwritten?