Python FastAPI to get input via userproxy #6177
- Have you tried a termination condition?
- Like @ekzhu mentioned in the post above, there are several ways to integrate a human in the loop.

Feedback during run: This seems to be what you are trying to accomplish, using a user proxy. Can you confirm that what you are trying to do is similar to what is done in AGS for user input? ags_update.mp4
This is occurring because your user_proxy is using the default input_func, which reads from the console. For your FastAPI app, you need to provide a custom input function to the user_proxy; that function is then connected to your socket, and the UI can use it for user input. You can look at the implementation that AGS uses to accomplish this. It is a bit complex, but let us know if you have any issues.
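A minimal sketch of that bridge pattern, using an asyncio queue between the WebSocket handler and the input function (the class and method names here are illustrative, not part of any library; the `get_input` signature follows the async `input_func` shape that AgentChat's `UserProxyAgent` accepts):

```python
import asyncio
from typing import Optional


class WebSocketInputBridge:
    """Bridges a UI connection to the team's input requests.

    Hypothetical sketch: the real app would call feed() from its
    WebSocket receive handler, and pass get_input as the input_func
    of the UserProxyAgent.
    """

    def __init__(self) -> None:
        self._queue: asyncio.Queue[str] = asyncio.Queue()

    async def feed(self, message: str) -> None:
        # Called by the WebSocket handler when the client sends a reply.
        await self._queue.put(message)

    async def get_input(
        self, prompt: str, cancellation_token: Optional[object] = None
    ) -> str:
        # Suspends the team run until the UI replies. In the FastAPI app
        # you would first forward `prompt` to the client over the socket,
        # then wait for the answer to arrive via feed().
        return await self._queue.get()
```

Wiring it up would then look roughly like `UserProxyAgent("user", input_func=bridge.get_input)`; the AGS source shows the full production version of this pattern.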
Feedback to the next run: Here, the team runs until termination, the application or user provides feedback, and the team runs again with the feedback. Your app logic will need to terminate, save state, get user feedback, and then load state and run again.
- What happened?
Describe the bug
I am writing an API using FastAPI that initializes a SelectorGroupChat of assistant and user proxy agents.
Through streaming, I am able to stream the conversation. However, during the user proxy's turn, when it expects the user's response, it reads input from the Python console where the app is running. The run effectively blocks, and to resume it I need to provide input to the user proxy agent from the console. As soon as I do that, the team chat resumes and the whole interaction finishes streaming. There is no direct way to carry the user proxy input over the same stream.
Expected behavior
There should be a way to provide the user input through the streaming API itself.
Which package was the bug in?
Python AgentChat (autogen-agentchat>=0.4.0)
AutoGen library version.
Python dev (main branch)
Other library version.
No response
Model used
No response
Model provider
None
Other model provider
No response
Python version
3.11
.NET version
None
Operating system
Windows