MCP Workbench example in tutorial goes in a loop #6534

Open
angangwa opened this issue May 14, 2025 · 1 comment

angangwa commented May 14, 2025

What happened?

Describe the bug
When running this example:

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import McpWorkbench, StdioServerParams

# Get the fetch tool from mcp-server-fetch.
fetch_mcp_server = StdioServerParams(command="uvx", args=["mcp-server-fetch"])

# Create an MCP workbench which provides a session to the mcp server.
async with McpWorkbench(fetch_mcp_server) as workbench:  # type: ignore
    # Create an agent that can use the fetch tool.
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    fetch_agent = AssistantAgent(
        name="fetcher", model_client=model_client, workbench=workbench, reflect_on_tool_use=True
    )

    # Let the agent fetch the content of a URL and summarize it.
    result = await fetch_agent.run(task="Summarize the content of https://en.wikipedia.org/wiki/Seattle")
    assert isinstance(result.messages[-1], TextMessage)
    print(result.messages[-1].content)

    # Close the connection to the model client.
    await model_client.close()

The cell runs in a never-ending loop in Jupyter Notebook. This is being tested on a Windows machine. uv is installed both globally and in the Python virtual environment. Running the uvx mcp-server-fetch command directly works; the server starts but, as I believe is expected, produces no logs.

What I've also tried

I tried replacing the command with the full path to the executable:

  • command=r"C:\Users\...\AppData\Local\Microsoft\WinGet\Packages\astral-sh.uv_Microsoft.Winget.Source_8wekyb3d8bbwe\uvx.exe"
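i.e. something along these lines (a sketch only; the path is elided as above):

fetch_mcp_server = StdioServerParams(
    command=r"C:\Users\...\AppData\Local\Microsoft\WinGet\Packages\astral-sh.uv_Microsoft.Winget.Source_8wekyb3d8bbwe\uvx.exe",
    args=["mcp-server-fetch"],
)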

Which packages was the bug in?

Python AgentChat (autogen-agentchat>=0.4.0), Python Extensions (autogen-ext)

AutoGen library version.

Python 0.5.7

Other library version.

No response

Model used

gpt-4.1

Model provider

Azure OpenAI

Other model provider

No response

Python version

3.13

.NET version

None

Operating system

Windows

@SongChiYoung
Contributor

I think it's the same as issue #5069.
Yes, it happens because asyncio does not implement _make_subprocess_transport on Windows.

If you run it outside of a Jupyter notebook, you can see that it works:

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import McpWorkbench, StdioServerParams
from autogen_agentchat.ui import Console
import asyncio
async def test():
    # Get the fetch tool from mcp-server-fetch.
    fetch_mcp_server = StdioServerParams(command="uvx", args=["mcp-server-fetch"])

    # Create an MCP workbench which provides a session to the mcp server.
    # async with McpWorkbench(fetch_mcp_server) as workbench:  # type: ignore
    workbench = McpWorkbench(fetch_mcp_server)
    print("hello1")
    await workbench.start()
    print("hello2")

    # Create an agent that can use the fetch tool.
    print(await workbench.list_tools())
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    fetch_agent = AssistantAgent(
        name="fetcher", model_client=model_client, workbench=workbench, reflect_on_tool_use=True
    )

    # Let the agent fetch the content of a URL and summarize it.
    # result = await fetch_agent.run(task="Summarize the content of https://en.wikipedia.org/wiki/Seattle")
    # assert isinstance(result.messages[-1], TextMessage)
    # print(result.messages[-1].content)
    await Console(fetch_agent.run_stream(task="Summarize the content of https://en.wikipedia.org/wiki/Seattle"))
    # Close the connection to the model client.
    await model_client.close()
asyncio.run(test())
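
A related sketch, in case it helps: explicitly forcing the proactor event loop policy before asyncio.run. The proactor loop is what provides subprocess transports on Windows; it is already the default for plain scripts on Python 3.8+, and it may not help inside an already-running Jupyter kernel loop.

import asyncio
import sys

# The Windows ProactorEventLoop implements subprocess transports;
# the SelectorEventLoop does not.
if sys.platform == "win32":
    asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy())

asyncio.run(test())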
