
[bug] SmolagentsInstrumentor does not create model spans when using CodeAgent #1636


Open
igor-gorohovsky opened this issue on May 15, 2025 · 1 comment
Labels: bug, instrumentation: smolagents, triage

Comments


igor-gorohovsky commented May 15, 2025

Describe the bug
opentelemetry-instrumentation-smolagents instruments the model layer by monkey-patching Model.__call__.
The CodeAgent (and any logic that reuses its _step_stream / step implementation) never calls __call__; it uses model.generate(...) or model.generate_stream(...) directly.
Because of this, no model spans are produced when a CodeAgent is executed – only the agent-level and tool-level spans appear.
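
A minimal, standalone illustration of the mechanism (toy classes plus wrapt, not the actual smolagents or instrumentor code) shows why a wrapper on __call__ records nothing when generate() is called directly:

# Toy example only: it demonstrates why patching __call__ cannot observe
# direct generate() calls; all names here are illustrative, not library code.
from wrapt import wrap_function_wrapper


class Model:
    def generate(self, prompt: str) -> str:
        return f"completion for {prompt!r}"

    def __call__(self, prompt: str) -> str:
        # __call__ simply delegates to generate
        return self.generate(prompt)


def span_wrapper(wrapped, instance, args, kwargs):
    print("span opened")  # stand-in for starting an OpenTelemetry span
    try:
        return wrapped(*args, **kwargs)
    finally:
        print("span closed")


# Patch only __call__, mirroring what the instrumentor currently does.
wrap_function_wrapper(__name__, "Model.__call__", span_wrapper)

model = Model()
model("hi")            # goes through the wrapped __call__ -> a "span" is recorded
model.generate("hi")   # bypasses __call__ entirely        -> nothing is recorded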

To Reproduce

  1. Create a Python script:
from smolagents import CodeAgent, tool, OpenAIServerModel

model = OpenAIServerModel(model_id="gpt-3.5-turbo")

@tool
def get_tasks() -> dict:
    """Returns list of tasks"""
    return {"title": "some work"}
agent = CodeAgent(tools=[get_tasks], model=model)

agent.run("What tasks do I have?")
  2. Run the script with the OpenTelemetry auto-instrumentation CLI (a programmatic alternative is sketched below):
opentelemetry-instrument python3 script.py
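
The same behavior can be reproduced without the CLI by instrumenting programmatically. The sketch below uses the standard OTel SDK console exporter; the import path for SmolagentsInstrumentor is an assumption (OpenInference-style distribution) and may need adjusting for the package version listed above:

# Sketch of a programmatic reproduction; the SmolagentsInstrumentor import
# path is an assumption and may differ depending on the distribution used.
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from openinference.instrumentation.smolagents import SmolagentsInstrumentor

tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

# Apply the instrumentation, then run the script above: agent-level and
# tool-level spans are printed, but no model (LLM) spans appear.
SmolagentsInstrumentor().instrument(tracer_provider=tracer_provider)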

Expected behavior
Every call to the underlying LLM (generate / generate_stream) should be wrapped in a span so that prompt, completion, latency, token counts, etc. are recorded.

Versions / Environment

  • smolagents: 1.15.0
  • opentelemetry-instrumentation-smolagents: 0.1.11
  • opentelemetry-sdk/api: 1.33.0
  • Python 3.11.12
  • OS: Ubuntu 24.04

Additional context
The instrumentor only wraps:

wrap_function_wrapper("smolagents", name=model_subclass.__name__ + ".__call__", ...)

but CodeAgent._step_stream uses:

chat_message = self.model.generate(...)
# or
output_stream = self.model.generate_stream(...)

Possible fix

  • Wrap Model.generate and Model.generate_stream instead of Model.__call__
    (__call__ simply delegates to generate). A rough sketch of the change is shown below.
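
For illustration, a standalone sketch of the proposed wrap target; tracing_wrapper is only a placeholder for the instrumentor's real span-creating wrapper:

# Sketch of the proposed change (not the instrumentor's actual code): patch
# generate on every exported Model subclass instead of __call__.
from wrapt import wrap_function_wrapper
import smolagents
from smolagents import models

def tracing_wrapper(wrapped, instance, args, kwargs):
    # Placeholder for the instrumentor's real span-creating wrapper.
    return wrapped(*args, **kwargs)

for model_subclass in [
    attr for attr in vars(smolagents).values()
    if isinstance(attr, type) and issubclass(attr, models.Model)
]:
    # generate_stream would be wrapped the same way where a subclass defines it.
    wrap_function_wrapper(
        module="smolagents",
        name=model_subclass.__name__ + ".generate",
        wrapper=tracing_wrapper,
    )
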
igor-gorohovsky added the bug and triage labels on May 15, 2025

dosubot bot commented May 15, 2025

Hello @igor-gorohovsky! I'm Dosu, a bot here to help you with bugs, answer questions, and assist you in becoming a contributor. I'm ready to help while you wait for a human maintainer. Let's dive into your issue!

To address the issue where SmolagentsInstrumentor does not create model spans for CodeAgent, you can modify the instrumentor to wrap the generate and generate_stream methods instead of __call__. Here's how you can implement the fix:

  1. Update the _instrument method in the SmolagentsInstrumentor to wrap generate and generate_stream for each model subclass:
def _instrument(self, **kwargs: Any) -> None:
    import smolagents
    from smolagents import CodeAgent, MultiStepAgent, Tool, ToolCallingAgent, models

    if not (tracer_provider := kwargs.get("tracer_provider")):
        tracer_provider = trace_api.get_tracer_provider()
    if not (config := kwargs.get("config")):
        config = TraceConfig()
    else:
        assert isinstance(config, TraceConfig)
    self._tracer = OITracer(
        trace_api.get_tracer(__name__, __version__, tracer_provider),
        config=config,
    )

    run_wrapper = _RunWrapper(tracer=self._tracer)
    self._original_run_method = getattr(MultiStepAgent, "run", None)
    wrap_function_wrapper(
        module="smolagents",
        name="MultiStepAgent.run",
        wrapper=run_wrapper,
    )

    self._original_step_methods: Optional[dict[type, Optional[Callable[..., Any]]]] = {}
    step_wrapper = _StepWrapper(tracer=self._tracer)
    for step_cls in [CodeAgent, ToolCallingAgent]:
        self._original_step_methods[step_cls] = getattr(step_cls, "step", None)
        wrap_function_wrapper(
            module="smolagents",
            name=f"{step_cls.__name__}.step",
            wrapper=step_wrapper,
        )

    self._original_model_call_methods: Optional[dict[type, dict[str, Optional[Callable[..., Any]]]]] = {}

    exported_model_subclasses = [
        attr
        for _, attr in vars(smolagents).items()
        if isinstance(attr, type) and issubclass(attr, models.Model)
    ]
    for model_subclass in exported_model_subclasses:
        model_subclass_wrapper = _ModelWrapper(tracer=self._tracer)
        self._original_model_call_methods[model_subclass] = {
            "generate": getattr(model_subclass, "generate", None),
            "generate_stream": getattr(model_subclass, "generate_stream", None)
        }
        wrap_function_wrapper(
            module="smolagents",
            name=model_subclass.__name__ + ".generate",
            wrapper=model_subclass_wrapper,
        )
        # Not every Model subclass necessarily defines generate_stream,
        # so only wrap it when the method is present on the class.
        if getattr(model_subclass, "generate_stream", None) is not None:
            wrap_function_wrapper(
                module="smolagents",
                name=model_subclass.__name__ + ".generate_stream",
                wrapper=model_subclass_wrapper,
            )

    tool_call_wrapper = _ToolCallWrapper(tracer=self._tracer)
    self._original_tool_call_method = getattr(Tool, "__call__", None)
    wrap_function_wrapper(
        module="smolagents",
        name="Tool.__call__",
        wrapper=tool_call_wrapper,
    )

This modification ensures that the generate and generate_stream methods are wrapped, allowing spans to be created for these methods when a CodeAgent is executed [1].
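
For completeness, a matching _uninstrument could restore the saved methods. This is only a sketch that assumes the originals dict built above; the package's actual restore logic may differ:

def _uninstrument(self, **kwargs: Any) -> None:
    # Sketch: undo the wrapping done in _instrument by restoring the saved
    # generate / generate_stream functions on each model subclass.
    if self._original_model_call_methods is not None:
        for model_subclass, original_methods in self._original_model_call_methods.items():
            for method_name, original_method in original_methods.items():
                if original_method is not None:
                    setattr(model_subclass, method_name, original_method)
        self._original_model_call_methods = None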

To continue talking to Dosu, mention @dosu.
