
Commit d91528f

feat: Add output schema generation for tools and update documentation
1 parent 6353dd1 commit d91528f

File tree: 5 files changed, +279 −10 lines changed

README.md

Lines changed: 70 additions & 5 deletions
@@ -73,7 +73,7 @@ The Model Context Protocol allows applications to provide context for LLMs in a
 
 ### Adding MCP to your python project
 
-We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects. 
+We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects.
 
 If you haven't created a uv-managed project yet, create one:
 
@@ -89,6 +89,7 @@ If you haven't created a uv-managed project yet, create one:
 ```
 
 Alternatively, for projects using pip for dependencies:
+
 ```bash
 pip install "mcp[cli]"
 ```
@@ -128,11 +129,13 @@ def get_greeting(name: str) -> str:
 ```
 
 You can install this server in [Claude Desktop](https://claude.ai/download) and interact with it right away by running:
+
 ```bash
 mcp install server.py
 ```
 
 Alternatively, you can test it with the MCP Inspector:
+
 ```bash
 mcp dev server.py
 ```
@@ -245,6 +248,67 @@ async def fetch_weather(city: str) -> str:
         return response.text
 ```
 
+#### Output Schemas
+
+Tools automatically generate JSON Schema definitions for their return types, helping LLMs understand the structure of the data they'll receive:
+
+```python
+from pydantic import BaseModel
+
+# Tools with primitive return types
+@mcp.tool()
+def get_temperature(city: str) -> float:
+    """Get the current temperature for a city"""
+    # In a real implementation, this would fetch actual weather data
+    return 72.5
+
+# Tools with dictionary return types
+@mcp.tool()
+def get_user(user_id: int) -> dict:
+    """Get user information by ID"""
+    return {"id": user_id, "name": "John Doe", "email": "john@example.com"}
+
+# Using Pydantic models for structured output
+class WeatherData(BaseModel):
+    temperature: float
+    humidity: float
+    conditions: str
+
+@mcp.tool()
+def get_weather_data(city: str) -> WeatherData:
+    """Get structured weather data for a city"""
+    # In a real implementation, this would fetch actual weather data
+    return WeatherData(
+        temperature=72.5,
+        humidity=65.0,
+        conditions="Partly cloudy"
+    )
+
+# Complex nested models
+class Location(BaseModel):
+    city: str
+    country: str
+    coordinates: tuple[float, float]
+
+class WeatherForecast(BaseModel):
+    current: WeatherData
+    location: Location
+    forecast: list[WeatherData]
+
+@mcp.tool()
+def get_weather_forecast(city: str) -> WeatherForecast:
+    """Get detailed weather forecast for a city"""
+    # In a real implementation, this would fetch actual forecast data
+    return WeatherForecast(
+        current=WeatherData(temperature=72.5, humidity=65.0, conditions="Partly cloudy"),
+        location=Location(city=city, country="USA", coordinates=(37.7749, -122.4194)),
+        forecast=[
+            WeatherData(temperature=75.0, humidity=62.0, conditions="Sunny"),
+            WeatherData(temperature=68.0, humidity=80.0, conditions="Rainy")
+        ]
+    )
+```
+
 ### Prompts
 
 Prompts are reusable templates that help LLMs interact with your server effectively:
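For illustration, the JSON Schema that pydantic v2 generates for the `WeatherData` model in the added section would look roughly like this (a sketch; field order and `title` entries can vary by pydantic version):

```json
{
  "title": "WeatherData",
  "type": "object",
  "properties": {
    "temperature": {"title": "Temperature", "type": "number"},
    "humidity": {"title": "Humidity", "type": "number"},
    "conditions": {"title": "Conditions", "type": "string"}
  },
  "required": ["temperature", "humidity", "conditions"]
}
```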
@@ -381,6 +445,7 @@ if __name__ == "__main__":
 ```
 
 Run it with:
+
 ```bash
 python server.py
 # or
@@ -458,18 +523,17 @@ app.mount("/math", math.mcp.streamable_http_app())
 ```
 
 For low level server with Streamable HTTP implementations, see:
+
 - Stateful server: [`examples/servers/simple-streamablehttp/`](examples/servers/simple-streamablehttp/)
 - Stateless server: [`examples/servers/simple-streamablehttp-stateless/`](examples/servers/simple-streamablehttp-stateless/)
 
-
-
 The streamable HTTP transport supports:
+
 - Stateful and stateless operation modes
 - Resumability with event stores
-- JSON or SSE response formats 
+- JSON or SSE response formats
 - Better scalability for multi-node deployments
 
-
 ### Mounting to an Existing ASGI Server
 
 > **Note**: SSE transport is being superseded by [Streamable HTTP transport](https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#streamable-http).
@@ -637,6 +701,7 @@ async def query_db(name: str, arguments: dict) -> list:
 ```
 
 The lifespan API provides:
+
 - A way to initialize resources when the server starts and clean them up when it stops
 - Access to initialized resources through the request context in handlers
 - Type-safe context passing between lifespan and request handlers

src/mcp/server/fastmcp/server.py

Lines changed: 1 addition & 0 deletions
@@ -253,6 +253,7 @@ async def list_tools(self) -> list[MCPTool]:
                 name=info.name,
                 description=info.description,
                 inputSchema=info.parameters,
+                outputSchema=info.output_schema,
                 annotations=info.annotations,
             )
             for info in tools

src/mcp/server/fastmcp/tools/base.py

Lines changed: 37 additions & 3 deletions
@@ -23,6 +23,9 @@ class Tool(BaseModel):
     name: str = Field(description="Name of the tool")
     description: str = Field(description="Description of what the tool does")
     parameters: dict[str, Any] = Field(description="JSON schema for tool parameters")
+    output_schema: dict[str, Any] | None = Field(
+        None, description="JSON schema for tool output format"
+    )
     fn_metadata: FuncMetadata = Field(
         description="Metadata about the function including a pydantic model for tool"
         " arguments"
@@ -45,6 +48,8 @@ def from_function(
         annotations: ToolAnnotations | None = None,
     ) -> Tool:
         """Create a Tool from a function."""
+        from pydantic import TypeAdapter
+
         from mcp.server.fastmcp.server import Context
 
         func_name = name or fn.__name__
@@ -70,6 +75,32 @@ def from_function(
         )
         parameters = func_arg_metadata.arg_model.model_json_schema()
 
+        # Generate output schema from return type annotation if possible
+        output_schema = None
+        sig = inspect.signature(fn)
+        if sig.return_annotation != inspect.Signature.empty:
+            try:
+                # Handle common return types that don't need schema
+                if sig.return_annotation is str:
+                    output_schema = {"type": "string"}
+                elif sig.return_annotation is int:
+                    output_schema = {"type": "integer"}
+                elif sig.return_annotation is float:
+                    output_schema = {"type": "number"}
+                elif sig.return_annotation is bool:
+                    output_schema = {"type": "boolean"}
+                elif sig.return_annotation is dict:
+                    output_schema = {"type": "object"}
+                elif sig.return_annotation is list:
+                    output_schema = {"type": "array"}
+                else:
+                    # Try to generate schema using TypeAdapter
+                    return_type_adapter = TypeAdapter(sig.return_annotation)
+                    output_schema = return_type_adapter.json_schema()
+            except Exception:
+                # If we can't generate a schema, we'll leave it as None
+                pass
+
         return cls(
             fn=fn,
             name=func_name,
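The primitive-type branch of the hunk above can be sketched in isolation. This standalone snippet (hypothetical `output_schema_for` helper, stdlib only, without the pydantic `TypeAdapter` fallback the real code uses) mirrors the mapping the commit introduces:

```python
import inspect

# Same primitive mapping as the hunk above (hypothetical standalone helper;
# the real code falls back to pydantic's TypeAdapter for other types).
_PRIMITIVE_SCHEMAS = {
    str: {"type": "string"},
    int: {"type": "integer"},
    float: {"type": "number"},
    bool: {"type": "boolean"},
    dict: {"type": "object"},
    list: {"type": "array"},
}


def output_schema_for(fn):
    """Map a function's return annotation to a JSON Schema fragment, or None."""
    ret = inspect.signature(fn).return_annotation
    if ret is inspect.Signature.empty:
        return None  # unannotated return: no output schema
    return _PRIMITIVE_SCHEMAS.get(ret)  # None for unmapped types


def get_temperature(city: str) -> float:
    return 72.5


print(output_schema_for(get_temperature))  # {'type': 'number'}
```

Note the blanket `except Exception` in the real hunk: schema generation is best-effort, and tools whose return types cannot be expressed as JSON Schema simply ship without `outputSchema` rather than failing registration.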
@@ -79,6 +110,7 @@ def from_function(
             is_async=is_async,
             context_kwarg=context_kwarg,
             annotations=annotations,
+            output_schema=output_schema,
         )
 
     async def run(
@@ -92,9 +124,11 @@ async def run(
                 self.fn,
                 self.is_async,
                 arguments,
-                {self.context_kwarg: context}
-                if self.context_kwarg is not None
-                else None,
+                (
+                    {self.context_kwarg: context}
+                    if self.context_kwarg is not None
+                    else None
+                ),
             )
         except Exception as e:
             raise ToolError(f"Error executing tool {self.name}: {e}") from e

src/mcp/types.py

Lines changed: 4 additions & 2 deletions
@@ -338,7 +338,7 @@ class ProgressNotificationParams(NotificationParams):
     """
     total: float | None = None
     """
-    Message related to progress. This should provide relevant human readable 
+    Message related to progress. This should provide relevant human readable
     progress information.
     """
    message: str | None = None
@@ -741,7 +741,7 @@ class ToolAnnotations(BaseModel):
 
     idempotentHint: bool | None = None
     """
-    If true, calling the tool repeatedly with the same arguments 
+    If true, calling the tool repeatedly with the same arguments
     will have no additional effect on the its environment.
     (This property is meaningful only when `readOnlyHint == false`)
     Default: false
@@ -767,6 +767,8 @@ class Tool(BaseModel):
     """A human-readable description of the tool."""
     inputSchema: dict[str, Any]
     """A JSON Schema object defining the expected parameters for the tool."""
+    outputSchema: dict[str, Any] | None = None
+    """A JSON Schema object defining the expected output format of the tool."""
     annotations: ToolAnnotations | None = None
     """Optional additional tool information."""
     model_config = ConfigDict(extra="allow")
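With the `outputSchema` field added to the `Tool` model, a `tools/list` entry can carry both schemas. A sketch of such an entry as a plain dict (hypothetical tool name and schemas, matching the shape of the updated model):

```python
# Hypothetical tools/list entry; outputSchema is optional and may be omitted.
tool_entry = {
    "name": "get_temperature",
    "description": "Get the current temperature for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
    "outputSchema": {"type": "number"},  # derived from the `-> float` annotation
}

print(tool_entry["outputSchema"])  # {'type': 'number'}
```

Because `model_config = ConfigDict(extra="allow")` was already set, clients built against older SDK versions can still round-trip entries that include the new field.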
