
Commit 54952ca

Merge pull request #64 from modelcontextprotocol/davidsp/readme

2 parents 42a5f6f + 9bdb8a2

File tree

2 files changed: +269 −22 lines

README.md

Lines changed: 266 additions & 22 deletions
@@ -1,4 +1,23 @@
# MCP Python SDK

[![PyPI][pypi-badge]][pypi-url]
[![MIT licensed][mit-badge]][mit-url]
[![Python Version][python-badge]][python-url]
[![Documentation][docs-badge]][docs-url]
[![Specification][spec-badge]][spec-url]
[![GitHub Discussions][discussions-badge]][discussions-url]

[pypi-badge]: https://img.shields.io/pypi/v/mcp.svg
[pypi-url]: https://pypi.org/project/mcp/
[mit-badge]: https://img.shields.io/pypi/l/mcp.svg
[mit-url]: https://github.com/modelcontextprotocol/python-sdk/blob/main/LICENSE
[python-badge]: https://img.shields.io/pypi/pyversions/mcp.svg
[python-url]: https://www.python.org/downloads/
[docs-badge]: https://img.shields.io/badge/docs-modelcontextprotocol.io-blue.svg
[docs-url]: https://modelcontextprotocol.io
[spec-badge]: https://img.shields.io/badge/spec-spec.modelcontextprotocol.io-blue.svg
[spec-url]: https://spec.modelcontextprotocol.io
[discussions-badge]: https://img.shields.io/github/discussions/modelcontextprotocol/python-sdk
[discussions-url]: https://github.com/modelcontextprotocol/python-sdk/discussions

Python implementation of the [Model Context Protocol](https://modelcontextprotocol.io) (MCP), providing both client and server capabilities for integrating with LLM surfaces.

@@ -13,60 +32,285 @@ The Model Context Protocol allows applications to provide context for LLMs in a
## Installation

We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects:

```bash
uv add mcp
```

Alternatively, install `mcp` with pip, or add it to your `requirements.txt`:

```bash
pip install mcp
# or, after adding mcp to requirements.txt:
pip install -r requirements.txt
```

## Overview

MCP servers provide focused functionality like resources, tools, prompts, and other capabilities that can be reused across many client applications. These servers are designed to be easy to build, highly composable, and modular.

### Key design principles

- Servers are extremely easy to build with clear, simple interfaces
- Multiple servers can be composed seamlessly through a shared protocol
- Each server operates in isolation and cannot access conversation context
- Features can be added progressively through capability negotiation

### Server-provided primitives

- [Prompts](https://modelcontextprotocol.io/docs/concepts/prompts): Templatable text
- [Resources](https://modelcontextprotocol.io/docs/concepts/resources): File-like attachments
- [Tools](https://modelcontextprotocol.io/docs/concepts/tools): Functions that models can call
- Utilities:
  - Completion: Auto-completion provider for prompt arguments or resource URI templates
  - Logging: Logging to the client
  - Pagination*: Pagination for long results
### Client-provided primitives

- [Sampling](https://modelcontextprotocol.io/docs/concepts/sampling): Allow servers to sample using client models
- Roots: Information about locations to operate on (e.g., directories)

Connections between clients and servers are established through transports such as **stdio** or **SSE** (most clients currently support stdio, but not SSE). The transport layer handles message framing, delivery, and error handling.
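Under the hood, a transport carries JSON-RPC 2.0 messages. A minimal sketch of the framing idea (newline-delimited JSON is a simplification here, not the SDK's exact wire handling):

```python
import json

def encode_request(msg_id: int, method: str, params: dict) -> str:
    """Serialize a JSON-RPC 2.0 request as a single line of JSON."""
    return json.dumps(
        {"jsonrpc": "2.0", "id": msg_id, "method": method, "params": params}
    ) + "\n"

def decode_message(line: str) -> dict:
    """Parse one framed message, rejecting non-JSON-RPC payloads."""
    msg = json.loads(line)
    if msg.get("jsonrpc") != "2.0":
        raise ValueError("not a JSON-RPC 2.0 message")
    return msg

wire = encode_request(1, "resources/list", {})
```

The real transport layer additionally handles delivery and error reporting, as noted above.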
## Quick Start

### Creating a Server

MCP servers follow a decorator approach to register handlers for MCP primitives like resources, prompts, and tools. The goal is to provide a simple interface for exposing capabilities to LLM clients.

```python
import asyncio

from mcp.server import Server, NotificationOptions
from mcp.server.models import InitializationOptions
import mcp.server.stdio
import mcp.types as types

# Create a server instance
server = Server("example-server")

# Add prompt capabilities
@server.list_prompts()
async def handle_list_prompts() -> list[types.Prompt]:
    return [
        types.Prompt(
            name="example-prompt",
            description="An example prompt template",
            arguments=[
                types.PromptArgument(
                    name="arg1",
                    description="Example argument",
                    required=True
                )
            ]
        )
    ]

@server.get_prompt()
async def handle_get_prompt(
    name: str,
    arguments: dict[str, str] | None
) -> types.GetPromptResult:
    if name != "example-prompt":
        raise ValueError(f"Unknown prompt: {name}")

    return types.GetPromptResult(
        description="Example prompt",
        messages=[
            types.PromptMessage(
                role="user",
                content=types.TextContent(
                    type="text",
                    text="Example prompt text"
                )
            )
        ]
    )

# Run the server over stdio
async def main():
    async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
        await server.run(
            read_stream,
            write_stream,
            InitializationOptions(
                server_name="example",
                server_version="0.1.0",
                capabilities=server.get_capabilities(
                    notification_options=NotificationOptions(),
                    experimental_capabilities={},
                )
            )
        )

if __name__ == "__main__":
    asyncio.run(main())
```

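The decorator style above works by registering handler functions against protocol method names, which the server dispatches to when requests arrive. A toy re-implementation of that pattern (the class and internals are illustrative, not the SDK's actual code):

```python
class ToyServer:
    """Minimal stand-in showing how decorators can register request handlers."""

    def __init__(self, name: str):
        self.name = name
        self.handlers: dict[str, object] = {}

    def list_prompts(self):
        def register(fn):
            # Map the protocol method name to the decorated handler
            self.handlers["prompts/list"] = fn
            return fn
        return register

    def get_prompt(self):
        def register(fn):
            self.handlers["prompts/get"] = fn
            return fn
        return register

srv = ToyServer("example")

@srv.list_prompts()
async def handle_list_prompts():
    return ["example-prompt"]
```

Calling `@srv.list_prompts()` returns a registering closure, so the decorated function is stored for dispatch while remaining callable under its own name.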
### Creating a Client

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Create server parameters for stdio connection
server_params = StdioServerParameters(
    command="path/to/server",
    args=[],  # Optional command-line arguments
    env=None  # Optional environment variables
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            await session.initialize()

            # List available resources
            resources = await session.list_resources()

            # List available prompts
            prompts = await session.list_prompts()

            # List available tools
            tools = await session.list_tools()

            # Read a resource
            resource = await session.read_resource("file://some/path")

            # Call a tool
            result = await session.call_tool("tool-name", arguments={"arg1": "value"})

            # Get a prompt
            prompt = await session.get_prompt("prompt-name", arguments={"arg1": "value"})

if __name__ == "__main__":
    asyncio.run(main())
```
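The `session.initialize()` call kicks off capability negotiation. The first request on the wire looks roughly like this (the field names follow the MCP initialize shape; the version string and values shown are illustrative):

```python
import json

# Hypothetical first message of the handshake; not produced by calling the SDK here
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# One JSON object per message on the transport
wire = json.dumps(initialize_request)
```

The server replies with its own capabilities, and both sides then restrict themselves to the negotiated feature set.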

## Primitives

The MCP Python SDK provides decorators that map to the core protocol primitives. Each primitive follows a different interaction pattern based on how it is controlled and used:

| Primitive | Control                | Description                                       | Example Use                  |
|-----------|------------------------|---------------------------------------------------|------------------------------|
| Prompts   | User-controlled        | Interactive templates invoked by user choice      | Slash commands, menu options |
| Resources | Application-controlled | Contextual data managed by the client application | File contents, API responses |
| Tools     | Model-controlled       | Functions exposed to the LLM to take actions      | API calls, data updates      |

### User-Controlled Primitives

**Prompts** are designed to be explicitly selected by users for their interactions with LLMs.

| Decorator                | Description                          |
|--------------------------|--------------------------------------|
| `@server.list_prompts()` | List available prompt templates      |
| `@server.get_prompt()`   | Get a specific prompt with arguments |

### Application-Controlled Primitives

**Resources** are controlled by the client application, which decides how and when they should be used based on its own logic.

| Decorator                      | Description                        |
|--------------------------------|------------------------------------|
| `@server.list_resources()`     | List available resources           |
| `@server.read_resource()`      | Read a specific resource's content |
| `@server.subscribe_resource()` | Subscribe to resource updates      |

### Model-Controlled Primitives

**Tools** are exposed to LLMs to enable automated actions, with user approval.

| Decorator              | Description                   |
|------------------------|-------------------------------|
| `@server.list_tools()` | List available tools          |
| `@server.call_tool()`  | Execute a tool with arguments |

### Server Management

Additional decorators for server functionality:

| Decorator                     | Description                 |
|-------------------------------|-----------------------------|
| `@server.set_logging_level()` | Update server logging level |

### Capabilities

MCP servers declare capabilities during initialization. These map to specific decorators:

| Capability   | Feature Flag                  | Decorators                                                       | Description                     |
|--------------|-------------------------------|------------------------------------------------------------------|---------------------------------|
| `prompts`    | `listChanged`                 | `@list_prompts`<br/>`@get_prompt`                                | Prompt template management      |
| `resources`  | `subscribe`<br/>`listChanged` | `@list_resources`<br/>`@read_resource`<br/>`@subscribe_resource` | Resource exposure and updates   |
| `tools`      | `listChanged`                 | `@list_tools`<br/>`@call_tool`                                   | Tool discovery and execution    |
| `logging`    | -                             | `@set_logging_level`                                             | Server logging configuration    |
| `completion` | -                             | `@complete_argument`                                             | Argument completion suggestions |

Capabilities are negotiated during connection initialization. Servers only need to implement the decorators for the capabilities they support.
## Client Interaction

The MCP Python SDK enables servers to interact with clients through request context and session management. This allows servers to perform operations like LLM sampling and progress tracking.

### Request Context

The Request Context provides access to the current request and client session. It can be accessed through `server.request_context` and enables:

- Sampling from the client's LLM
- Sending progress updates
- Logging messages
- Accessing request metadata

Example using the request context for LLM sampling:

```python
import json

# (assumes the `server` instance and `types` import from the server example above)
@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    # Access the current request context
    context = server.request_context

    # Use the session to sample from the client's LLM
    result = await context.session.create_message(
        messages=[
            types.SamplingMessage(
                role="user",
                content=types.TextContent(
                    type="text",
                    text="Analyze this data: " + json.dumps(arguments)
                )
            )
        ],
        max_tokens=100
    )

    return [types.TextContent(type="text", text=result.content.text)]
```

Using the request context for progress updates:

```python
@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    context = server.request_context
    # Request metadata may be absent; only report progress when the client sent a token
    progress_token = context.meta.progressToken if context.meta else None

    if progress_token:
        # Report that the operation is halfway done
        await context.session.send_progress_notification(
            progress_token=progress_token,
            progress=0.5,
            total=1.0
        )

    # Perform operation...

    if progress_token:
        # Report completion
        await context.session.send_progress_notification(
            progress_token=progress_token,
            progress=1.0,
            total=1.0
        )

    return [types.TextContent(type="text", text="Operation complete")]
```
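For a longer multi-step operation, the `progress`/`total` pairs reported after each step can be generated mechanically. A trivial sketch (the helper is illustrative, not an SDK API):

```python
def progress_schedule(n_steps: int, total: float = 1.0) -> list[tuple[float, float]]:
    """Progress values to report after each of n_steps equal-sized steps."""
    return [(round(total * (i + 1) / n_steps, 4), total) for i in range(n_steps)]

print(progress_schedule(4))
# → [(0.25, 1.0), (0.5, 1.0), (0.75, 1.0), (1.0, 1.0)]
```

Each tuple would feed one `send_progress_notification` call as the operation advances.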

The request context is automatically set for each request and provides a safe way to access the current client session and request metadata.

## Documentation

- [Model Context Protocol documentation](https://modelcontextprotocol.io)
- [Model Context Protocol specification](https://spec.modelcontextprotocol.io)
- [Officially supported servers](https://github.com/modelcontextprotocol/servers)

## Contributing

We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the [contributing guide](CONTRIBUTING.md) to get started.

## License

pyproject.toml

Lines changed: 3 additions & 0 deletions

@@ -21,6 +21,9 @@ classifiers = [
     "License :: OSI Approved :: MIT License",
     "Programming Language :: Python :: 3",
     "Programming Language :: Python :: 3.10",
+    "Programming Language :: Python :: 3.11",
+    "Programming Language :: Python :: 3.12",
+    "Programming Language :: Python :: 3.13",
 ]
 dependencies = [
     "anyio>=4.6",
