# Simple-MCP-Server-with-Python
The Model Context Protocol (MCP) is a standardized way to supply context to large language models (LLMs).
MCP standardizes the interface between applications and LLMs, separating the concerns of providing context, executing code, and managing user interactions. The MCP Python SDK implements the full MCP specification, letting developers expose resources, define tools, and create prompts for LLM applications. This tutorial walks through building a simple MCP server in Python that exposes data, functionality, and interaction templates to LLMs in a secure, modular fashion. The server advertises its capabilities so that clients can adapt dynamically. The tutorial covers setting up the environment, creating the server, and testing it with the MCP Inspector.
## Features
- Expose Resources: Deliver data to LLMs similar to GET endpoints.
- Define Tools: Provide functionality that performs actions or computations, similar to POST endpoints.
- Create Prompts: Offer reusable, templated interactions.
- Dynamic Capability Advertisement: Clients can adapt based on the server's advertised features.
- MCP Inspector: A web-based UI for testing and interacting with MCP servers.
## MCP Tools
- `add`: A tool that adds two numbers together.

## MCP Resources
- `greeting`: A resource that returns a personalized greeting.
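The tool and resource above map directly onto decorators in the MCP Python SDK. Below is a minimal sketch of what the corresponding `server.py` could look like; the server name `"Simple"`, the `greeting://{name}` URI template, and the example prompt are illustrative assumptions rather than details taken from this repository.

```python
# server.py -- illustrative sketch, not the repository's exact implementation
from mcp.server.fastmcp import FastMCP

# The server name "Simple" is an assumed placeholder
mcp = FastMCP("Simple")


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b


@mcp.resource("greeting://{name}")
def greeting(name: str) -> str:
    """Return a personalized greeting."""
    return f"Hello, {name}!"


# Hypothetical prompt illustrating the "Create Prompts" feature; not listed above
@mcp.prompt()
def explain_addition(expression: str) -> str:
    """Ask the LLM to explain an arithmetic expression step by step."""
    return f"Explain, step by step, how to evaluate {expression}."


if __name__ == "__main__":
    # Serve over stdio so clients such as client.py or the MCP Inspector can connect
    mcp.run()
```

`mcp.run()` defaults to the stdio transport, which is what both the example client below and the MCP Inspector expect.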
## Usage with Different Platforms
### Ubuntu 22.04
```bash
# Install Python 3.11 from the deadsnakes PPA
sudo add-apt-repository ppa:deadsnakes/ppa -y
sudo apt update
sudo apt install -y python3.11 python3.11-venv python3.11-distutils python3-apt

# Register both interpreters with update-alternatives and choose the default python3
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.10 1
sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.11 2
sudo update-alternatives --config python3

# Install pip for Python 3.11
curl -sS https://bootstrap.pypa.io/get-pip.py | sudo python3.11

# Install Node.js 18 (needed to run the MCP Inspector via npx)
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get update
sudo apt-get install -y nodejs

# Create a virtual environment and install the project dependencies
python3 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
```
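With the environment prepared (Node.js is only needed for this step), the server can be tested interactively in the MCP Inspector, the web-based UI mentioned above. The command below assumes the server's entry point is a `server.py` file in the repository root:

```bash
# Launch the MCP Inspector and connect it to the server over stdio
npx @modelcontextprotocol/inspector python server.py
```

The Inspector opens a local web page where the advertised tools and resources can be listed and invoked manually.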
## Python Client Script
```python
# client.py
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    # Launch server.py as a subprocess and communicate with it over stdio
    server_params = StdioServerParameters(
        command="python",
        args=["server.py"],
    )
    async with stdio_client(server_params) as (reader, writer):
        async with ClientSession(reader, writer) as session:
            # Perform the MCP initialization handshake
            await session.initialize()
            # Call the "add" tool exposed by the server
            result = await session.call_tool("add", arguments={"a": 3, "b": 4})
            print(f"Result of add tool: {result}")


if __name__ == "__main__":
    asyncio.run(main())
```
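To run the example end to end, execute the client from the activated virtual environment; it spawns `server.py` itself over stdio, so both files are assumed to live in the same directory:

```bash
python client.py
```

The printed value is the SDK's tool-call result object, which wraps the returned content rather than being just the bare sum.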
## Related MCP Servers
- context7 (by upstash): Context7 MCP provides up-to-date, version-specific documentation and code examples directly in your prompt, ensuring LLMs work from the latest information.
- Sequential Thinking (by modelcontextprotocol): An MCP server implementation that provides a tool for dynamic and reflective problem-solving through a structured thinking process.
- git-mcp (by idosal): GitMCP is a free, open-source, remote MCP server that transforms GitHub projects into documentation hubs, enabling AI tools to access up-to-date documentation and code.
- Everything MCP Server (by modelcontextprotocol): A comprehensive test server that demonstrates the full capabilities of MCP; it is intended for developers building MCP clients rather than for production use.
- exa-mcp-server (by exa-labs): An MCP server that allows AI assistants to use the Exa AI Search API for secure, real-time web searches.
- repomix (by yamadashy): A tool that packs your codebase into AI-friendly formats, making it easier to use with AI tools like LLMs.
- mcpdoc (by langchain-ai): The MCP LLMS-TXT Documentation Server provides a structured way to manage and retrieve LLM documentation using the Model Context Protocol.