# mcp-server

MCP server for experimenting with LLM tools.

This project was created to understand MCP servers, the protocol, and their usage within LLM environments. It features tool management through environment variables and provides a variety of local and web-based tools. A sample chat application is included for practical demonstration. It is not intended for reuse!
## Dependencies

- Install `uv`
- Run:

```shell
uv sync
```
## Unit tests

```shell
uv run pytest
```
## Launch the server

### Locally/stdio

```shell
uv run mcp dev server.py
```

Example output:

```shell
(.venv) ➜ mcp-server git:(main) ✗ uv run mcp dev server.py
Starting MCP inspector...
Proxy server listening on port 3000
🔍 MCP Inspector is up and running at http://localhost:5173 🚀
```
### As a network server

This allows connection to the MCP server from a remote client:

```shell
uv run fastmcp run server.py --transport sse
```

The server will be available at http://localhost:8000.
## Tool Management via Environment Variables

You can control which tools are enabled by setting specific environment variables before running `server.py`. This works consistently across all launch methods, including wrappers such as `mcp dev` and `fastmcp run`.

- `MCP_ENABLE_TOOLS="<patterns>"`: Only enable tools whose module names match one of the comma-separated patterns.
  - Supports glob-style wildcards (`*`, `?`).
  - For tools in the `tools/` directory following the `tool_<name>.py` convention, you can use either the full module name (e.g., `tool_add`) or the base name (e.g., `add`) in your patterns.
  - Example:
    ```shell
    export MCP_ENABLE_TOOLS="add,echo*,tool_cbom_*"
    ```
- `MCP_DISABLE_TOOLS="<patterns>"`: Enable all tools except those whose module names match one of the comma-separated patterns.
  - Supports glob-style wildcards (`*`, `?`).
  - For tools in the `tools/` directory following the `tool_<name>.py` convention, you can use either the full module name (e.g., `tool_calculator`) or the base name (e.g., `calculator`) in your patterns.
  - Example:
    ```shell
    export MCP_DISABLE_TOOLS="calculator,*search"
    ```

Note: The `MCP_ENABLE_TOOLS` and `MCP_DISABLE_TOOLS` environment variables are mutually exclusive. If both are set, the server will print an error and exit. Remember to `unset` these variables when you want to revert to the default behavior or change the configuration.
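The pattern matching described above can be sketched with Python's standard `fnmatch` module. This is an illustrative approximation of the behavior, not necessarily the server's actual implementation:

```python
from fnmatch import fnmatch

def tool_enabled(module_name: str, patterns: list[str]) -> bool:
    """Return True if module_name matches any glob pattern.

    For modules named tool_<name>, the bare <name> is also tried,
    mirroring the tool_<name>.py convention described above.
    """
    candidates = [module_name]
    if module_name.startswith("tool_"):
        candidates.append(module_name[len("tool_"):])  # base name, e.g. "add"
    return any(fnmatch(c, p) for c in candidates for p in patterns)

patterns = "add,echo*,tool_cbom_*".split(",")
print(tool_enabled("tool_add", patterns))         # True: base name "add" matches
print(tool_enabled("tool_echo_text", patterns))   # True: "echo*" matches base name
print(tool_enabled("tool_calculator", patterns))  # False: no pattern matches
```

With `MCP_DISABLE_TOOLS`, the same matching logic would apply with the result inverted.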
## Available Tools

| Tool | Description | Backend Service | Required Configuration |
|---|---|---|---|
| add | Simple addition tool | Local computation | None |
| calculator | Evaluates mathematical expressions | Local computation | None |
| calculate_bmi | Calculates Body Mass Index | Local computation | None |
| echo | Returns input text unchanged | Local computation | None |
| long_task | Processes files with progress tracking | Local file system | None |
| duckduckgo_search | Web search using DuckDuckGo | DuckDuckGo HTML endpoint | None |
| wikipedia_search | Searches Wikipedia articles | Wikipedia API | None |
| fetch_weather | Gets current weather by location | OpenWeatherMap API | OPENWEATHER_API_KEY |
| openmeteo_forecast | Gets detailed weather forecasts | Open-Meteo API | None |
| news_search | Searches for recent news articles | NewsAPI | NEWSAPI_KEY |
| tavily_search | AI-powered web search | Tavily API | TAVILY_API_KEY |
| arxiv_search | Searches academic papers | arXiv API | None |
| github_get_file | Retrieves file contents from GitHub | GitHub API | GITHUB_PERSONAL_ACCESS_TOKEN |
| github_list_issues | Lists issues in a repository | GitHub API | GITHUB_PERSONAL_ACCESS_TOKEN |
| github_create_issue | Creates a new issue in a repository | GitHub API | GITHUB_PERSONAL_ACCESS_TOKEN |
| github_list_pull_requests | Lists PRs in a repository | GitHub API | GITHUB_PERSONAL_ACCESS_TOKEN |
| github_search_code | Searches code on GitHub | GitHub API | GITHUB_PERSONAL_ACCESS_TOKEN |
| github_user_activity | Gets a user's GitHub activity summary | GitHub API | GITHUB_PERSONAL_ACCESS_TOKEN |
| create_thumbnail | Creates image thumbnails | Local image processing | None |
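To illustrate what a purely local tool such as `calculator` might do internally (a hedged sketch, not the repository's actual implementation), arithmetic expressions can be evaluated safely by walking a restricted AST instead of calling `eval`:

```python
import ast
import operator

# Whitelisted operators; anything outside this set is rejected.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def calculate(expression: str) -> float:
    """Evaluate a basic arithmetic expression without using eval()."""
    def _eval(node: ast.AST) -> float:
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"Unsupported expression element: {ast.dump(node)}")
    return _eval(ast.parse(expression, mode="eval"))

print(calculate("2 * (3 + 4) - 1"))  # 13
```

The same whitelist approach generalizes: names, attribute access, and function calls all fall through to the `ValueError`, so input like `__import__('os')` is rejected rather than executed.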
## Wrapping stdio tools

To wrap a standard input/output (stdio) tool as an MCP server, use the `wrapper/stdio_mcp_wrapper.py` script:

```shell
python stdio_mcp_wrapper.py --executable <path_to_executable> [options]
```

Required argument:

- `--executable <path>`: Path to the stdio executable you want to wrap.

Options:

- `--args <arg1> <arg2> ...`: Arguments to pass to the executable (default: none).
- `--tool-name <name>`: Name of the MCP tool (default: `stdio_tool`).
- `--tool-description <description>`: Description of the tool (default: `Executes a wrapped stdio command.`).
- `--port <port>`: Port for the MCP server to listen on (default: 3001).
- `--env <KEY=VALUE>`: Set an environment variable for the executable. Can be used multiple times. If not used, the tool inherits the environment of the `stdio_mcp_wrapper.py` process.
- `--timeout <seconds>`: Timeout in seconds for the executable (default: 30).
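The core of such a wrapper — run the executable with optional extra environment variables and a timeout, feed it stdin, and capture stdout — can be sketched as follows. This is illustrative only; the script's actual internals may differ:

```python
import os
import subprocess
import sys

def run_stdio_tool(executable, args=None, stdin_text="",
                   env_overrides=None, timeout=30):
    """Run a stdio executable once and return its stdout as text."""
    env = None  # None means: inherit the wrapper's environment unchanged
    if env_overrides:
        # Start from the wrapper's own environment, then apply overrides,
        # mirroring the --env behavior described above.
        env = {**os.environ, **env_overrides}
    result = subprocess.run(
        [executable, *(args or [])],
        input=stdin_text,
        capture_output=True,
        text=True,
        env=env,
        timeout=timeout,  # raises subprocess.TimeoutExpired on expiry
    )
    result.check_returncode()  # raise CalledProcessError on non-zero exit
    return result.stdout

# Demo: wrap the Python interpreter itself as a trivial upper-casing stdio tool.
out = run_stdio_tool(
    sys.executable,
    args=["-c", "import sys; sys.stdout.write(sys.stdin.read().upper())"],
    stdin_text="hello",
)
print(out)  # HELLO
```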
## Environment Variable Configuration

To use tools that require API keys, add the following to your environment:

```shell
# Weather services
export OPENWEATHER_API_KEY="your_openweather_api_key"

# News services
export NEWSAPI_KEY="your_newsapi_key"

# Search services
export TAVILY_API_KEY="your_tavily_api_key"

# GitHub tools
export GITHUB_PERSONAL_ACCESS_TOKEN="your_github_personal_access_token"
```
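Before launching, it can be handy to check which of these keys are still unset. The helper below is an illustrative convenience, not part of the repository:

```python
import os

# Maps an example tool to the key it requires, per the table above.
REQUIRED_KEYS = {
    "fetch_weather": "OPENWEATHER_API_KEY",
    "news_search": "NEWSAPI_KEY",
    "tavily_search": "TAVILY_API_KEY",
    "github_get_file": "GITHUB_PERSONAL_ACCESS_TOKEN",
}

def missing_keys(environ=os.environ):
    """Return the sorted list of required keys that are unset or empty."""
    return sorted({key for key in REQUIRED_KEYS.values() if not environ.get(key)})

for key in missing_keys():
    print(f"warning: {key} is not set; the dependent tools will fail")
```

Tools whose key is missing will fail only when actually invoked, so unset keys do not block the server from starting.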
## Sample Chat Application

The repository includes a sample chat application that demonstrates how to use MCP tools with the Ollama LLM service.

### Prerequisites

- Install Ollama from https://ollama.ai/
- Pull the granite model: `ollama pull granite3.2:latest` (or use any other model)
- Install additional dependencies: `uv pip install litellm colorama python-dotenv httpx`
### Configuration

Create a `.env` file in the project root with your configuration:

```shell
# Ollama configuration
OLLAMA_SERVER=http://localhost:11434
OLLAMA_MODEL=granite3.2:latest  # Change to any model you have pulled

# MCP server endpoint (default is localhost:3000)
MCP_ENDPOINT=localhost:3000

# Logging configuration
LOG_FILE=chat_interactions.log

# API keys for various services
OPENWEATHER_API_KEY=your_api_key_here
NEWSAPI_KEY=your_api_key_here
TAVILY_API_KEY=your_api_key_here
GITHUB_PERSONAL_ACCESS_TOKEN=your_token_here
```
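The chat application presumably loads these values via `python-dotenv` (listed in the prerequisites). In essence, loading a `.env` file amounts to the minimal sketch below; the real library handles quoting and other edge cases:

```python
import os

def load_env_file(path=".env"):
    """Minimal .env loader: KEY=VALUE lines, '#' comments, blank lines.

    setdefault means values already present in the environment win,
    so exported shell variables override the file.
    """
    with open(path) as fh:
        for raw in fh:
            line = raw.split("#", 1)[0].strip()  # drop comments
            if not line or "=" not in line:
                continue
            key, value = line.split("=", 1)
            os.environ.setdefault(key.strip(), value.strip())

# Usage:
# load_env_file()
# model = os.environ.get("OLLAMA_MODEL", "granite3.2:latest")
```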
### Launch the Chat Application

First, start the MCP server in one terminal:

```shell
uv run mcp dev server.py
```

Then, run the chat application in another terminal:

```shell
python clients/run_chat.py
```

Interact with the LLM, which now has access to all the tools provided by the MCP server.
### Features

- The chat application automatically uses the MCP tools when appropriate
- All interactions are logged to the file specified in `LOG_FILE`
- Tools are called when the LLM decides they're needed to answer a question
- Tool parameters are automatically populated based on the LLM's understanding of the query

### Caveats

- It doesn't yet work with the default model... work in progress!
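The tool-calling loop such a client typically implements can be sketched as follows: take a tool call requested by the LLM, dispatch it to the matching tool, and feed the result back as the tool's output. The registry here is a hypothetical stand-in for tools discovered from the MCP server; the repository's client may be structured differently:

```python
import json

# Hypothetical local registry standing in for tools discovered over MCP.
TOOLS = {
    "add": lambda a, b: a + b,
    "echo": lambda text: text,
}

def dispatch_tool_call(call_json: str) -> str:
    """Execute one LLM-requested tool call of the form
    {"name": ..., "arguments": {...}} and return the result as a string."""
    call = json.loads(call_json)
    func = TOOLS.get(call["name"])
    if func is None:
        # Returning an error message lets the LLM recover gracefully.
        return f"error: unknown tool {call['name']!r}"
    result = func(**call["arguments"])
    return str(result)  # fed back to the LLM as the tool message

print(dispatch_tool_call('{"name": "add", "arguments": {"a": 2, "b": 3}}'))  # 5
```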