# Aider MCP Server

A Model Context Protocol (MCP) server that provides AI coding capabilities using Aider.
## Table of Contents

- [Overview](#overview)
- [Features](#features)
- [Prerequisites](#prerequisites)
- [Installation](#installation)
- [Configuration](#configuration)
- [Running the Server](#running-the-server)
- [MCP Tools](#mcp-tools)
- [Integration with MCP Clients](#integration-with-mcp-clients)
- [Troubleshooting](#troubleshooting)
- [License](#license)
## Overview

This MCP server leverages Aider, a powerful AI coding assistant, to provide coding capabilities through a standardized API. By offloading work to Aider as discrete, focused tasks, it can reduce costs while using model-specific capabilities to produce more reliable code through multiple targeted LLM calls.
## Features
- AI Code Generation: Run one-shot coding tasks with Aider to add, fix, or enhance code
- Model Selection: Query available models to choose the most appropriate one for your task
- Flexible Configuration: Configure Aider sessions with customizable settings
- Multi-transport Support: Run via Server-Sent Events (SSE) or stdio for flexible integration
- General Question Answering: Ask an LLM questions using a simple prompt
## Prerequisites
- Python: 3.10 or higher
- Package Manager: uv (recommended) or pip
- API Keys: Depending on the models you want to use, you'll need API keys for:
  - OpenAI (for GPT models)
  - Anthropic (for Claude models)
  - Google (for Gemini models)
## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/your-username/aider-mcp.git
   cd aider-mcp
   ```

2. Install the package:

   ```bash
   uv sync
   uv venv
   uv pip install -e .
   ```

3. Run the tests to ensure everything is working:

   ```bash
   uv run pytest
   ```
## Configuration

Configure the server behavior using environment variables in a `.env` file:

```bash
# Create environment file from example
cp .env.example .env
```

Edit the `.env` file to configure transport, host, port, and API keys.
| Variable | Description | Default | Required |
|----------|-------------|---------|----------|
| `TRANSPORT` | Transport protocol (`sse` or `stdio`) | `sse` | No |
| `HOST` | Host to bind to when using SSE transport | `0.0.0.0` | No |
| `PORT` | Port to listen on when using SSE transport | `8050` | No |
| `OPENAI_API_KEY` | API key for OpenAI models | | * |
| `ANTHROPIC_API_KEY` | API key for Anthropic models | | * |
| `GEMINI_API_KEY` | API key for Google Gemini models | | * |

\* Required only if using models from that provider
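As an illustration, a minimal `.env` for running over SSE with only a Gemini key configured might look like this (the key value is a placeholder):

```bash
# Example .env; replace the API key with your own
TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
GEMINI_API_KEY=your-gemini-api-key
```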
### Command Line Options

- `--editor-model`: Model to use for editing (default: `gemini/gemini-2.5-pro-exp-03-25`)
- `--architect-model`: Model to use for architecture planning (optional)
- `--cwd`: Current working directory (default: current directory)
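For example, to supply both an editor model and an architect model on the command line (the values shown are illustrative):

```bash
uv run python -m aider_mcp_server \
  --editor-model "gemini/gemini-2.5-pro-exp-03-25" \
  --architect-model "gemini/gemini-2.5-pro-preview-03-25" \
  --cwd "/path/to/project"
```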
## Running the Server

### Start the Server (SSE Mode)

```bash
# Using the module directly
uv run python -m aider_mcp_server
```

Or with custom settings:

```bash
uv run python -m aider_mcp_server --editor-model "gemini/gemini-2.5-pro-exp-03-25" --cwd "/path/to/project"
```

You should see output similar to:

```
Starting server with transport: sse
Using SSE transport on 0.0.0.0:8050
```
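As a quick sanity check, assuming the standard MCP SSE transport, the endpoint should accept a plain HTTP request and hold the connection open, streaming events rather than returning immediately:

```bash
# -N disables curl's output buffering so streamed SSE events appear as they arrive
curl -N http://localhost:8050/sse
```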
### Using stdio Mode
When using stdio mode, you don't need to start the server separately - the MCP client will start it automatically when configured properly (see Integration with MCP Clients).
## MCP Tools
The Aider MCP server exposes the following tools:
### ai_code

Run Aider to perform coding tasks.

Parameters:

- `ai_coding_prompt`: The prompt for the AI coding task
- `relative_editable_files`: List of files that can be edited
- `relative_readonly_files`: (Optional) List of files that should be read-only
- `settings`: (Optional) Settings for the Aider session

Example:

```json
{
  "ai_coding_prompt": "Add a function that calculates the factorial of a number",
  "relative_editable_files": ["math.py"],
  "settings": {
    "auto_commits": false,
    "use_git": false
  }
}
```
### get_models

List available Aider models filtered by substring.

Parameters:

- `substring`: Substring to filter models by

Example:

```json
{
  "substring": "openai"
}
```
### ask_question

Send a prompt to an LLM and return the response.

Parameters:

- `prompt`: The question or statement to send
- `model`: (Optional) The LLM model to use

Example:

```json
{
  "prompt": "What is the capital of France?",
  "model": "gpt-4o"
}
```
## Integration with MCP Clients

### SSE Integration

Configure your MCP client to connect to the SSE endpoint:

```json
{
  "mcpServers": {
    "aider-mcp-server": {
      "transport": "sse",
      "serverUrl": "http://localhost:8050/sse"
    }
  }
}
```
### Stdio Integration

Configure your MCP client to run the server via stdio:

```json
{
  "mcpServers": {
    "aider-mcp-server": {
      "transport": "stdio",
      "command": "python",
      "args": ["-m", "aider_mcp_server"],
      "env": {
        "TRANSPORT": "stdio"
      }
    }
  }
}
```
### Using Docker

```json
{
  "mcpServers": {
    "aider-mcp-server": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "--mount", "type=bind,source=<YOUR_FULL_PATH>",
        "-e", "TRANSPORT=stdio",
        "-e", "EDITOR_MODEL=gemini/gemini-2.5-pro-preview-03-25",
        "-e", "GEMINI_API_KEY=<YOUR_API_KEY>",
        "danielscholl/aider-mcp-server"
      ]
    }
  }
}
```
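The arguments above correspond roughly to the following manual `docker run` invocation, which can be useful for confirming that the image starts and the environment variables are accepted before wiring it into a client (the mount source and API key are placeholders):

```bash
# Manual test of the container over stdio; replace the placeholders before running
docker run -i --rm \
  --mount "type=bind,source=<YOUR_FULL_PATH>" \
  -e TRANSPORT=stdio \
  -e EDITOR_MODEL=gemini/gemini-2.5-pro-preview-03-25 \
  -e GEMINI_API_KEY=<YOUR_API_KEY> \
  danielscholl/aider-mcp-server
```

Started this way, the server waits for MCP messages on stdin, so an interactive run mainly confirms that the container launches cleanly.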