# MCP_Router_Server
A FastAPI-based MCP-compliant server that routes requests to multiple LLM providers via a unified REST API.
The MCP Router Server is built on FastAPI and exposes a unified REST API for communicating with multiple large language model (LLM) providers, including OpenAI, LM Studio, OpenRouter, Ollama, Claude, and Azure. Provider keys and options are set through environment variables, so switching providers requires no code changes. The server handles context-aware chat requests via the `/ask` endpoint and exposes a `/health` endpoint for liveness checks, making it straightforward to deploy and to integrate multiple LLM services into an application.
## Features
- Switch LLM providers by editing `.env`
- `/ask` endpoint for context-aware chat requests
- Health check at `/health`
- Provider abstraction for multiple LLM services
- Easy deployment and configuration
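The provider abstraction can be pictured as a common interface plus a registry keyed by an environment variable. The sketch below is illustrative only, assuming this design; `BaseProvider`, `EchoProvider`, `PROVIDERS`, and the `LLM_PROVIDER` variable are hypothetical names, not the server's actual classes or settings:

```python
import os
from abc import ABC, abstractmethod


class BaseProvider(ABC):
    """Common interface every LLM backend would implement (illustrative)."""

    @abstractmethod
    def chat(self, history: list[dict]) -> str:
        """Send a chat history and return the model's reply."""


class EchoProvider(BaseProvider):
    """Stand-in backend so the sketch runs without any API keys."""

    def chat(self, history: list[dict]) -> str:
        return f"echo: {history[-1]['content']}"


# Registry mapping a provider name (hypothetically read from .env) to a class.
PROVIDERS: dict[str, type[BaseProvider]] = {"echo": EchoProvider}


def get_provider() -> BaseProvider:
    name = os.getenv("LLM_PROVIDER", "echo")  # hypothetical env var name
    return PROVIDERS[name]()


reply = get_provider().chat([{"role": "user", "content": "who are you?"}])
print(reply)  # echo: who are you?
```

Keeping provider selection behind a single factory like this is what lets `.env` edits switch backends without touching endpoint code.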
## Usage with Different Platforms
### Quickstart

```bash
pip install -r requirements.txt
cp .env.example .env
# Edit .env to set your provider keys and options
uvicorn app.main:app --reload
curl http://localhost:8000/health
curl -X POST http://localhost:8000/ask \
  -H "Content-Type: application/json" \
  -d '{
    "identity": { "user_id": "demo" },
    "memory": { "history": [ { "role": "system", "content": "You are a helpful assistant that only answers in haikus." }, { "role": "user", "content": "who are you?" } ] },
    "tools": [],
    "docs": [],
    "extra": {}
  }'
```
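The `/ask` request body above has five top-level fields. A small Python helper can assemble the same payload; this is a sketch assuming only the schema visible in the curl example (`identity`, `memory`, `tools`, `docs`, `extra`), and the HTTP call is left commented out so the snippet runs without the server:

```python
import json


def build_ask_payload(user_id: str, history: list[dict]) -> dict:
    """Assemble an /ask request body matching the fields in the curl example."""
    return {
        "identity": {"user_id": user_id},
        "memory": {"history": history},
        "tools": [],
        "docs": [],
        "extra": {},
    }


payload = build_ask_payload("demo", [
    {"role": "system", "content": "You are a helpful assistant that only answers in haikus."},
    {"role": "user", "content": "who are you?"},
])
body = json.dumps(payload)

# To actually send it, the quickstart server must be running:
# import requests
# r = requests.post("http://localhost:8000/ask", data=body,
#                   headers={"Content-Type": "application/json"})

print(sorted(payload))  # ['docs', 'extra', 'identity', 'memory', 'tools']
```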
### Anthropic

```bash
curl -X POST http://localhost:8000/ask \
  -H "Content-Type: application/json" \
  -d '{
    "identity": { "user_id": "demo" },
    "memory": { "history": [ { "role": "user", "content": "You are a helpful assistant that only answers in haikus.\n\nwho are you?" } ] },
    "tools": [],
    "docs": [],
    "extra": {}
  }'
```
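Note how this example differs from the quickstart: the system instruction is folded into the first user message rather than sent as a separate `system` turn. A helper for that transformation might look like the sketch below; it captures only the difference implied by the two curl examples and is not code from the server itself:

```python
def fold_system_into_user(history: list[dict]) -> list[dict]:
    """Merge leading system messages into the first user message,
    mirroring the Anthropic-style request shown above."""
    system_parts = [m["content"] for m in history if m["role"] == "system"]
    rest = [dict(m) for m in history if m["role"] != "system"]
    if system_parts and rest and rest[0]["role"] == "user":
        prefix = "\n\n".join(system_parts)
        rest[0]["content"] = f"{prefix}\n\n{rest[0]['content']}"
    return rest


history = [
    {"role": "system", "content": "You are a helpful assistant that only answers in haikus."},
    {"role": "user", "content": "who are you?"},
]
folded = fold_system_into_user(history)
print(folded[0]["content"])
# You are a helpful assistant that only answers in haikus.
#
# who are you?
```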
## Related MCP Servers
- **context7** by upstash — Context7 MCP provides up-to-date, version-specific documentation and code examples directly in your prompt, ensuring LLMs use the latest information.
- **Sequential Thinking** by modelcontextprotocol — An MCP server that provides a tool for dynamic, reflective problem-solving through a structured thinking process.
- **git-mcp** by idosal — GitMCP is a free, open-source, remote Model Context Protocol (MCP) server that transforms GitHub projects into documentation hubs, giving AI tools access to up-to-date documentation and code.
- **Everything MCP Server** by modelcontextprotocol — A comprehensive test server demonstrating the full capabilities of MCP; not intended for production, but valuable for developers building MCP clients.
- **exa-mcp-server** by exa-labs — An MCP server that lets AI assistants use the Exa AI Search API for secure, real-time web searches.
- **repomix** by yamadashy — Repomix packs your codebase into AI-friendly formats, making it easier to use with AI tools like LLMs.
- **mcpdoc** by langchain-ai — The MCP LLMS-TXT Documentation Server provides a structured way to manage and retrieve LLM documentation using the Model Context Protocol.