MCP_Router_Server

A FastAPI-based MCP-compliant server that routes requests to multiple LLM providers via a unified REST API.

The MCP Router Server is a FastAPI application that routes chat requests to multiple large language model (LLM) providers behind a single REST API. It supports OpenAI, LM Studio, OpenRouter, Ollama, Claude, and Azure, and switching between them requires only a change to environment variables: provider keys and options are read from a .env file. Context-aware chat requests go through the /ask endpoint, and a /health endpoint reports whether the server is running. Deployment and configuration are kept simple, making the server a practical way to put several LLM services behind one interface.

Features

  • Switch LLM providers by editing .env (see the sketch after this list)
  • /ask endpoint for context-aware chat requests
  • Health check at /health
  • Provider abstraction for multiple LLM services
  • Easy deployment and configuration
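
Provider selection and credentials live in .env. The exact variable names come from the repo's .env.example; the sketch below uses assumed names such as PROVIDER and OPENAI_API_KEY, so treat them as placeholders rather than the project's actual keys.

bash
# .env: assumed variable names, for illustration only.
# Check the repo's .env.example for the real keys.
PROVIDER=openai              # or: lmstudio | openrouter | ollama | claude | azure
OPENAI_API_KEY=sk-...        # API key for the selected provider
OPENAI_MODEL=gpt-4o-mini     # optional: model override

Depending on how the app loads .env, you may need to restart uvicorn for changes to take effect.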

Usage with Different Platforms

Quickstart

bash
pip install -r requirements.txt
cp .env.example .env
# Edit .env to set your provider keys and options
uvicorn app.main:app --reload
curl http://localhost:8000/health
curl -X POST http://localhost:8000/ask \
  -H "Content-Type: application/json" \
  -d '{
    "identity": { "user_id": "demo" },
    "memory": { "history": [ { "role": "system", "content": "You are a helpful assistant that only answers in haikus." }, { "role": "user", "content": "who are you?" } ] },
    "tools": [],
    "docs": [],
    "extra": {}
  }'
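
The /ask payload carries the MCP context blocks (identity, memory, tools, docs, and extra). The memory.history field holds an OpenAI-style list of role/content messages, which is how system instructions and user turns reach whichever provider is configured.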

Anthropic
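
Anthropic's Messages API takes the system prompt as a top-level parameter rather than as a message in the conversation history, so a system-role entry may not map cleanly onto Claude. This example plays it safe by folding the system instruction into the first user message. Switch the provider to Claude in .env (using whatever variable name your .env.example defines), then send: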

bash
curl -X POST http://localhost:8000/ask \
  -H "Content-Type: application/json" \
  -d '{
    "identity": { "user_id": "demo" },
    "memory": { "history": [ { "role": "user", "content": "You are a helpful assistant that only answers in haikus.\n\nwho are you?" } ] },
    "tools": [],
    "docs": [],
    "extra": {}
  }'