Multi LLM Cross-Check MCP Server
A Model Context Protocol (MCP) server that allows cross-checking responses from multiple LLM providers simultaneously. This server integrates with Claude Desktop to provide a unified interface for querying different LLM APIs.
Features
- Query multiple LLM providers in parallel
- Currently supports:
  - OpenAI (ChatGPT)
  - Anthropic (Claude)
  - Perplexity AI
  - Google (Gemini)
- Asynchronous parallel processing for faster responses
- Easy integration with Claude Desktop
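The parallel-querying idea can be sketched with `asyncio`. This is a minimal illustration, not the server's actual code: `query_provider` is a hypothetical stand-in for each provider's real API client.

```python
import asyncio
from typing import Dict, List, Tuple

# Hypothetical stand-in for a real provider API call; the actual server
# would invoke each provider's SDK here.
async def query_provider(name: str, prompt: str) -> Tuple[str, str]:
    await asyncio.sleep(0)  # placeholder for the network round trip
    return name, f"{name} response to: {prompt}"

async def cross_check(prompt: str, providers: List[str]) -> Dict[str, str]:
    # asyncio.gather issues all provider queries concurrently, so total
    # latency is roughly that of the slowest provider rather than the sum.
    results = await asyncio.gather(
        *(query_provider(p, prompt) for p in providers)
    )
    return dict(results)

responses = asyncio.run(
    cross_check("What is MCP?", ["OpenAI", "Anthropic", "Perplexity AI", "Google"])
)
```

Because the queries run concurrently, adding a provider costs little extra wall-clock time.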
Prerequisites
- Python 3.8 or higher
- API keys for the LLM providers you want to use
- uv package manager
Using the MCP Server
- The server automatically starts with Claude Desktop
- Use the `cross_check` tool by asking to "cross check with other LLMs"
- Provide a prompt to receive responses from all configured LLM providers
Error Handling
- Providers without an API key are skipped
- API errors are caught and returned in the response
- Each provider's response is independent, so a failure from one does not block the others
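This per-provider error isolation can be sketched with `asyncio.gather(..., return_exceptions=True)`; again a hypothetical illustration, with `query_provider` standing in for the real API clients.

```python
import asyncio
from typing import Dict, List

# Hypothetical provider call; one provider deliberately fails to show
# that errors are captured per provider rather than raised globally.
async def query_provider(name: str, prompt: str) -> str:
    if name == "BadProvider":
        raise RuntimeError("API error: invalid key")
    return f"{name}: ok"

async def cross_check(prompt: str, providers: List[str]) -> Dict[str, str]:
    # return_exceptions=True keeps one provider's failure from cancelling
    # the others; exceptions come back as values we report inline.
    results = await asyncio.gather(
        *(query_provider(p, prompt) for p in providers),
        return_exceptions=True,
    )
    return {
        p: (f"Error: {r}" if isinstance(r, Exception) else r)
        for p, r in zip(providers, results)
    }

responses = asyncio.run(cross_check("hi", ["OpenAI", "BadProvider"]))
```

The failing provider's entry carries its error message while the healthy provider's answer is returned normally.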