lior-ps_multi-llm-cross-check-mcp-server
The Multi LLM Cross-Check MCP Server cross-checks responses from multiple large language model providers simultaneously through a single unified interface that integrates with Claude Desktop. It supports asynchronous queries to major providers such as OpenAI, Anthropic, Perplexity AI, and Google.
Multi LLM Cross-Check MCP Server
A Model Context Protocol (MCP) server that cross-checks responses from multiple LLM providers simultaneously. It integrates with Claude Desktop to provide a unified interface for querying different LLM APIs.
Features
- Query multiple LLM providers in parallel
- Supports OpenAI (ChatGPT), Anthropic (Claude), Perplexity AI, and Google (Gemini)
- Asynchronous processing for faster responses (illustrated in the sketch after this list)
- Easy integration with Claude Desktop
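To picture the parallel, asynchronous fan-out, here is a minimal sketch using `asyncio` and `httpx`. It is not the server's actual code: the endpoint URLs follow the publicly documented OpenAI Chat Completions and Anthropic Messages APIs, and the model names and environment variable names are illustrative assumptions. Perplexity AI and Gemini would be added with the same pattern.

```python
# Illustrative sketch (not the server's implementation): fan one prompt out to
# two providers concurrently and collect the raw JSON replies.
import asyncio
import os

import httpx


async def ask_openai(client: httpx.AsyncClient, prompt: str) -> dict:
    # Standard Chat Completions endpoint; model name is an assumption.
    resp = await client.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={"model": "gpt-4o-mini", "messages": [{"role": "user", "content": prompt}]},
    )
    resp.raise_for_status()
    return resp.json()


async def ask_anthropic(client: httpx.AsyncClient, prompt: str) -> dict:
    # Standard Messages endpoint; model name is an assumption.
    resp = await client.post(
        "https://api.anthropic.com/v1/messages",
        headers={
            "x-api-key": os.environ["ANTHROPIC_API_KEY"],
            "anthropic-version": "2023-06-01",
        },
        json={
            "model": "claude-3-5-sonnet-latest",
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": prompt}],
        },
    )
    resp.raise_for_status()
    return resp.json()


async def cross_check(prompt: str) -> dict:
    # asyncio.gather issues both requests concurrently, so total latency is
    # roughly that of the slowest provider rather than the sum of all calls.
    async with httpx.AsyncClient(timeout=60) as client:
        openai_reply, anthropic_reply = await asyncio.gather(
            ask_openai(client, prompt),
            ask_anthropic(client, prompt),
        )
    return {"openai": openai_reply, "anthropic": anthropic_reply}


if __name__ == "__main__":
    print(asyncio.run(cross_check("What is the capital of France?")))
```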
Prerequisites
- Python 3.8 or higher
- API keys for the desired LLM providers (see the pre-flight sketch after this list)
- uv package manager
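Provider credentials are typically supplied through environment variables. The variable names below are assumptions chosen for illustration, not necessarily the ones this server reads; the sketch simply shows how a pre-flight check could decide which providers to query.

```python
# Illustrative pre-flight check: only providers whose API key is present in
# the environment are enabled. Variable names are assumptions.
import os
from typing import List

PROVIDER_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "perplexity": "PERPLEXITY_API_KEY",
    "gemini": "GEMINI_API_KEY",
}


def enabled_providers() -> List[str]:
    """Return the names of providers that have an API key configured."""
    return [name for name, var in PROVIDER_KEYS.items() if os.environ.get(var)]


if __name__ == "__main__":
    active = enabled_providers()
    print(f"Cross-checking against: {', '.join(active) or 'no providers configured'}")
```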
Using the MCP Server
- Automatically starts with Claude Desktop
- Use the `cross_check` tool by prompting Claude to "cross check with other LLMs" (a registration sketch follows below)
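For context, an MCP tool like this is typically registered with the official Python SDK's FastMCP helper, roughly as sketched below. The tool name matches the one mentioned above, but the body and the `query_all_providers` helper are placeholders, not the project's actual code.

```python
# Minimal sketch of exposing a cross_check tool over MCP with the official
# Python SDK (pip install mcp). The provider fan-out is stubbed out.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("multi-llm-cross-check")


async def query_all_providers(prompt: str) -> dict:
    # Placeholder standing in for the asyncio fan-out shown earlier; the real
    # server would call OpenAI, Anthropic, Perplexity AI, and Gemini here.
    return {"openai": "...", "anthropic": "...", "perplexity": "...", "gemini": "..."}


@mcp.tool()
async def cross_check(prompt: str) -> dict:
    """Send the prompt to every configured LLM provider and return their answers."""
    return await query_all_providers(prompt)


if __name__ == "__main__":
    # Claude Desktop launches the server as a subprocess and talks to it over stdio.
    mcp.run()
```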