# omnillm-mcp
## OmniLLM: Universal LLM Bridge for Claude

OmniLLM is an MCP (Model Context Protocol) server that bridges Claude to other large language models, including OpenAI's ChatGPT, Azure OpenAI, and Google Gemini, through a single unified query interface.
## Features
- Query OpenAI's ChatGPT models
- Query Azure OpenAI services
- Query Google's Gemini models
- Get responses from all LLMs for comparison
- Check which LLM services are configured and available
## Setup Instructions
### Prerequisites
- Python 3.10+
- Claude Desktop application
- API keys for the LLM services you want to use (OpenAI, Azure OpenAI, and/or Google Gemini)
### Installation
1. Clone the repository and create a virtual environment.
2. Install the Python dependencies into that environment.
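Assuming a standard Python project layout (the repository URL and the requirements file name are placeholders, not taken from this README), the steps above might look like:

```shell
# Clone the repository (URL is a placeholder -- substitute the actual repo URL).
git clone https://github.com/<your-username>/omnillm-mcp.git
cd omnillm-mcp

# Create and activate a virtual environment (Python 3.10+).
python3 -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate

# Install dependencies (file name assumed; check the repo for the actual one).
pip install -r requirements.txt
```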
### Configuration

Create a `.env` file in the project root containing your API keys.
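A `.env` file might contain entries like the following (variable names are illustrative; check the server's source for the exact names it reads):

```
OPENAI_API_KEY=your-openai-key
AZURE_OPENAI_API_KEY=your-azure-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
GOOGLE_API_KEY=your-google-key
```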
### Integrating with Claude Desktop

Register OmniLLM in Claude Desktop's MCP server configuration (`claude_desktop_config.json`) so that Claude can launch the server and connect to it.
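One way to register the server, assuming the standard `mcpServers` entry format used by Claude Desktop (the server name, path, and script name below are placeholders):

```json
{
  "mcpServers": {
    "omnillm": {
      "command": "python",
      "args": ["/absolute/path/to/omnillm-mcp/server.py"]
    }
  }
}
```

After editing the configuration, restart Claude Desktop so it picks up the new server.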
## Usage Examples

Once connected, you can ask Claude to consult other LLMs directly in conversation, for example: "Ask ChatGPT how it would approach this problem," or "Compare how all available LLMs answer this question."
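The "compare responses" feature can be sketched as a simple fan-out over the configured services. The sketch below uses stub backends in place of real API clients; all names are illustrative, not OmniLLM's actual code:

```python
# Illustrative sketch (not OmniLLM's implementation): fan one prompt out to
# several LLM backends and collect the replies for side-by-side comparison.
from typing import Callable, Dict

# Stub backends standing in for real API clients (OpenAI, Azure, Gemini).
def query_openai(prompt: str) -> str:
    return f"[openai] {prompt}"

def query_azure(prompt: str) -> str:
    return f"[azure] {prompt}"

def query_gemini(prompt: str) -> str:
    return f"[gemini] {prompt}"

# Registry of available services; a real server would only include
# backends whose API keys are configured.
BACKENDS: Dict[str, Callable[[str], str]] = {
    "openai": query_openai,
    "azure": query_azure,
    "gemini": query_gemini,
}

def query_all(prompt: str) -> Dict[str, str]:
    """Return one response per configured backend, keyed by service name."""
    return {name: fn(prompt) for name, fn in BACKENDS.items()}

if __name__ == "__main__":
    for name, reply in query_all("What is MCP?").items():
        print(f"{name}: {reply}")
```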