multi-ai-advisor-mcp

The Multi-Model Advisor is an MCP (Model Context Protocol) server that queries multiple Ollama AI models and combines their responses to offer diverse perspectives on a single question. It integrates with Claude for Desktop and can be configured and customized to provide more comprehensive answers.

What are the prerequisites for using the Multi-Model Advisor?

You need Node.js 16.x or higher, Ollama installed and running, and Claude for Desktop for the complete advisory experience.

How do I configure the Multi-Model Advisor?

Create a .env file in the project root containing the server settings and the Ollama connection settings (such as the API URL and the default models to query).
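A .env file might look like the following sketch. Only OLLAMA_API_URL is referenced elsewhere in this document; the other variable names are illustrative assumptions, so check the project's documentation for the exact keys:

```
# Server settings (SERVER_PORT is an assumed example key)
SERVER_PORT=3000

# Ollama connection (OLLAMA_API_URL is the variable used in troubleshooting)
OLLAMA_API_URL=http://localhost:11434

# Default models to query (DEFAULT_MODELS is an assumed example key)
DEFAULT_MODELS=llama3,gemma
```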

What should I do if the server can't connect to Ollama?

Ensure Ollama is running, check the OLLAMA_API_URL in your .env file, and verify Ollama is responding by accessing http://localhost:11434.

How can I see which Ollama models are available?

Use the 'list-available-models' tool to display all installed Ollama models and indicate which ones are configured as defaults.
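Under the hood, Ollama reports installed models through its GET /api/tags endpoint, which returns a JSON object with a models array. A helper like the following (the function name modelNames and the sample model names are illustrative) extracts the names from that response shape:

```javascript
// Pull model names out of the JSON returned by Ollama's GET /api/tags endpoint.
function modelNames(tagsResponse) {
  return (tagsResponse.models || []).map((m) => m.name);
}

// Illustrative response shape (model names are examples, not your installation):
const sample = {
  models: [{ name: "llama3:latest" }, { name: "gemma:2b" }],
};
console.log(modelNames(sample)); // → [ 'llama3:latest', 'gemma:2b' ]
```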

What if Claude doesn't show MCP tools?

Ensure you've restarted Claude after updating the configuration, check the absolute path in claude_desktop_config.json, and look at Claude's logs for error messages.
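For reference, an MCP server entry in claude_desktop_config.json generally looks like the sketch below. The server name and the script path are placeholders (the path must be absolute, as noted above):

```json
{
  "mcpServers": {
    "multi-model-advisor": {
      "command": "node",
      "args": ["/absolute/path/to/multi-ai-advisor-mcp/build/index.js"]
    }
  }
}
```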