mcp-server-ollama-deep-researcher

MCP Server: Ollama Deep Researcher
This project is a Model Context Protocol (MCP) server that adapts the LangChain Ollama Deep Researcher, letting AI assistants conduct in-depth research with local LLMs running under Ollama. The server generates web search queries, retrieves results via the Tavily or Perplexity API, summarizes them, identifies knowledge gaps, and iteratively refines the summary, delivering a final markdown report with all sources cited.
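To make that loop concrete, here is a minimal Python sketch of the workflow. It is an illustration only, not the server's actual code: it assumes the `ollama` and `tavily-python` packages, an arbitrary local model name, and a fixed loop count, and it compresses each step into a single prompt.

```python
import os

import ollama                    # pip install ollama
from tavily import TavilyClient  # pip install tavily-python

MODEL = "llama3.2"  # assumption: any locally pulled Ollama model
tavily = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])

def ask(prompt: str) -> str:
    """One-shot prompt to the local LLM, returning the reply text."""
    reply = ollama.chat(model=MODEL, messages=[{"role": "user", "content": prompt}])
    return reply["message"]["content"]

def research(topic: str, max_loops: int = 3) -> str:
    """Iteratively search, summarize, and refine; return a markdown report."""
    summary, sources = "", []
    query = ask(f"Write one concise web search query about: {topic}")
    for _ in range(max_loops):
        results = tavily.search(query)["results"]           # web search step
        sources += [r["url"] for r in results]
        context = "\n\n".join(r["content"] for r in results)
        summary = ask(                                      # fold results into summary
            f"Topic: {topic}\n\nCurrent summary:\n{summary}\n\n"
            f"New search results:\n{context}\n\nWrite an updated summary."
        )
        gap = ask(f"Name one knowledge gap in this summary, or reply NONE:\n{summary}")
        if "NONE" in gap.upper():                           # no gaps left: stop early
            break
        query = ask(f"Write one web search query that would fill this gap: {gap}")
    return summary + "\n\n## Sources\n" + "\n".join(f"- {u}" for u in sources)

if __name__ == "__main__":
    print(research("post-quantum cryptography migration"))
```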
Core Functionality
- Web search query generation and result summarization
- Iterative research process to refine summaries
- Markdown summary with all sources
Prerequisites
- Node.js, Python 3.10+, and enough local compute to run your chosen model through Ollama
- API keys for Tavily or Perplexity (web search) and LangSmith (tracing); a quick environment check is sketched below
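A small Python script can verify this setup before starting the server. The environment variable names and the default Ollama port below are common conventions, not values taken from this project's docs:

```python
import os
import sys
import urllib.request

# Assumption: the server reads keys from these commonly used variable names.
for key in ("TAVILY_API_KEY", "PERPLEXITY_API_KEY", "LANGSMITH_API_KEY"):
    print(f"{key}: {'set' if os.environ.get(key) else 'MISSING'}")

print("Python OK:", sys.version_info >= (3, 10))

try:
    # Ollama's default local endpoint; /api/tags lists installed models.
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=3) as resp:
        print("Ollama reachable:", resp.status == 200)
except OSError:
    print("Ollama not reachable on localhost:11434 -- is `ollama serve` running?")
```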
Installation
Standard and Docker installation paths are available, with steps covering cloning the repository, installing dependencies, and building the server.
Client Configuration
Instructions are provided for integrating the server with the Claude Desktop app or the Cline VS Code extension.
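For Claude Desktop, MCP servers are registered under the `mcpServers` key of `claude_desktop_config.json`. The entry below is a sketch only: the command, path, and environment variables are placeholders to adapt to your checkout, not values from this project's docs.

```json
{
  "mcpServers": {
    "ollama-deep-researcher": {
      "command": "node",
      "args": ["/path/to/mcp-server-ollama-deep-researcher/build/index.js"],
      "env": {
        "TAVILY_API_KEY": "tvly-..."
      }
    }
  }
}
```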
Tracing and Monitoring
Integration with LangSmith for tracing, monitoring, and debugging research workflows.
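LangSmith tracing is conventionally switched on through environment variables. The names below are LangSmith's documented defaults, though how this server picks them up is an assumption:

```python
import os

os.environ["LANGCHAIN_TRACING_V2"] = "true"                 # enable tracing
os.environ["LANGCHAIN_API_KEY"] = "lsv2_..."                # your LangSmith key
os.environ["LANGCHAIN_PROJECT"] = "ollama-deep-researcher"  # group traces per project
```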
MCP Resources
Research outputs are stored as MCP resources for persistent access and streamlined context management.
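Stored research outputs can be enumerated and re-read through the standard MCP resource operations. The sketch below uses the official `mcp` Python SDK; the server launch command and the shape of the resource URIs are assumptions:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Assumption: the built server is started with `node build/index.js`.
    server = StdioServerParameters(command="node", args=["build/index.js"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            listed = await session.list_resources()  # enumerate stored outputs
            for res in listed.resources:
                print(res.uri, "-", res.name)
            if listed.resources:
                # Re-read a past result without re-running the research.
                content = await session.read_resource(listed.resources[0].uri)
                print(content)

asyncio.run(main())
```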
Available Tools
- Set research parameters, run research on a topic, and check the status of a research run (example invocations are sketched below)
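Continuing the client sketch above, the three operations would map onto MCP tool calls roughly as follows. The tool names and argument shapes here are hypothetical; `session.list_tools()` reports the real ones:

```python
from mcp import ClientSession

async def run_research(session: ClientSession, topic: str) -> None:
    # Hypothetical tool names and arguments -- check session.list_tools().
    await session.call_tool("configure", {"model": "llama3.2", "max_loops": 3})
    await session.call_tool("research", {"topic": topic})
    status = await session.call_tool("status", {})
    print(status.content)  # progress, or where the final summary lives
```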
Troubleshooting
Guidance on resolving common connection, API key, Docker, and build issues.
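When a failure is ambiguous, testing each dependency in isolation helps. For example, a direct search call (sketch below, assuming `tavily-python` and a `TAVILY_API_KEY` variable) separates API-key problems from server or Docker problems:

```python
import os

from tavily import TavilyClient

try:
    client = TavilyClient(api_key=os.environ.get("TAVILY_API_KEY", ""))
    hits = client.search("model context protocol")["results"]
    print(f"Tavily OK: {len(hits)} results returned")
except Exception as exc:  # invalid key, network failure, rate limit, ...
    print("Tavily check failed:", exc)
```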