mcp-server-ollama-deep-researcher
The Ollama Deep Researcher is an MCP (Model Context Protocol) server that enables AI assistants to perform deep research on topics using local LLMs via Ollama. It is designed for comprehensive research workflows, providing tools for generating and iteratively refining summaries based on web search results.
What are the prerequisites for running the Ollama Deep Researcher?
You need Node.js, Python 3.10 or higher, hardware capable of running local LLMs through Ollama, and API keys for Tavily, Perplexity, and LangSmith.
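A quick preflight check can confirm the Python version and the presence of the API keys before starting the server. This sketch assumes the keys are supplied via environment variables; the variable names below are illustrative, not taken from the project:

```python
import os
import sys

# Hypothetical environment variable names for the required API keys.
REQUIRED_KEYS = ["TAVILY_API_KEY", "PERPLEXITY_API_KEY", "LANGSMITH_API_KEY"]

def missing_keys(env=os.environ):
    """Return the names of required API keys not present in env."""
    return [name for name in REQUIRED_KEYS if not env.get(name)]

if __name__ == "__main__":
    # Enforce the Python 3.10+ requirement.
    if sys.version_info < (3, 10):
        raise SystemExit("Python 3.10 or higher is required")
    missing = missing_keys()
    if missing:
        print("Missing API keys:", ", ".join(missing))
    else:
        print("All prerequisite API keys are set")
```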
How does the iterative research process work?
The process loops through generating a search query, gathering web results, summarizing them, identifying knowledge gaps in the summary, and using those gaps to drive the next query, iteratively improving the summary with each pass.
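The loop above can be sketched as follows. The callables (generate_query, web_search, summarize, find_gap) stand in for the LLM and search steps and are purely illustrative, not the project's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class ResearchState:
    """Accumulated state of one research run."""
    topic: str
    summary: str = ""
    notes: list = field(default_factory=list)

def deep_research(topic, generate_query, web_search, summarize, find_gap,
                  max_loops=3):
    """Iteratively refine a summary: query -> search -> summarize -> reflect."""
    state = ResearchState(topic)
    query = generate_query(topic)          # initial query from the topic
    for _ in range(max_loops):
        results = web_search(query)        # gather web results
        state.notes.extend(results)
        state.summary = summarize(state.summary, results)  # fold into summary
        gap = find_gap(state.summary)      # reflect: what is still missing?
        if gap is None:                    # no gaps left -> stop early
            break
        query = gap                        # next query targets the gap
    return state
```

Each iteration targets the gap identified in the previous one, so the summary converges on the topic rather than repeating the same search.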
Can I use Docker to run the MCP server?
Yes, Docker can be used to simplify the setup process and run the MCP server.
What APIs are supported for web search?
The server supports Tavily and Perplexity APIs for conducting web searches.
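As a rough sketch of how such a search backend is called, the snippet below builds a request for a Tavily-style JSON search endpoint. The endpoint URL, payload fields, and Bearer-token auth scheme are assumptions for illustration; consult each provider's documentation for the real contract:

```python
import json
import urllib.request

# Illustrative endpoint; check the provider's docs for the real one.
TAVILY_URL = "https://api.tavily.com/search"

def build_search_request(query, api_key, max_results=5):
    """Build an HTTP POST request for a JSON search API."""
    payload = {"query": query, "max_results": max_results}
    return urllib.request.Request(
        TAVILY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
        method="POST",
    )

def run_search(req):
    """Send the request and decode the JSON response (performs a network call)."""
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```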
How is research data stored and accessed?
Research results are stored as MCP resources, accessible via URIs and integrated into the MCP client's resource panel.
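A minimal sketch of such a resource store appears below: it maps URIs to stored summaries and lists them for a client's resource panel. The research:// scheme, slug format, and in-memory dict are assumptions for illustration:

```python
class ResearchResourceStore:
    """In-memory store mapping URIs like research://<topic> to summaries."""

    SCHEME = "research://"  # assumed URI scheme

    def __init__(self):
        self._resources = {}

    def save(self, topic, summary):
        """Store a summary under a slug-style URI and return that URI."""
        uri = self.SCHEME + topic.lower().replace(" ", "-")
        self._resources[uri] = summary
        return uri

    def read(self, uri):
        """Fetch a stored summary by URI."""
        if uri not in self._resources:
            raise KeyError(f"unknown resource: {uri}")
        return self._resources[uri]

    def list_uris(self):
        """URIs to surface in an MCP client's resource panel."""
        return sorted(self._resources)
```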