mcp-server-ragdocs
This project is an MCP (Model Context Protocol) server that enhances AI tools with vector search for documentation retrieval and processing. It integrates with multiple embedding providers to support semantic search and context augmentation.
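For example, an MCP client such as Claude Desktop registers the server in its MCP settings file. The entry below is a minimal sketch: the server key, the package name passed to npx, and the environment variable names are illustrative assumptions, not values confirmed by this project.

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "npx",
      "args": ["-y", "mcp-server-ragdocs"],
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}
```

The package and variable names shown are placeholders; check the project's installation instructions for the exact values.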
What is the primary function of mcp-server-ragdocs?
The primary function is to enhance AI responses by integrating relevant documentation through vector search, enabling context-aware AI systems.
What are the supported embedding providers?
The server supports local embedding generation with Ollama as well as hosted embedding generation through OpenAI.
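Provider selection is typically controlled through environment variables. The snippet below is a hedged sketch: the variable names (EMBEDDING_PROVIDER, OLLAMA_URL, OPENAI_API_KEY) are assumptions and may differ from the project's actual configuration keys.

```sh
# Assumed variable names -- verify against the project's configuration reference.

# Local embedding generation via Ollama (default API port 11434):
export EMBEDDING_PROVIDER=ollama
export OLLAMA_URL=http://localhost:11434

# Hosted embedding generation via OpenAI instead:
# export EMBEDDING_PROVIDER=openai
# export OPENAI_API_KEY=sk-...
```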
How can I deploy mcp-server-ragdocs locally?
You can deploy it locally with Docker Compose, which starts the Qdrant vector database and the Ollama service used for local embedding generation.
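A local stack along those lines can be described in a docker-compose.yml similar to the sketch below. This is a minimal illustration rather than the project's actual Compose file; the images, ports, and volume paths reflect standard Qdrant and Ollama defaults.

```yaml
# Minimal sketch of a local stack; the project's own docker-compose.yml may differ.
services:
  qdrant:
    image: qdrant/qdrant            # vector database
    ports:
      - "6333:6333"                 # Qdrant HTTP API (default port)
    volumes:
      - qdrant_data:/qdrant/storage # persist collections across restarts
  ollama:
    image: ollama/ollama            # local embedding / LLM runtime
    ports:
      - "11434:11434"               # Ollama API (default port)
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded models

volumes:
  qdrant_data:
  ollama_data:
```

After `docker compose up -d`, pull an embedding model into Ollama (for example, `ollama pull nomic-embed-text`) before indexing documentation.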
Can I use mcp-server-ragdocs with cloud services?
Yes, you can use hosted Qdrant Cloud services for production deployments by setting the appropriate environment variables.
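For a hosted setup, the server is pointed at the Qdrant Cloud cluster instead of the local container. The variable names below (QDRANT_URL, QDRANT_API_KEY) are assumptions consistent with common Qdrant client configuration; verify them against the project's documentation.

```sh
# Assumed variable names -- verify against the project's documentation.
export QDRANT_URL=https://YOUR-CLUSTER-ID.REGION.cloud.qdrant.io:6333
export QDRANT_API_KEY=YOUR_QDRANT_CLOUD_API_KEY

# Embeddings can still come from a local Ollama instance or from OpenAI,
# as configured above.
```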
What is the license for mcp-server-ragdocs?
MCP-server-ragdocs is licensed under the MIT License, allowing free use, modification, and distribution.