# Claude AI Documentation Assistant

Claude AI Documentation Assistant is a powerful MCP server that enhances Claude with advanced documentation search capabilities.
## Features

- **Smart Documentation Search** - Search across multiple AI/ML library documentation sources
- **Claude Integration** - Seamless connection with Claude's advanced reasoning capabilities
- **Intelligent Web Search** - Uses the Serper API for targeted documentation lookup
- **Fast Response Times** - Optimized for quick retrieval and processing
- **Extendable Architecture** - Easily add more documentation sources
## Prerequisites

- Python 3.8 or higher
- Claude Pro subscription
- Serper API key (available from serper.dev)
- Claude Desktop application
## Quick Start

### 1. Installation

```bash
# Clone the repository
git clone https://github.com/your-username/claude-docs-assistant.git
cd claude-docs-assistant

# Create a virtual environment (recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```
### 2. Configuration

Create a `.env` file in the project root with your API keys:

```
SERPER_API_KEY=your_serper_api_key_here
```
### 3. Start the MCP Server

```bash
python main.py
```
You should see output indicating the server is running and waiting for Claude to connect.
### 4. Connect the Claude Desktop App

- Open the Claude Desktop app
- Click on your profile icon and select "Settings"
- Navigate to the "Tools" section
- Click "Add Tool"
- Select "Connect to a local tool"
- Follow the prompts to connect to your running MCP server
- Confirm the connection is successful
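In current Claude Desktop builds, local MCP servers are usually registered through a `claude_desktop_config.json` file (Settings → Developer → Edit Config) rather than a tools menu. If the steps above don't match your version, a config entry along these lines should work (the path is a placeholder for your own checkout):

```json
{
  "mcpServers": {
    "claude-docs-assistant": {
      "command": "python",
      "args": ["/absolute/path/to/claude-docs-assistant/main.py"],
      "env": {
        "SERPER_API_KEY": "your_serper_api_key_here"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file so the server is picked up.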
## Using Your Claude Documentation Assistant

Once connected, you can ask Claude questions that trigger the documentation search. For example:

> Could you explain how to use FAISS with LangChain? Please search the langchain documentation to help me.
Claude will automatically use your MCP server to:

- Search for relevant documentation
- Retrieve the content
- Process and explain the information
## Under the Hood

### Code Structure

```
claude-docs-assistant/
├── main.py            # MCP server implementation
├── requirements.txt   # Project dependencies
├── .env               # Environment variables (API keys)
└── README.md          # This documentation
```
### Supported Libraries

The assistant currently supports searching documentation for:

- **LangChain**: python.langchain.com/docs
- **LlamaIndex**: docs.llamaindex.ai/en/stable
- **OpenAI**: platform.openai.com/docs
### How It Works

- The MCP server exposes a `get_docs` tool to Claude
- When invoked, the tool searches for documentation using the Serper API
- Results are scraped for their content
- Content is returned to Claude for analysis and explanation
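The query-building and result-extraction steps can be sketched as small pure helpers. This is a sketch rather than the server's actual code: it assumes Serper's JSON response carries an `organic` list of results, and the function names are illustrative.

```python
import json
from typing import List

# Illustrative mapping of library names to their documentation sites
docs_urls = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "openai": "platform.openai.com/docs",
}

def build_search_query(library: str, query: str) -> str:
    """Restrict a web search to one library's documentation with a site: filter."""
    if library not in docs_urls:
        raise ValueError(f"unsupported library: {library}")
    return f"site:{docs_urls[library]} {query}"

def extract_result_links(serper_json: str, limit: int = 2) -> List[str]:
    """Pull the result URLs out of a raw Serper response body."""
    data = json.loads(serper_json)
    return [item["link"] for item in data.get("organic", [])[:limit]]
```

Each returned URL is then fetched, stripped down to its text, and handed back to Claude as the tool result.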
## Advanced Configuration

### Adding New Documentation Sources

Extend the `docs_urls` dictionary in `main.py`:

```python
docs_urls = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "openai": "platform.openai.com/docs",
    "huggingface": "huggingface.co/docs",        # add new documentation sources
    "tensorflow": "www.tensorflow.org/api_docs",
}
```
### Customizing Search Behavior

Modify the `search_web` function to adjust the number of results:

```python
payload = json.dumps({"q": query, "num": 5})  # increase from the default of 2
```
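If `search_web` wraps the Serper HTTP API directly, the request can be assembled like this. The endpoint and `X-API-KEY` header follow Serper's documented interface; the helper name is illustrative and not taken from `main.py`.

```python
import json

SERPER_URL = "https://google.serper.dev/search"

def build_serper_request(query: str, api_key: str, num_results: int = 2):
    """Assemble the POST body and headers for a Serper search request."""
    payload = json.dumps({"q": query, "num": num_results})
    headers = {
        "X-API-KEY": api_key,
        "Content-Type": "application/json",
    }
    return payload, headers
```

The pair can then be sent with any HTTP client, e.g. `httpx.post(SERPER_URL, headers=headers, data=payload, timeout=30.0)`, checking for a non-200 status before parsing.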
## Troubleshooting

### Common Issues

- **"Connection refused" error**: Ensure the MCP server is running before connecting Claude
- **Timeout errors**: Check your internet connection or increase the timeout value
- **API key issues**: Verify your Serper API key is set correctly in the `.env` file
### Debugging Tips

Add more detailed logging by modifying `main.py`:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
```
## Performance Optimization

- For faster response times, consider caching frequently accessed documentation
- Limit the amount of text returned to Claude to stay within token limits
- Use more specific queries to retrieve more relevant documentation
## Contributing

Contributions are welcome! Here's how you can help:

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgements
- Anthropic for creating Claude
- Serper.dev for their search API
- All the open-source libraries that make this project possible