langgraph_mcp
An MCP (Model Context Protocol) server that provides access to LangGraph documentation through vector store-based retrieval, using Ollama embeddings for document vectorization. It supports both semantic search via a query tool and full documentation access via a resource.
LangGraph Documentation MCP Server
This project implements a Model Context Protocol (MCP) server that provides access to LangGraph documentation through a vector store-based retrieval system. The implementation is based on the MCP From Scratch tutorial and has been updated to use Ollama for embeddings.
Features
- Vector store-based document retrieval using SKLearnVectorStore
- Ollama embeddings for document vectorization
- MCP server implementation with FastMCP
- Document loading and processing from LangGraph documentation
- Support for both tool-based queries and full documentation access
Prerequisites
- Python 3.12+
- Ollama installed and running locally (default port: 11434)
- Required Python packages (see Installation below): langchain_community, langchain-anthropic, langchain_ollama, scikit-learn, bs4, pandas, pyarrow, matplotlib, lxml, langgraph, tiktoken, mcp[cli]
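After installing the dependencies below, you can confirm that Ollama is reachable and the embedding model responds with a short check. This is a minimal sketch; it assumes the nomic-embed-text model has already been pulled (ollama pull nomic-embed-text):

# Minimal sketch: verify the local Ollama server and embedding model are available.
# Assumes `ollama pull nomic-embed-text` has already been run.
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="nomic-embed-text")
vector = embeddings.embed_query("What is LangGraph?")
print(f"Embedding dimension: {len(vector)}")  # a non-empty vector means Ollama is up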
Installation
- Clone the repository
- Create and activate a virtual environment:
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
- Install dependencies:
pip install langchain_community langchain-anthropic langchain_ollama scikit-learn bs4 pandas pyarrow matplotlib lxml langgraph tiktoken "mcp[cli]"
Project Structure
- langgraph_mcp.py: Main MCP server implementation
- build-tool.ipynb: Jupyter notebook for building and testing the vector store
- llms_full.txt: Generated documentation file
- sklearn_vectorstore.parquet: Vector store file
Usage
- First, build the vector store using the Jupyter notebook:
jupyter notebook build-tool.ipynb
- Run the MCP server:
python langgraph_mcp.py
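The server itself follows the standard FastMCP pattern from the mcp Python SDK. A minimal skeleton of what langgraph_mcp.py looks like is shown below; it is illustrative, and the actual file may differ in naming and transport:

# Minimal FastMCP server skeleton (illustrative, not the project's exact code).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("LangGraph-Docs-MCP-Server")

# @mcp.tool() and @mcp.resource() definitions go here (see Available Tools below)

if __name__ == "__main__":
    # stdio transport lets MCP clients (e.g. Claude Desktop) spawn the server as a subprocess
    mcp.run(transport="stdio")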
Available Tools
langgraph_query_tool
A tool that queries the LangGraph documentation using semantic search:
@mcp.tool()
def langgraph_query_tool(query: str):
    """
    Query the LangGraph documentation using a retriever.

    Args:
        query (str): The query to search the documentation with

    Returns:
        str: A str of the retrieved documents
    """
Full Documentation Access
Access the complete LangGraph documentation through the resource endpoint:
@mcp.resource("docs://langgraph/full")
def get_all_langgraph_docs() -> str:
    """
    Get all the LangGraph documentation.
    """
Implementation Details
The project uses:
- Ollama embeddings with the nomic-embed-text model
- SKLearnVectorStore for document storage and retrieval
- BeautifulSoup for HTML parsing
- RecursiveUrlLoader for documentation scraping
- RecursiveCharacterTextSplitter for document chunking
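Putting those components together, the vector-store build step in build-tool.ipynb roughly follows the shape below. The starting URL, crawl depth, chunk sizes, and HTML extraction function are illustrative assumptions, not the notebook's exact values:

# Illustrative build pipeline combining the components listed above (assumptions noted inline).
import re
from bs4 import BeautifulSoup
from langchain_community.document_loaders import RecursiveUrlLoader
from langchain_community.vectorstores import SKLearnVectorStore
from langchain_ollama import OllamaEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter  # langchain-text-splitters package

def bs4_extractor(html: str) -> str:
    # Strip HTML tags and collapse blank lines with BeautifulSoup
    soup = BeautifulSoup(html, "lxml")
    return re.sub(r"\n\n+", "\n\n", soup.text).strip()

# Crawl the LangGraph documentation (URL and depth are assumptions)
docs = RecursiveUrlLoader(
    "https://langchain-ai.github.io/langgraph/", max_depth=3, extractor=bs4_extractor
).load()

# Chunk the documents (chunk sizes are assumptions)
splits = RecursiveCharacterTextSplitter(chunk_size=8000, chunk_overlap=500).split_documents(docs)

# Embed with Ollama and persist the vector store next to the server script
SKLearnVectorStore.from_documents(
    documents=splits,
    embedding=OllamaEmbeddings(model="nomic-embed-text"),
    persist_path="sklearn_vectorstore.parquet",
    serializer="parquet",
).persist()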
Credits
This implementation is based on the MCP From Scratch tutorial.
License
This project is open source and available under the MIT License.