# FastAPI MCP Server + LangChain Client Example

Example project demonstrating how to expose FastAPI endpoints as Model Context Protocol (MCP) tools using `fastapi-mcp`. It includes a basic LangChain agent that connects to the local FastAPI server over HTTP/SSE via `langchain-mcp-adapters` to discover and call the exposed tools, for learning and testing purposes.
## Prerequisites

- Python 3.10 or higher
- `uv` package manager
- Node.js and npm
- Git
- OpenAI API key
## Getting Started / How to Run

1. Clone the repository
2. Install `uv`
3. Set up the environment and install dependencies
4. Create a `.env` file with your OpenAI API key
5. Run the FastAPI MCP server
6. Run the LangChain client
7. Optional: test the server with the MCP Inspector
8. Optional: test `greet_user` with the LangChain client
## Goal

To build a simple FastAPI server with MCP capabilities for learning and testing purposes, runnable locally and connectable from MCP clients such as Cursor.
## Project Setup

- `uv` is used for package management; dependencies are installed with `uv` as well.
## Application

A simple FastAPI app with two endpoints: `/` returns a welcome message and `/greet/{name}` returns a personalized greeting. The `fastapi-mcp` integration exposes these endpoints as MCP tools under `/mcp`.
## Running the Server

Run the server directly, or with a debugger using the `launch.json` configuration in `.vscode`.
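A debugger configuration along these lines would work; the module path `main:app`, the port, and the use of `uvicorn` as the launch module are assumptions to adjust to the project's actual layout:

```json
// .vscode/launch.json — sketch; "main:app" and the port are assumptions
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "FastAPI MCP Server",
      "type": "debugpy",
      "request": "launch",
      "module": "uvicorn",
      "args": ["main:app", "--reload", "--port", "8000"]
    }
  ]
}
```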
## LangChain Client

Connects to the FastAPI server, discovers the exposed tools, and runs queries using an OpenAI LLM.
## Cursor Integration

Configure Cursor to point at the local MCP server, then use the Cursor agent to interact with the discovered tools.
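Cursor reads MCP server definitions from an `mcp.json` file; a sketch of such an entry is below. The server name and port are assumptions, and the URL must match wherever the FastAPI server is actually listening:

```json
// .cursor/mcp.json — sketch; server name and port are assumptions
{
  "mcpServers": {
    "fastapi-mcp": {
      "url": "http://127.0.0.1:8000/mcp"
    }
  }
}
```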
## Key Learnings

- `uv` command basics.
- `fastapi-mcp` initialization order.
- Handling port conflicts and MCP configuration in Cursor.
- Tool invocation by the Cursor agent.