
FastAPI MCP Server + LangChain Client Example

Example project demonstrating how to expose FastAPI endpoints as Model Context Protocol (MCP) tools using FastAPI-MCP. Includes a basic LangChain agent that connects to the local FastAPI server via HTTP/SSE using langchain-mcp-adapters to discover and use the exposed tools.

Prerequisites

  • Python 3.10 or higher
  • uv package manager
  • Node.js and npm (for the optional MCP Inspector)
  • Git
  • OpenAI API Key

Getting Started / How to Run

  1. Clone the Repository
  2. Install uv
  3. Set up Environment & Install Dependencies
  4. Create .env File with OpenAI API Key
  5. Run the FastAPI MCP Server
  6. Run the LangChain Client
  7. Optional: Test Server with MCP Inspector
  8. Optional: Test greet_user with LangChain Client
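The numbered steps above might look like the following on the command line. The repository URL, server module name (`main`), and client script name (`client.py`) are placeholders, not taken from this README; adjust them to match the actual project layout.

```shell
# 1-2. Clone the repository and install uv
git clone <repo-url>
cd sse-mcp-and-langchain-client-example
curl -LsSf https://astral.sh/uv/install.sh | sh

# 3. Create the virtual environment and install dependencies
uv sync

# 4. Provide the OpenAI API key via a .env file
echo 'OPENAI_API_KEY=sk-...' > .env

# 5. Start the FastAPI MCP server (module name "main" is an assumption)
uv run uvicorn main:app --reload --port 8000

# 6. In a second terminal, run the LangChain client (script name assumed)
uv run python client.py

# 7. Optional: inspect the MCP endpoint with the MCP Inspector (needs Node.js)
npx @modelcontextprotocol/inspector
```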

Goal

To build a simple FastAPI server with MCP capabilities for learning and testing purposes, runnable locally and connectable from MCP clients like Cursor.

Project Setup

  • uv is used for environment and package management.
  • Dependencies are installed with uv (typically `uv sync`).

Application

A simple FastAPI app with two endpoints: `/` returns a welcome message and `/greet/{name}` returns a personalized greeting. The fastapi-mcp integration exposes these endpoints as MCP tools under `/mcp`.

Running the Server

Run the server directly, or launch it under the VS Code debugger using the launch.json configuration in .vscode.
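A minimal `.vscode/launch.json` for debugging the server might look like this (the `main:app` module path and port 8000 are assumptions, not taken from this README):

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "FastAPI MCP Server",
      "type": "debugpy",
      "request": "launch",
      "module": "uvicorn",
      "args": ["main:app", "--reload", "--port", "8000"],
      "envFile": "${workspaceFolder}/.env"
    }
  ]
}
```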

LangChain Client

The client connects to the FastAPI server over SSE, discovers the exposed MCP tools, and answers queries by invoking them through an OpenAI LLM.
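A sketch of such a client using `MultiServerMCPClient` from langchain-mcp-adapters together with a LangGraph ReAct agent; the server URL, model name, and prompt are assumptions:

```python
import asyncio

# Assumed server location; adjust host/port to match your FastAPI app.
SERVER_CONFIG = {
    "fastapi": {
        "url": "http://localhost:8000/mcp",
        "transport": "sse",
    }
}


async def main() -> None:
    # Third-party imports live inside main() so the config above can be
    # inspected without the packages installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient(SERVER_CONFIG)
    tools = await client.get_tools()  # discover the MCP tools over SSE
    agent = create_react_agent("openai:gpt-4o-mini", tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user",
                       "content": "Greet Alice using the greet_user tool."}]}
    )
    print(result["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```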

Cursor Integration

Register the local MCP server in Cursor's configuration so the Cursor agent can discover and invoke the exposed tools.
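An `mcp.json` entry for Cursor might look like the following (the server name is arbitrary, and the URL assumes the server's default host and port):

```json
{
  "mcpServers": {
    "fastapi-mcp-example": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
```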

Key Learnings

  • Basics of the uv CLI.
  • fastapi-mcp initialization order matters: mount the MCP server after the routes it should expose are defined.
  • Diagnosing port conflicts and configuring MCP servers in Cursor.
  • Watching the Cursor agent discover and invoke the exposed tools.