# MCP Starter Project

## What is MCP?
The Model Context Protocol (MCP) is a standard for building AI applications that can interact with external tools and APIs. It consists of two main components:
- **MCP Server**: A Python service that defines and exposes tools/functions that can be called by AI models
- **MCP Client**: A TypeScript/JavaScript client that connects to the MCP server and manages interactions between AI models and tools
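Conceptually, a server registers named tools and a client lists and invokes them. The sketch below is a hand-rolled toy in Python to illustrate that relationship only — it is not the actual MCP SDK API, and the tool name `get_docs` is hypothetical:

```python
# Toy illustration of the server/tool relationship (NOT the real MCP SDK API):
# a server exposes named tools; a client lists them and invokes them by name.
from typing import Callable, Dict

class ToyToolServer:
    """Minimal stand-in for an MCP server: registers, lists, and dispatches tools."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[..., str]] = {}

    def tool(self, name: str):
        """Decorator that registers a function as a callable tool."""
        def register(fn: Callable[..., str]) -> Callable[..., str]:
            self._tools[name] = fn
            return fn
        return register

    def list_tools(self) -> list:
        return sorted(self._tools)

    def call_tool(self, name: str, **kwargs) -> str:
        return self._tools[name](**kwargs)

server = ToyToolServer()

@server.tool("get_docs")  # hypothetical tool name for illustration
def get_docs(query: str, library: str) -> str:
    # A real server would perform a web search here; this just echoes its inputs.
    return f"results for '{query}' in {library} docs"

print(server.list_tools())                                        # ['get_docs']
print(server.call_tool("get_docs", query="RAG", library="langchain"))
```

In the real protocol this registration and dispatch happens over a transport (e.g. stdio) between two processes, which is what the client and server in this repo set up.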
## Project Structure

```
mcp_starter/
├── mcp-server/           # Python MCP server implementation
│   ├── main.py           # Server with documentation search tool
│   └── pyproject.toml    # Python dependencies
└── mcp-clients/          # TypeScript MCP client implementation
    ├── index.ts          # Express server with HuggingFace integration
    └── package.json      # Node.js dependencies
```
## Getting Started

### Prerequisites
- Python 3.11 or higher
- Node.js 18 or higher
- Hugging Face API key
- Serper API key for Google Search functionality
### Setting Up the Server

1. Create a Python virtual environment and activate it:

   ```bash
   cd mcp-server
   python -m venv .venv
   # On Windows
   .venv\Scripts\activate
   # On macOS/Linux
   source .venv/bin/activate
   ```

2. Install dependencies:

   ```bash
   pip install -e .
   ```

3. Create a `.env` file in the `mcp-server` directory:

   ```
   SERPER_API_KEY=your_serper_api_key_here
   ```
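A minimal sketch of how the server might read this key at startup, assuming plain `os.environ` access (the project may instead load the `.env` file with a helper such as python-dotenv); `require_api_key` is a hypothetical helper, not code from `main.py`:

```python
import os

def require_api_key(name: str) -> str:
    """Read a required API key from the environment, failing fast with a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# Stand-in for the value that would normally come from the .env file:
os.environ["SERPER_API_KEY"] = "demo-key"
print(require_api_key("SERPER_API_KEY"))   # demo-key
```

Failing fast like this surfaces a missing key at startup rather than as an opaque error on the first search request.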
### Setting Up the Client

1. Install Node.js dependencies:

   ```bash
   cd mcp-clients
   npm install
   ```

2. Create a `.env` file in the `mcp-clients` directory:

   ```
   HUGGINGFACE_API_KEY=your_huggingface_api_key_here
   ```

3. Build the TypeScript code:

   ```bash
   npm run build
   ```
## Running the Application

1. Start the MCP server:

   ```bash
   cd mcp-server
   python main.py
   ```

2. In a new terminal, start the client server:

   ```bash
   cd mcp-clients
   node build/index.js ../mcp-server/main.py
   ```
## Using the API

The client exposes two endpoints:

- **Health Check**: `GET http://localhost:3000/health`
- **Chat**: `POST http://localhost:3000/chat`

Example chat request:

```json
{
  "query": "Search the langchain docs for RAG",
  "sessionId": "user123"
}
```
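For illustration, the request above can be built with Python's standard library; `build_chat_request` is a hypothetical helper, and the URL assumes the default port 3000 shown above:

```python
import json
import urllib.request

def build_chat_request(query: str, session_id: str) -> urllib.request.Request:
    """Build a POST request for the client's /chat endpoint."""
    body = json.dumps({"query": query, "sessionId": session_id}).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:3000/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Search the langchain docs for RAG", "user123")
print(req.get_method())   # POST

# With both servers running, send it with:
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read()))
```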
## Features

- **Documentation Search Tool**: Search documentation for popular AI libraries:
  - LangChain
  - LlamaIndex
  - OpenAI
- **Conversation Management**: Maintains chat history per session
- **Tool Integration**: Seamlessly integrates AI model responses with tool calls
- **Error Handling**: Robust error handling for API calls and tool execution
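To illustrate the per-session conversation management, here is a minimal Python sketch (the real client keeps this state in TypeScript; `SessionStore` is a hypothetical stand-in, keyed by the same `sessionId` field used in `/chat` requests):

```python
from collections import defaultdict

class SessionStore:
    """Keeps an ordered message history per sessionId."""

    def __init__(self) -> None:
        self._histories = defaultdict(list)

    def append(self, session_id: str, role: str, content: str) -> None:
        self._histories[session_id].append({"role": role, "content": content})

    def history(self, session_id: str) -> list:
        # Return a copy so callers can't mutate the stored history.
        return list(self._histories[session_id])

store = SessionStore()
store.append("user123", "user", "Search the langchain docs for RAG")
store.append("user123", "assistant", "Here are the top results...")
print(len(store.history("user123")))   # 2
print(store.history("someone-else"))   # []
```

Keying the history by `sessionId` is what lets two users (or two browser tabs) hold independent conversations against the same client process.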
## How It Works

1. The MCP server defines tools that can be called by AI models
2. The client connects to the MCP server and retrieves available tools
3. When a user sends a query, the client:
   - Formats the conversation history
   - Sends it to the Hugging Face model
   - Extracts and executes tool calls from the model's response
   - Returns the final response, including tool results
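The query-handling steps above can be sketched end to end in Python with stand-ins for the model and the MCP server (`fake_model`, `run_tool`, and `handle_query` are all hypothetical illustrations, not the project's actual code):

```python
def fake_model(messages):
    """Stand-in for the Hugging Face model: emits a tool call for search-style queries."""
    last = messages[-1]["content"]
    if "docs" in last:
        return {"tool_call": {"name": "get_docs", "arguments": {"query": last}}}
    return {"content": "No tool needed."}

def run_tool(name, arguments):
    """Stand-in for dispatching a tool call to the MCP server."""
    return f"[{name}] results for: {arguments['query']}"

def handle_query(history, query):
    # 1. Format the conversation history with the new user message.
    messages = history + [{"role": "user", "content": query}]
    # 2. Send it to the model.
    reply = fake_model(messages)
    # 3. Extract and execute any tool call from the model's response.
    if "tool_call" in reply:
        call = reply["tool_call"]
        tool_result = run_tool(call["name"], call["arguments"])
        # 4. Return the final response including the tool results.
        return {"answer": tool_result, "usedTool": call["name"]}
    return {"answer": reply["content"], "usedTool": None}

result = handle_query([], "Search the langchain docs for RAG")
print(result["usedTool"])   # get_docs
```

The real loop differs mainly in that the model call is a network request to Hugging Face and the tool call travels over the MCP stdio transport to the Python server.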
## Environment Variables

### Server

- `SERPER_API_KEY`: API key for Google Search functionality

### Client

- `HUGGINGFACE_API_KEY`: API key for accessing Hugging Face models
## License
MIT License