MCP Knowledge Base Assistant
A demonstration of the Model Context Protocol (MCP) that connects an OpenAI-powered client to a knowledge base server. This project showcases how to build a simple but powerful AI assistant that can answer questions about company policies by accessing a knowledge base through MCP.
Overview
This project demonstrates:
- How to build an MCP server that exposes a knowledge base as a tool
- How to create an MCP client that connects to the server
- How to integrate OpenAI's API to create a natural language interface
- How to use Docker to containerize the server component
The system allows users to ask questions in natural language about company policies, and the AI will retrieve relevant information from the knowledge base to provide accurate answers.
Architecture
The project follows the MCP client-host-server architecture:
```
┌──────────────────┐      ┌──────────────────┐      ┌──────────────────┐
│                  │      │                  │      │                  │
│   OpenAI Model   │◄────►│    MCP Client    │◄────►│    MCP Server    │
│     (GPT-4o)     │      │   (client.py)    │      │   (server.py)    │
│                  │      │                  │      │                  │
└──────────────────┘      └──────────────────┘      └──────────────────┘
                                                             │
                                                             ▼
                                                    ┌──────────────────┐
                                                    │                  │
                                                    │  Knowledge Base  │
                                                    │    (kb.json)     │
                                                    │                  │
                                                    └──────────────────┘
```
- MCP Server: Exposes the knowledge base as a tool that can be queried
- MCP Client: Connects to the server and integrates with OpenAI's API
- OpenAI Model: Processes natural language queries and generates responses
- Knowledge Base: JSON file containing Q&A pairs about company policies
Getting Started
Prerequisites
- Python 3.11 or higher
- Docker (optional, for containerized server)
- OpenAI API key
Installation
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd MCP-Get-Started
   ```

2. Create a virtual environment and install dependencies:

   ```bash
   python -m venv venv

   # On Windows
   venv\Scripts\activate

   # On macOS/Linux
   source venv/bin/activate

   pip install -r requirements.txt
   ```

3. Create a `.env` file in the project root with your OpenAI API key:

   ```
   OPENAI_API_KEY=your_openai_api_key_here
   ```
Running the Server
Option 1: Run directly with Python

```bash
python server.py
```

Option 2: Run with Docker

```bash
# Build the Docker image
docker build -t mcp-server .

# Run the container
docker run -p 8050:8050 mcp-server
```
Running the Client
With the server running, open a new terminal and run:
```bash
python client.py
```
Usage

The client will connect to the server and ask a sample question about the company's equal opportunity policy. You can modify the query in `client.py` to ask different questions about company policies.
Example output:
```
Connected to server with tools:
- get_knowledge_base: Retrieve the entire knowledge base as a formatted string.

Query: What is the company's equal opportunity policy?

Response: The company's equal opportunity policy is as follows: The company is an equal opportunity employer and prohibits discrimination based on race, gender, age, religion, disability, or any other protected characteristic.
```
Project Structure

- `server.py`: MCP server implementation that exposes the knowledge base
- `client.py`: MCP client that connects to the server and integrates with OpenAI
- `data/kb.json`: Knowledge base containing company policy Q&A pairs
- `Dockerfile`: Configuration for containerizing the server
- `requirements.txt`: Python dependencies
How It Works

MCP Server (server.py)

The server exposes a single tool called `get_knowledge_base` that retrieves information from the knowledge base file (`data/kb.json`). It runs over the SSE (Server-Sent Events) transport on port 8050.
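To make the tool concrete, here is a minimal sketch of the logic such a `get_knowledge_base` tool might run. It assumes (this is not shown in the source) that `kb.json` holds a flat list of `{"question": ..., "answer": ...}` objects; the real schema in `data/kb.json` may differ.

```python
# Sketch of the logic behind a get_knowledge_base-style tool.
# Assumption: kb.json is a list of {"question": ..., "answer": ...} objects.
import json
from pathlib import Path

def format_knowledge_base(path: str) -> str:
    """Load Q&A pairs from a JSON file and render them as one string."""
    pairs = json.loads(Path(path).read_text(encoding="utf-8"))
    lines = []
    for i, pair in enumerate(pairs, start=1):
        lines.append(f"Q{i}: {pair['question']}")
        lines.append(f"A{i}: {pair['answer']}")
        lines.append("")  # blank line between entries
    return "\n".join(lines).strip()
```

In `server.py` a function like this would sit behind the `@mcp.tool()` decorator; the body itself needs nothing beyond the standard library.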
MCP Client (client.py)
The client:
- Connects to the MCP server using SSE transport
- Retrieves the list of available tools
- Takes a natural language query from the user
- Sends the query to OpenAI along with the available tools
- If OpenAI decides to use a tool, the client executes the tool call
- Returns the final response to the user
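One detail in the steps above worth seeing in code is how MCP tool metadata is handed to OpenAI. The helper below is a hedged sketch (`to_openai_tool` is a hypothetical name, not taken from `client.py`): MCP tools expose a name, a description, and a JSON-Schema input schema, which map directly onto OpenAI's function-tool format.

```python
# Hedged sketch: wrapping one MCP tool's metadata in the "tools" entry
# shape expected by OpenAI's chat completions API.
def to_openai_tool(name: str, description: str, input_schema: dict) -> dict:
    """Convert MCP tool metadata into an OpenAI function-tool definition."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": input_schema,  # MCP's inputSchema is already JSON Schema
        },
    }
```

The client would apply this to every tool returned by the server's tool listing before passing the resulting list to the OpenAI API.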
Knowledge Base (data/kb.json)
The knowledge base is a simple JSON file containing question-answer pairs about company policies. This can be extended with additional policies or information as needed.
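The exact schema of `data/kb.json` is not shown here; a plausible shape, assuming a flat list of question-answer objects, would be:

```json
[
  {
    "question": "What is the company's equal opportunity policy?",
    "answer": "The company is an equal opportunity employer and prohibits discrimination based on race, gender, age, religion, disability, or any other protected characteristic."
  },
  {
    "question": "What is the remote work policy?",
    "answer": "..."
  }
]
```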
Lifecycle Management
The client implements proper lifecycle management using Python's async context managers:
```python
async with MCPOpenAIClient() as client:
    await client.connect_to_server()
    response = await client.process_query("What is the policy?")
# Resources are automatically cleaned up when exiting the context
```
This ensures that all resources are properly initialized and cleaned up, following MCP best practices.
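A hypothetical skeleton of how `MCPOpenAIClient` might implement this context management, assuming it follows the common MCP client pattern of owning a `contextlib.AsyncExitStack` (the actual implementation in `client.py` may differ):

```python
# Hypothetical skeleton: async context management via AsyncExitStack.
import asyncio
from contextlib import AsyncExitStack

class MCPOpenAIClient:
    def __init__(self):
        self.exit_stack = AsyncExitStack()
        self.session = None  # set by connect_to_server() in the real client

    async def __aenter__(self):
        await self.exit_stack.__aenter__()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Unwinds everything registered on the stack, in reverse order.
        return await self.exit_stack.__aexit__(exc_type, exc, tb)

    async def connect_to_server(self):
        # The real client would enter the SSE transport and ClientSession
        # here, registering both on self.exit_stack so they close with it.
        pass
```

The point of the stack is that every resource entered during `connect_to_server` is released automatically, in reverse order, when the `async with` block exits.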
Customization
Adding More Knowledge
To expand the knowledge base, simply add more question-answer pairs to the `data/kb.json` file.
Adding More Tools
You can add more tools to the server by defining additional functions with the `@mcp.tool()` decorator in `server.py`.
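As a hedged example of such an addition, a keyword search over the knowledge base could complement `get_knowledge_base` by returning only matching entries rather than the whole file. The function name and the `kb.json` schema below are assumptions, not part of the project:

```python
# Hypothetical second tool: keyword search over the knowledge base.
# In server.py it would be decorated with @mcp.tool().
import json
from pathlib import Path

def search_knowledge_base(path: str, keyword: str) -> str:
    """Return only the Q&A pairs whose question or answer mentions keyword."""
    pairs = json.loads(Path(path).read_text(encoding="utf-8"))
    hits = [
        f"Q: {p['question']}\nA: {p['answer']}"
        for p in pairs
        if keyword.lower() in (p["question"] + " " + p["answer"]).lower()
    ]
    return "\n\n".join(hits) if hits else "No matching entries."
```

Exposing a narrower tool like this also lets the model fetch less context per call, which matters once the knowledge base grows.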
Changing the Model
To use a different OpenAI model, modify the `model` parameter in the `MCPOpenAIClient` class in `client.py`.
Learn More About MCP
The Model Context Protocol (MCP) is a standardized way for LLMs to interact with external tools and services. It provides:
- Reusability: Build a server once, use it with any MCP-compatible client
- Composability: Combine multiple servers to create complex capabilities
- Ecosystem growth: Benefit from servers created by others
For more information, visit the MCP documentation.
License