MCP Knowledge Base Assistant

A demonstration of the Model Context Protocol (MCP) that connects an OpenAI-powered client to a knowledge base server. This project showcases how to build a simple but powerful AI assistant that can answer questions about company policies by accessing a knowledge base through MCP.

šŸ“‹ Overview

This project demonstrates:

  1. How to build an MCP server that exposes a knowledge base as a tool
  2. How to create an MCP client that connects to the server
  3. How to integrate OpenAI's API to create a natural language interface
  4. How to use Docker to containerize the server component

The system allows users to ask questions in natural language about company policies, and the AI will retrieve relevant information from the knowledge base to provide accurate answers.

šŸ—ļø Architecture

The project follows the MCP client-host-server architecture:

ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”     ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”     ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│                 │     │                 │     │                 │
│  OpenAI Model   │◄────┤  MCP Client     │◄────┤  MCP Server     │
│  (GPT-4o)       │     │  (client.py)    │     │  (server.py)    │
│                 │     │                 │     │                 │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜     ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜     ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
                                                        │
                                                        ā–¼
                                               ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
                                               │                 │
                                               │  Knowledge Base │
                                               │  (kb.json)      │
                                               │                 │
                                               ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜

  • MCP Server: Exposes the knowledge base as a tool that can be queried
  • MCP Client: Connects to the server and integrates with OpenAI's API
  • OpenAI Model: Processes natural language queries and generates responses
  • Knowledge Base: JSON file containing Q&A pairs about company policies

šŸš€ Getting Started

Prerequisites

  • Python 3.11 or higher
  • Docker (optional, for containerized server)
  • OpenAI API key

Installation

  1. Clone the repository:

    git clone <repository-url>
    cd MCP-Get-Started
    
  2. Create a virtual environment and install dependencies:

    python -m venv venv
    
    # On Windows
    venv\Scripts\activate
    
    # On macOS/Linux
    source venv/bin/activate
    
    pip install -r requirements.txt
    
  3. Create a .env file in the project root with your OpenAI API key:

    OPENAI_API_KEY=your_openai_api_key_here
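
The client is expected to read this key from the environment. A minimal sketch of the loading step, assuming the python-dotenv package is used (the actual code in client.py may differ):

    # Load OPENAI_API_KEY from the .env file into the process environment
    import os

    from dotenv import load_dotenv

    load_dotenv()
    api_key = os.getenv("OPENAI_API_KEY")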
    

Running the Server

Option 1: Run directly with Python

    python server.py

Option 2: Run with Docker

    # Build the Docker image
    docker build -t mcp-server .

    # Run the container
    docker run -p 8050:8050 mcp-server

Running the Client

With the server running, open a new terminal and run:

    python client.py

šŸ“ Usage

The client will connect to the server and ask a sample question about the company's equal opportunity policy. You can modify the query in client.py to ask different questions about company policies.

Example output:

    Connected to server with tools:
      - get_knowledge_base: Retrieve the entire knowledge base as a formatted string.

    Query: What is the company's equal opportunity policy?

    Response: The company's equal opportunity policy is as follows: The company is an equal opportunity employer and prohibits discrimination based on race, gender, age, religion, disability, or any other protected characteristic.

šŸ”§ Project Structure

  • server.py: MCP server implementation that exposes the knowledge base
  • client.py: MCP client that connects to the server and integrates with OpenAI
  • data/kb.json: Knowledge base containing company policy Q&A pairs
  • Dockerfile: Configuration for containerizing the server
  • requirements.txt: Python dependencies

🧩 How It Works

MCP Server (server.py)

The server exposes a single tool called get_knowledge_base that retrieves information from the knowledge base file (data/kb.json). It runs using the SSE (Server-Sent Events) transport on port 8050.
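
The repository's server.py isn't reproduced in this README, but a minimal sketch using the MCP Python SDK's FastMCP helper, matching the tool name and port described above (the server name and host are illustrative assumptions), would look roughly like this:

    import json

    from mcp.server.fastmcp import FastMCP

    # Server name and host are assumptions; the port matches the README
    mcp = FastMCP("knowledge-base", host="0.0.0.0", port=8050)

    @mcp.tool()
    def get_knowledge_base() -> str:
        """Retrieve the entire knowledge base as a formatted string."""
        with open("data/kb.json", "r") as f:
            kb = json.load(f)
        return json.dumps(kb, indent=2)

    if __name__ == "__main__":
        # SSE transport serves the MCP endpoint over HTTP on port 8050
        mcp.run(transport="sse")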

MCP Client (client.py)

The client:

  1. Connects to the MCP server using SSE transport
  2. Retrieves the list of available tools
  3. Takes a natural language query from the user
  4. Sends the query to OpenAI along with the available tools
  5. If OpenAI decides to use a tool, the client executes the tool call
  6. Returns the final response to the user
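
A condensed sketch of that flow, assuming the MCP Python SDK's SSE client and the openai package (the helper name ask and the endpoint URL are illustrative, not taken from client.py):

    import asyncio
    import json

    from mcp import ClientSession
    from mcp.client.sse import sse_client
    from openai import AsyncOpenAI

    async def ask(query: str) -> str:
        openai_client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment
        async with sse_client("http://localhost:8050/sse") as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # Step 2: convert the server's tools to OpenAI's
                # function-calling format
                tools = (await session.list_tools()).tools
                openai_tools = [{
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description,
                        "parameters": t.inputSchema,
                    },
                } for t in tools]

                # Steps 3-4: send the query plus tool definitions to OpenAI
                messages = [{"role": "user", "content": query}]
                first = await openai_client.chat.completions.create(
                    model="gpt-4o", messages=messages, tools=openai_tools,
                )
                reply = first.choices[0].message
                if not reply.tool_calls:
                    return reply.content

                # Step 5: execute each requested tool call against the MCP server
                messages.append(reply)
                for call in reply.tool_calls:
                    result = await session.call_tool(
                        call.function.name, json.loads(call.function.arguments)
                    )
                    messages.append({
                        "role": "tool",
                        "tool_call_id": call.id,
                        "content": result.content[0].text,
                    })

                # Step 6: let the model compose the final answer
                final = await openai_client.chat.completions.create(
                    model="gpt-4o", messages=messages, tools=openai_tools,
                )
                return final.choices[0].message.content

    if __name__ == "__main__":
        print(asyncio.run(ask("What is the company's equal opportunity policy?")))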

Knowledge Base (data/kb.json)

The knowledge base is a simple JSON file containing question-answer pairs about company policies. This can be extended with additional policies or information as needed.
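
The exact schema isn't shown in this README, but based on the example output above, an entry in kb.json presumably looks something like:

    [
      {
        "question": "What is the company's equal opportunity policy?",
        "answer": "The company is an equal opportunity employer and prohibits discrimination based on race, gender, age, religion, disability, or any other protected characteristic."
      }
    ]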

šŸ”„ Lifecycle Management

The client implements proper lifecycle management using Python's async context managers:

    async with MCPOpenAIClient() as client:
        await client.connect_to_server()
        response = await client.process_query("What is the policy?")
    # Resources are automatically cleaned up when exiting the context

This ensures that all resources are properly initialized and cleaned up, following MCP best practices.
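A common way to implement this pattern (assumed here; the actual MCPOpenAIClient internals may differ) is to register the SSE streams and the session on a contextlib.AsyncExitStack so they are closed together, in reverse order of entry:

    from contextlib import AsyncExitStack

    from mcp import ClientSession
    from mcp.client.sse import sse_client

    class MCPOpenAIClient:
        def __init__(self, model: str = "gpt-4o"):
            self.model = model
            self.exit_stack = AsyncExitStack()
            self.session = None

        async def __aenter__(self):
            return self

        async def __aexit__(self, exc_type, exc, tb):
            # Closes the session first, then the underlying SSE streams
            await self.exit_stack.aclose()

        async def connect_to_server(self, url: str = "http://localhost:8050/sse"):
            read, write = await self.exit_stack.enter_async_context(sse_client(url))
            self.session = await self.exit_stack.enter_async_context(
                ClientSession(read, write)
            )
            await self.session.initialize()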

šŸ› ļø Customization

Adding More Knowledge

To expand the knowledge base, simply add more question-answer pairs to the data/kb.json file.

Adding More Tools

You can add more tools to the server by defining additional functions with the @mcp.tool() decorator in server.py.
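
For example, a hypothetical second tool (its name and behavior are illustrative, not part of this repo) would sit alongside get_knowledge_base in server.py:

    @mcp.tool()
    def count_policies() -> int:
        """Hypothetical tool: return how many Q&A pairs the knowledge base holds."""
        with open("data/kb.json", "r") as f:
            return len(json.load(f))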

Changing the Model

To use a different OpenAI model, modify the model parameter in the MCPOpenAIClient class in client.py.
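
If the model name is exposed as a constructor argument (an assumption; check how client.py actually stores it), switching models is a one-line change:

    # Hypothetical: swap GPT-4o for a smaller, cheaper model
    client = MCPOpenAIClient(model="gpt-4o-mini")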

šŸ“š Learn More About MCP

The Model Context Protocol (MCP) is a standardized way for LLMs to interact with external tools and services. It provides:

  • Reusability: Build a server once, use it with any MCP-compatible client
  • Composability: Combine multiple servers to create complex capabilities
  • Ecosystem growth: Benefit from servers created by others

For more information, visit the MCP documentation at https://modelcontextprotocol.io.

šŸ“„ License