MCP-Get-Started

A demonstration of the Model Context Protocol (MCP) that connects an OpenAI-powered client to a knowledge base server.

This project shows how to build a simple AI assistant that answers questions about company policies by querying a knowledge base through MCP. It demonstrates three pieces: an MCP server that exposes the knowledge base as a tool, an MCP client that connects to that server, and an integration with OpenAI's API that provides the natural language interface. Users ask questions in plain language, and the model retrieves the relevant information from the knowledge base to ground its answers. The architecture follows the MCP client-host-server model, with Docker used to containerize the server.

Features

  • MCP Server: Exposes the knowledge base as a tool that can be queried.
  • MCP Client: Connects to the server and integrates with OpenAI's API.
  • OpenAI Model: Processes natural language queries and generates responses.
  • Knowledge Base: JSON file containing Q&A pairs about company policies.
  • Docker Support: Containerizes the server component for easy deployment.
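
As an illustration, the knowledge base file might hold Q&A pairs like the following (the file name and the question/answer field names are assumptions, not confirmed by this repository):

```json
[
  {
    "question": "How many vacation days do employees get per year?",
    "answer": "Full-time employees receive 20 paid vacation days per year."
  },
  {
    "question": "What is the remote work policy?",
    "answer": "Employees may work remotely up to three days per week."
  }
]
```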

MCP Tools

  • get_knowledge_base: Retrieve the entire knowledge base as a formatted string.
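
A minimal sketch of the logic such a tool might wrap: load the JSON file and format its Q&A pairs into one string. The function name and schema here are assumptions for illustration; with the official mcp Python SDK, a function like this could be registered as a tool via a decorator.

```python
import json


def format_knowledge_base(path):
    """Load Q&A pairs from a JSON file and format them as one string.

    Assumes the knowledge base is a JSON list of objects with
    "question" and "answer" keys; the real schema may differ.
    """
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    parts = []
    for i, entry in enumerate(entries, 1):
        parts.append(f"Q{i}: {entry['question']}\nA{i}: {entry['answer']}")
    return "\n\n".join(parts)
```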

Usage with Different Platforms

Running the Server with Python

bash
python server.py

Running the Server with Docker

bash
# Build the Docker image
docker build -t mcp-server .

# Run the container
docker run -p 8050:8050 mcp-server
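
The build step above assumes a Dockerfile in the repository root. A sketch of what it might contain (the base image, requirements.txt name, and port are assumptions):

```dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8050
CMD ["python", "server.py"]
```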

Running the Client

bash
python client.py
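
A hedged sketch of the client's core step: the knowledge-base text returned by the MCP tool is injected into the chat context before the question is sent to OpenAI. The helper name and prompt wording here are illustrative, not the repository's actual code.

```python
def build_messages(user_question, kb_text):
    """Combine the MCP tool result with the user's question into an
    OpenAI-style chat message list (a sketch; the real client may differ)."""
    return [
        {
            "role": "system",
            "content": (
                "Answer using only the company knowledge base below.\n\n"
                + kb_text
            ),
        },
        {"role": "user", "content": user_question},
    ]
```

The resulting list is what a chat-completion call would take as its `messages` argument, so the model's answer is grounded in the retrieved knowledge base rather than its training data alone.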