MCP-SERVER-KB


This document provides a guide on integrating OpenAI with the Model Context Protocol (MCP) to enable OpenAI to access and utilize tools provided by an MCP server.

The integration of OpenAI with MCP lets the model dynamically use tools exposed by an MCP server. The setup involves three pieces: an MCP server that exposes a knowledge-base tool, a client that connects OpenAI to that server, and the glue that lets the model invoke those tools while answering user queries. Client and server communicate over SSE transport, so they can run in separate processes. The data flow is: the user sends a query; OpenAI receives the query together with the list of available tools; the model selects the appropriate tool; and the MCP server executes it to produce the response. Throughout, MCP acts as a standardized bridge, giving AI models a consistent, secure, and flexible interface to backend systems.

Features

  • Standardized Interface: MCP provides a consistent way for AI models to interact with backend tools.
  • Abstraction: MCP abstracts the complexity of backend systems, simplifying AI integration.
  • Security: MCP allows control over what tools and data are exposed to AI models.
  • Flexibility: Backend implementations can change without affecting AI integration.
  • Dynamic Tool Usage: OpenAI can dynamically select and use tools based on user queries.
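For dynamic tool usage, the tools listed by the MCP server must be handed to OpenAI in its function-calling `tools` schema. A minimal conversion might look like this (the helper name is hypothetical; the dict shape is OpenAI's standard tool format, with an empty parameter schema because `get_knowledge_base` takes no arguments):

```python
import json

def mcp_tool_to_openai(name: str, description: str) -> dict:
    """Convert an MCP tool listing into the OpenAI function-calling
    tool schema so the model can select it at inference time."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": {"type": "object", "properties": {}, "required": []},
        },
    }

tools = [mcp_tool_to_openai(
    "get_knowledge_base",
    "Retrieves Q&A pairs from a JSON file about company policies.",
)]
print(json.dumps(tools, indent=2))
```

The resulting list is passed as the `tools` argument of a chat completion request, and the model replies with a tool call when it decides the knowledge base is needed.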

MCP Tools

  • get_knowledge_base: Retrieves Q&A pairs from a JSON file about company policies.
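A plausible sketch of the tool body, assuming the knowledge base is a JSON list of question/answer objects (the repo's actual file path and JSON layout may differ):

```python
import json
import tempfile
from pathlib import Path

def get_knowledge_base(path: str) -> str:
    """Load Q&A pairs from a JSON file and format them as plain text
    that can be returned to the model."""
    pairs = json.loads(Path(path).read_text())
    return "\n\n".join(f"Q: {p['question']}\nA: {p['answer']}" for p in pairs)

# Demo with a throwaway file:
sample = [{"question": "How many vacation days do employees get?",
           "answer": "20 days per year."}]
kb_path = Path(tempfile.mkdtemp()) / "kb.json"
kb_path.write_text(json.dumps(sample))
print(get_knowledge_base(str(kb_path)))
```

In the actual server this function would be registered as an MCP tool so clients can discover and invoke it over SSE.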

Usage with Different Platforms

Server

docker build -t ashujss11/mcp-server .
docker run -p 8050:8050 -d --name mcp-server ashujss11/mcp-server

Client

python client.py