sample-mcp-server-client

This document provides a summary of a Model Context Protocol (MCP) server setup and usage.

The Model Context Protocol (MCP) server facilitates communication between clients and machine learning models, combining large language models (LLMs) with MCP. The server can run standalone or be launched by the client, providing flexibility in deployment. It integrates with OpenAI-compatible model APIs and is currently configured to use the 'qwen3:8b' model. To minimize unnecessary processing, the system excludes 'thinking tokens' by default, though this can be re-enabled for more complex reasoning tasks. The server runs in a Conda environment and requires the Python dependencies listed in the requirements file.
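
As a rough illustration, a request to an OpenAI-compatible endpoint serving 'qwen3:8b' could be built as sketched below. This is an assumption about the wire format, not the project's actual client code; the function name and message contents are illustrative.

```python
import json

def build_chat_request(user_message: str, model: str = "qwen3:8b") -> str:
    # Hypothetical sketch: the JSON payload a client might send to an
    # OpenAI-compatible chat-completions endpoint. Field values are
    # illustrative assumptions, not taken from this project's source.
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "stream": False,
    }
    return json.dumps(payload)

request_body = build_chat_request("List the available MCP tools.")
```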

Features

  • Flexible Deployment: The server can be run independently or started by the client.
  • Model Integration: Supports integration with OpenAI API and other model servers.
  • Optimized Processing: Configured to exclude 'thinking tokens' for faster response times.
  • Conda Environment: Utilizes a Conda environment for easy setup and dependency management.
  • Customizable Reasoning: System prompts can be adjusted to include or exclude reasoning processes.
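
For Qwen3 models, one common way to suppress thinking tokens is a prompt-level soft switch. The sketch below assumes the `/no_think` convention documented for Qwen3; whether this project uses that exact mechanism (as opposed to an API flag) is an assumption.

```python
def build_messages(user_message: str, enable_thinking: bool = False) -> list:
    # Qwen3 supports a soft switch in the prompt: appending "/no_think"
    # asks the model to skip its reasoning ("thinking") output.
    # This exact mechanism is an assumption about this project's setup.
    content = user_message if enable_thinking else f"{user_message} /no_think"
    return [{"role": "user", "content": content}]
```

Flipping `enable_thinking` back on is how one would adjust the system for more complex reasoning tasks.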

Setup and Usage

conda

```bash
conda create -n mcp_env python=3.10
conda activate mcp_env
```

pip

```bash
pip install -r requirements.txt
```

python_client

```bash
python src/client/mcp_client.py
```
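
Under the hood, an MCP client and server exchange JSON-RPC 2.0 messages over a transport such as stdio, starting with an `initialize` handshake. A minimal, dependency-free sketch of that first request; the protocol version string and client name here are illustrative values, not taken from this project:

```python
import itertools
import json

_ids = itertools.count(1)

def make_initialize_request() -> str:
    # First message in an MCP session: the client announces its protocol
    # version, capabilities, and identity. Values are illustrative.
    msg = {
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "mcp_client", "version": "0.1"},
        },
    }
    return json.dumps(msg)
```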

python_server

```bash
python src/server/mcp_server.py
```
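
Conceptually, the server's job is to register tools and dispatch incoming tool calls to them. A simplified, dependency-free sketch of that registration-and-dispatch pattern; the real server presumably uses an MCP SDK, and the `echo` tool here is hypothetical:

```python
from typing import Callable, Dict

TOOLS: Dict[str, Callable[..., str]] = {}

def tool(fn: Callable[..., str]) -> Callable[..., str]:
    # Register a function as a callable tool under its own name.
    TOOLS[fn.__name__] = fn
    return fn

@tool
def echo(text: str) -> str:
    """Hypothetical example tool: return the input text unchanged."""
    return text

def call_tool(name: str, **kwargs) -> str:
    # Dispatch a tool invocation by name, roughly what the server does
    # when it receives a "tools/call" request from the client.
    return TOOLS[name](**kwargs)
```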