MCP + Ollama Local Tool Calling Example

This project demonstrates how a fully local, offline AI agent can understand user queries and automatically call Python functions, without hand-coded routing, using:

  • Model Context Protocol (MCP)
  • Ollama for running a local LLM (e.g., Llama3)
  • Python MCP Client and Server

Project Structure

  • math_server.py: MCP server exposing add() and multiply() tools (sketched below)
  • ollama_client.py: MCP client that bridges the server's tools to Ollama
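
The repository's server code is not reproduced here, but a minimal math_server.py built on the MCP Python SDK's FastMCP helper could look like the following sketch. The server name "Math" and the stdio transport are assumptions; the add() and multiply() tool names come from the list above:

```python
# math_server.py -- minimal MCP server sketch (assumes the `mcp` PyPI package)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")  # server name is illustrative

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

if __name__ == "__main__":
    # Serve over stdio so a local client process can spawn and talk to it.
    mcp.run(transport="stdio")
```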

Setup Instructions

  1. Install the requirements, e.g. `pip install mcp ollama`.
  2. Pull a local model with Ollama, e.g. `ollama pull llama3`.
  3. Run the MCP server: `python math_server.py`.
  4. Run the MCP client: `python ollama_client.py`.
  5. Interact! Queries such as "What is 7 plus 5?" are matched to the right tool and answered without any hand-coded routing, entirely on your machine.

How It Works

  • The MCP client lists the tools available on the server.
  • It sends the tool schemas plus the user query to the Ollama LLM.
  • The LLM reasons about the best-matching tool.
  • The LLM generates a tool_call.
  • The MCP client invokes the chosen function via the MCP server.
  • The final result is returned and displayed (see the end-to-end sketch below).
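
The following sketch shows how ollama_client.py might implement this loop with the MCP Python SDK and the `ollama` package. The model name llama3, the spawn command, and the sample query are illustrative, and the exact shape of Ollama's tool-call response can vary between library versions:

```python
# ollama_client.py -- illustrative client sketch (assumes `mcp` and `ollama` packages)
import asyncio

import ollama
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Spawn the math server as a subprocess and talk to it over stdio.
SERVER = StdioServerParameters(command="python", args=["math_server.py"])

async def run(query: str) -> None:
    async with stdio_client(SERVER) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Step 1: list the tools the MCP server exposes.
            listing = await session.list_tools()

            # Step 2: translate MCP tool metadata into Ollama's tool schema.
            tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": t.inputSchema,
                    },
                }
                for t in listing.tools
            ]

            # Step 3: send the tool schemas plus the user query to the local LLM.
            response = ollama.chat(
                model="llama3",
                messages=[{"role": "user", "content": query}],
                tools=tools,
            )

            # Steps 4-5: if the model emitted tool calls, invoke each via MCP.
            for call in response.message.tool_calls or []:
                result = await session.call_tool(
                    call.function.name, dict(call.function.arguments)
                )
                # Step 6: display the tool's result.
                print(f"{call.function.name} -> {result.content[0].text}")

if __name__ == "__main__":
    asyncio.run(run("What is 7 plus 5?"))  # sample query, purely illustrative
```

Because the client spawns the server over stdio, everything stays in one local process tree and no network access is needed at any point.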

Why This Matters

This pattern enables building smart local AI agents that understand user intent, dynamically select the right tool, and run entirely offline.