simple-mcp-server
A hybrid-architecture MCP server that follows the official Anthropic Model Context Protocol specification. It includes a middleware layer for client communication and uses Ollama's Gemma model for inference, providing a robust environment for managing conversation contexts.
Anthropic Model Context Protocol (MCP) Server with Ollama Integration
A hybrid architecture that combines an Anthropic-compatible Model Context Protocol (MCP) server with Ollama-hosted Gemma models for inference.
Components
- MCP Server: Implements the Anthropic Model Context Protocol, listening on port 3000.
- Middleware: Facilitates communication between clients and Ollama (a sketch follows this list).
- Ollama: Runs the Gemma3:4b model for inference.
- Database: SQLite database for storing conversation contexts.
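To make the middleware's role concrete, below is a minimal sketch of how it might forward a prompt to Ollama. The function name, host, and error handling are assumptions rather than this project's actual code; /api/generate with a model/prompt/stream payload is Ollama's standard generation endpoint.

```python
# Minimal sketch: how the middleware might forward a prompt to Ollama.
# The URL below is Ollama's default local endpoint; the function name
# and error handling are illustrative assumptions.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def generate(prompt: str, model: str = "gemma3:4b") -> str:
    """Send a prompt to Ollama and return the completed text."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    # Non-streaming replies carry the full completion in the "response" field.
    return resp.json()["response"]
```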
Features
- Tools for context management and resources that expose conversation history (an illustrative tool definition follows this list).
- Compatible with any client supporting the Model Context Protocol.
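As an illustration of what a context-management tool could look like, the sketch below shows one entry such a server might return from tools/list. The tool name and schema fields are hypothetical; the {name, description, inputSchema} shape is what the MCP specification defines.

```python
# Hypothetical tool definition; only the overall shape is mandated by MCP.
save_context_tool = {
    "name": "save_context",  # hypothetical tool name
    "description": "Persist a message to the SQLite conversation store.",
    "inputSchema": {         # standard JSON Schema, per the MCP spec
        "type": "object",
        "properties": {
            "conversation_id": {"type": "string"},
            "content": {"type": "string"},
        },
        "required": ["conversation_id", "content"],
    },
}
```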
Protocol Compliance
Adheres strictly to the official MCP specification, including JSON-RPC 2.0 messaging, the protocol initialization handshake, the tool interface, and resource management.
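As a concrete example of the wire format, the sketch below shows the initialization handshake as Python dictionaries. The clientInfo values are placeholders and the protocolVersion shown is one published MCP revision (the server may negotiate another); the message shapes follow the MCP specification.

```python
# The MCP initialize handshake as JSON-RPC 2.0 messages.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # one published MCP revision
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# After the server's initialize response, the client confirms readiness:
initialized_notification = {"jsonrpc": "2.0", "method": "notifications/initialized"}
```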
Requirements
- Python 3.10+
- Docker and Docker Compose
- Ollama with the Gemma3:4b model installed.
Installation
Setup is supported either through the provided setup script or through a manual Docker Compose configuration.
Usage Example
The server can be used through the middleware or directly from any MCP-compatible client connected to the MCP server; a hedged sketch follows.
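For illustration, the following sketch invokes a tool over HTTP against port 3000. The request path and the tool name are assumptions (the actual endpoint depends on this server's transport); the tools/call method and its params shape come from the MCP specification.

```python
# Hedged usage sketch: calling a tool on the MCP server via JSON-RPC over HTTP.
import requests

call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",  # standard MCP method for invoking a tool
    "params": {
        "name": "save_context",  # hypothetical tool from the Features sketch
        "arguments": {"conversation_id": "demo", "content": "Hello, Gemma!"},
    },
}

# The "/" path is an assumption; consult the server for its actual endpoint.
resp = requests.post("http://localhost:3000/", json=call, timeout=30)
print(resp.json())  # JSON-RPC result or error object
```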