Ollama MCP Server
An MCP server for seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.
Features
- List available Ollama models
- Pull new models from Ollama
- Chat with models
- Get detailed model information
- Automatic port management
- Environment variable configuration
Prerequisites
- Node.js v16+
- npm
- Ollama installed locally
Installation
Install the server via npm, or register it in an MCP-compatible client such as Claude Desktop or Cline.
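As a sketch, registering the server in Claude Desktop means adding an entry to its `claude_desktop_config.json`; the package name `ollama-mcp` is assumed from the project title and may differ:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["ollama-mcp"]
    }
  }
}
```

After editing the config, restart Claude Desktop so it picks up the new server entry.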
Usage
Start the server on its default port, or pass a specific one. The server port and the Ollama API endpoint can be configured through environment variables.
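For illustration, a configured launch might look like the following; the variable names `PORT` and `OLLAMA_API` are assumptions, not confirmed by the project, so check its documentation for the actual names:

```shell
# Assumed variable names -- verify against the project's documentation
export PORT=3456                          # port the MCP server listens on
export OLLAMA_API=http://127.0.0.1:11434  # endpoint of the local Ollama API
# Then start the server (assumed package name):
#   npx ollama-mcp
```

Ollama serves its API on port 11434 by default, so the endpoint above matches a stock local install.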
API Endpoints
- List models
- Pull new models
- Chat with a model
- Get model details
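Since MCP servers expose operations like these as tools invoked over JSON-RPC (`tools/call` in the MCP specification), a chat request from a client would resemble the following; the tool name `chat` and its argument shape are assumptions based on the feature list above:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "chat",
    "arguments": {
      "model": "llama3",
      "messages": [{ "role": "user", "content": "Hello!" }]
    }
  }
}
```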
Contributing
Contributions are welcome. The project is licensed under AGPL-3.0, whose network copyleft requires modified versions offered as a service to publish their source, discouraging unauthorized closed-source commercial use.