
OtterBridge

OtterBridge Logo

OtterBridge is a lightweight, flexible server for connecting applications to various Large Language Model providers. Following the principles of simplicity and composability outlined in Anthropic's guide to building effective agents, OtterBridge provides a clean interface to LLMs while maintaining adaptability for different use cases.

It currently supports Ollama, with planned expansion to other providers such as ChatGPT and Claude.

File Structure

├── .env.example           # Example environment variables
├── .gitignore             # Files to exclude from git
├── LICENSE                # Open source license (MIT, Apache, etc.)
├── README.md              # Project documentation
├── server.py              # MCP server implementation (previously fastmcp_server.py)
├── requirements.txt       # Python dependencies
└── src/                   # Source code directory
    ├── __init__.py        # Package initialization
    └── services/          # Services
        ├── __init__.py
        └── ollama.py      # Ollama service

Features

  • Provider-Agnostic: Designed to work with multiple LLM providers (currently Ollama, with ChatGPT and Claude coming soon)
  • Simple, Composable Design: Following best practices for LLM agent architecture
  • Lightweight Server: Built with FastMCP for reliable, efficient server implementation
  • Model Management: Easy access to model information and capabilities

Why "OtterBridge"?

Like otters who build connections between riverbanks, OtterBridge creates seamless pathways between your applications and various LLM providers. Just as otters are adaptable and resourceful, OtterBridge adapts to different LLM backends while providing consistent interfaces.

Prerequisites

Before installing OtterBridge, you need to have:

  1. Ollama installed and running on the default port (see the quick check below)
  2. uv installed for Python package management
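
To confirm Ollama is reachable on its default port (11434), you can query its version endpoint (a quick check that is not part of OtterBridge itself):

curl http://localhost:11434/api/version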

Installation

  1. Clone this repository:
git clone https://github.com/yourusername/otterbridge.git
cd otterbridge
  2. Install dependencies using uv:
uv add -r requirements.txt
  3. Create a .env file based on the provided .env.example:
cp .env.example .env
  4. Configure your environment variables in the .env file.

Claude Desktop Integration

For Claude Desktop users, you'll need to add OtterBridge to your Claude Desktop configuration:

  1. Open your Claude Desktop config file (claude_desktop_config.json)
  2. Add the following entry under the top-level "mcpServers" key (adjust the path to match your local installation):

"mcpServers": {
    "otterbridge": {
        "command": "uv",
        "args": [
            "--directory",
            "C:\\Path\\To\\Your\\otterbridge",
            "run",
            "server.py"
        ]
    }
}

Usage

Starting the Server

OtterBridge can be started in two ways:

  1. Manual start, for testing purposes:
uv run server.py
  2. Automatic start with MCP clients:
    • When using compatible MCP clients such as Claude Desktop, OtterBridge starts automatically when needed

Available Tools

OtterBridge exposes the following tools via the Model Context Protocol (MCP); a sketch of how they might be registered follows the list:

  • chat: Send messages to LLMs and get AI-generated responses
  • list_models: Retrieve information about available language models
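
The sketch below shows one way these tools might be registered in server.py with FastMCP (the decorator API from the official MCP Python SDK); the ollama helper functions are hypothetical stand-ins for code in src/services/ollama.py:

from mcp.server.fastmcp import FastMCP

from src.services import ollama  # OtterBridge's Ollama service module

mcp = FastMCP("otterbridge")

@mcp.tool()
def list_models() -> dict:
    """Retrieve information about available language models."""
    return ollama.list_models()  # hypothetical helper, for illustration

@mcp.tool()
def chat(message: str) -> dict:
    """Send a message to the LLM and return its reply."""
    return ollama.chat(message)  # hypothetical helper, for illustration

if __name__ == "__main__":
    mcp.run()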

Tool Usage Examples

List Available Models

Example response:

{
    "status": "connected",
    "server_status": "online",
    "available_models": ["llama3", "llama3.1:8b", "codellama", "llama3.3", "qwen2.5"],
    "available_models_count": 5,
    "message": "Successfully retrieved available Ollama models"
}
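
For illustration, here is one way src/services/ollama.py might assemble that response, assuming the requests library and Ollama's documented GET /api/tags endpoint; the list_models helper name is an assumption:

import os

import requests

OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")

def list_models() -> dict:
    # Ollama's /api/tags endpoint lists locally available models.
    resp = requests.get(f"{OLLAMA_BASE_URL}/api/tags", timeout=10)
    resp.raise_for_status()
    names = [m["name"] for m in resp.json().get("models", [])]
    return {
        "status": "connected",
        "server_status": "online",
        "available_models": names,
        "available_models_count": len(names),
        "message": "Successfully retrieved available Ollama models",
    }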

Chat Completion

Example response:

{
    "role": "assistant",
    "content": "I'm doing well, thank you for asking! I'm here and ready to help you with any questions or tasks you might have. How can I assist you today?",
    "model": "llama3:latest"
}
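
Likewise, the chat response could be produced with Ollama's documented POST /api/chat endpoint; this non-streaming sketch again assumes requests, and the chat helper name is an assumption:

import os

import requests

OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
DEFAULT_MODEL = os.getenv("DEFAULT_MODEL", "llama3.3")

def chat(message: str, model: str | None = None) -> dict:
    payload = {
        "model": model or DEFAULT_MODEL,
        "messages": [{"role": "user", "content": message}],
        "stream": False,  # return a single JSON object rather than a stream
    }
    resp = requests.post(f"{OLLAMA_BASE_URL}/api/chat", json=payload, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    return {
        "role": data["message"]["role"],
        "content": data["message"]["content"],
        "model": data.get("model", payload["model"]),
    }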

Configuration

OtterBridge can be configured using environment variables:

| Variable | Description | Default |
| --- | --- | --- |
| OLLAMA_BASE_URL | URL of the Ollama server | http://localhost:11434 |
| DEFAULT_MODEL | Default model to use | llama3.3 |
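
A sketch of how these variables might be loaded at startup, assuming the python-dotenv package (an assumption; check requirements.txt) is used to read the .env file:

import os

from dotenv import load_dotenv

load_dotenv()  # read variables from the local .env file into the environment
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
DEFAULT_MODEL = os.getenv("DEFAULT_MODEL", "llama3.3")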

Roadmap

  • Q2 2025: Support for ChatGPT API integration
  • Q3 2025: Support for Claude API integration

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Development Guidelines

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the terms described in the LICENSE file.
