OtterBridge

OtterBridge is a lightweight, flexible server for connecting applications to various Large Language Model providers. It provides a clean interface to LLMs while maintaining adaptability for different use cases.

Features

  • Provider-Agnostic: Designed to work with multiple LLM providers behind a single interface.
  • Simple, Composable Design: Follows best practices for LLM agent architecture.
  • Lightweight Server: Built on FastMCP for a reliable, efficient server implementation.
  • Model Management: Easy access to model information and capabilities.
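The provider-agnostic design above could be sketched roughly as follows. This is a minimal illustration, not OtterBridge's actual API: the names `LLMProvider`, `OllamaProvider`, and `get_provider` are assumptions, and the Ollama calls are stubbed out.

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Common interface that each LLM backend implements."""

    @abstractmethod
    def chat(self, prompt: str) -> str: ...

    @abstractmethod
    def list_models(self) -> list[str]: ...


class OllamaProvider(LLMProvider):
    """Stub standing in for HTTP calls to a local Ollama server."""

    def __init__(self, base_url: str = "http://localhost:11434"):
        self.base_url = base_url

    def chat(self, prompt: str) -> str:
        # A real implementation would POST to {base_url}/api/chat.
        return f"[ollama] {prompt}"

    def list_models(self) -> list[str]:
        # A real implementation would GET {base_url}/api/tags.
        return ["llama3"]


# Registry of available backends; new providers plug in here.
PROVIDERS = {"ollama": OllamaProvider}


def get_provider(name: str) -> LLMProvider:
    """Instantiate a provider by name, keeping callers backend-agnostic."""
    return PROVIDERS[name]()
```

Because callers only depend on the `LLMProvider` interface, adding a new backend (see the Roadmap) would mean registering one more class rather than changing client code.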

Prerequisites

  • Ollama installed and running
  • uv for Python package management

Installation

  1. Clone the repository
  2. Install dependencies using uv
  3. Create a .env file
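In practice the steps above might look like the following; the repository URL is a placeholder, since this README does not specify it:

```shell
# 1. Clone the repository (replace <repository-url> with the actual URL)
git clone <repository-url>
cd otterbridge

# 2. Install dependencies with uv
uv sync

# 3. Create a .env file (see Configuration for the expected variables)
touch .env
```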

Usage

Starting the Server
  • Manual start: run the server directly when testing.
  • Automatic start: MCP clients can launch the server on demand.
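For automatic start, an MCP client entry might look like the sketch below (shown in the `mcpServers` format used by clients such as Claude Desktop). The server name, command, and script path are assumptions, not taken from this README:

```json
{
  "mcpServers": {
    "otterbridge": {
      "command": "uv",
      "args": ["run", "python", "server.py"]
    }
  }
}
```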

Configuration

OtterBridge is configured through environment variables in the .env file, which specify the Ollama server URL and the default model to use.
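A sample .env might look like this; the variable names are illustrative, since the README does not spell them out:

```
# URL of the running Ollama server (assumed variable name)
OLLAMA_URL=http://localhost:11434
# Model used when a request does not specify one (assumed variable name)
DEFAULT_MODEL=llama3
```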

Roadmap

  • Q2 2025: Support for ChatGPT API integration
  • Q3 2025: Support for Claude API integration

License

Licensed under the .