# Poe o3 MCP Server
A lightweight Model Context Protocol (MCP) server implementation that provides access to OpenAI's o3 model and other models via Poe's API.
## Features
- Simple MCP server implementation using FastMCP
- Direct integration with Poe's API
- Model selection via flags in prompts
- Asynchronous request handling
- Comprehensive error handling and logging
- Easy setup and configuration
## Prerequisites
- Python 3.8+
- A Poe API key
## Usage

### Running the MCP Server
The server will start and listen for MCP protocol messages on standard input/output.
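For orientation, the following is a minimal sketch of how such a server can be wired up with FastMCP and the `fastapi_poe` client. The file name `server.py`, the tool signature, and the default bot name are assumptions for illustration, not the repository's exact code.

```python
# server.py -- illustrative sketch, not the repository's actual implementation
import os

import fastapi_poe as fp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("poe-o3")


@mcp.tool()
async def o3_query(prompt: str, model: str = "o3") -> str:
    """Send a prompt to a Poe bot and return the full response text."""
    api_key = os.environ["POE_API_KEY"]
    message = fp.ProtocolMessage(role="user", content=prompt)
    chunks = []
    # fastapi_poe streams partial responses; collect them into one string.
    async for partial in fp.get_bot_response(
        messages=[message], bot_name=model, api_key=api_key
    ):
        chunks.append(partial.text)
    return "".join(chunks)


if __name__ == "__main__":
    mcp.run()  # FastMCP defaults to the stdio transport
```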
### Model Selection via Flags
Select a different model by adding a flag to your prompt. If no flag is specified, the server defaults to the o3 model.
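The exact flag syntax is defined by the server; purely as an illustration, a `--model <name>` prefix could be stripped from the prompt like this (the flag name and helper function are hypothetical):

```python
import re
from typing import Tuple

DEFAULT_MODEL = "o3"


def parse_model_flag(prompt: str) -> Tuple[str, str]:
    """Extract a leading "--model <name>" flag; fall back to the o3 default."""
    match = re.match(r"^--model\s+(\S+)\s+(.*)$", prompt, flags=re.DOTALL)
    if match:
        return match.group(1), match.group(2)
    return DEFAULT_MODEL, prompt


# parse_model_flag("--model gpt-4o Summarize this file")
#   -> ("gpt-4o", "Summarize this file")
# parse_model_flag("Summarize this file")
#   -> ("o3", "Summarize this file")
```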
### Integrating with MCP Clients

The server exposes tools such as `o3_query` for sending queries to the selected model and `ping` for testing the connection.
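One way to exercise these tools from Python is to spawn the server over stdio with the official MCP client SDK and call them by name. The launch command (`python server.py`) and the tool arguments below are assumptions for illustration.

```python
# Illustrative client check -- assumes the server starts with "python server.py".
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Verify the connection, then send a query to the default model.
            pong = await session.call_tool("ping", arguments={})
            answer = await session.call_tool("o3_query", arguments={"prompt": "Hello, o3!"})
            print(pong.content)
            print(answer.content)


asyncio.run(main())
```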
## Configuration

Environment variables include `POE_API_KEY` and `LOG_LEVEL`.
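A rough sketch of how these variables are typically consumed (exact defaults and error handling may differ in this repository):

```python
# Sketch only: reads the two documented environment variables.
import logging
import os

POE_API_KEY = os.environ.get("POE_API_KEY")          # required: your Poe API key
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")       # optional: logging verbosity

logging.basicConfig(level=getattr(logging, LOG_LEVEL.upper(), logging.INFO))
if not POE_API_KEY:
    raise RuntimeError("POE_API_KEY environment variable is not set")
```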
## Troubleshooting

If requests fail, check that your Poe API key is valid, the required dependencies are installed, and you have an active internet connection.