mcp-openai-complete
The OpenAI Complete MCP Server bridges LLM clients and OpenAI-compatible APIs via the MCP protocol. It focuses on text completion and handles asynchronous processing, timeouts, and request cancellation. It can be configured through environment variables and run directly or in Docker.
OpenAI Complete MCP Server
An MCP server that exposes LLM text completion capabilities to MCP clients. It connects LLM clients to OpenAI-compatible APIs and focuses exclusively on base (completion) model usage.
Features
- Offers a "complete" tool for text completion.
- Handles asynchronous processing efficiently.
- Implements timeouts and request cancellations gracefully, as sketched below.
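The repository's actual implementation isn't shown here, but a minimal sketch of a "complete" tool that honors cancellation might look like the following, assuming the TypeScript MCP SDK and an OpenAI-compatible completions endpoint. The environment variable names (OPENAI_API_KEY, OPENAI_API_BASE, OPENAI_MODEL) and the exact parameter set are illustrative assumptions, not this server's documented interface.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "openai-complete", version: "0.1.0" });

// Register a "complete" tool that forwards to an OpenAI-compatible
// /completions endpoint (OPENAI_API_BASE is assumed to include /v1).
server.tool(
  "complete",
  {
    prompt: z.string(),
    max_tokens: z.number().optional(),
    temperature: z.number().optional(),
  },
  async ({ prompt, max_tokens, temperature }, { signal }) => {
    // Pass the MCP cancellation signal along so that an aborted
    // request also aborts the upstream HTTP call.
    const response = await fetch(`${process.env.OPENAI_API_BASE}/completions`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      },
      body: JSON.stringify({
        model: process.env.OPENAI_MODEL,
        prompt,
        max_tokens,
        temperature,
      }),
      signal,
    });
    const data = await response.json();
    return { content: [{ type: "text", text: data.choices[0].text }] };
  }
);

// Serve over stdio so MCP clients can spawn and talk to the process.
const transport = new StdioServerTransport();
await server.connect(transport);
```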
Usage
Start the server to enable communication with MCP clients.
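How the server is launched depends on the client. As one hedged example, an MCP client that supports stdio servers (such as Claude Desktop) could be pointed at it with a configuration entry like the one below; the command, file path, and environment variable names are assumptions for illustration, not taken from this repository's documentation.

```json
{
  "mcpServers": {
    "openai-complete": {
      "command": "node",
      "args": ["/path/to/mcp-openai-complete/dist/index.js"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "OPENAI_API_BASE": "https://api.openai.com/v1",
        "OPENAI_MODEL": "gpt-3.5-turbo-instruct"
      }
    }
  }
}
```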
Docker Usage
Build the Docker image and run the container, passing configuration through environment variables or a .env file.
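For example, assuming the repository ships a standard Dockerfile and reads its settings from the environment, the image could be built and run along these lines (the image name and variable names are illustrative):

```sh
# Build the image from the repository root
docker build -t mcp-openai-complete .

# Run with variables passed on the command line
# (-i keeps stdin open, since MCP stdio servers talk over stdin/stdout)
docker run -i --rm \
  -e OPENAI_API_KEY=sk-... \
  -e OPENAI_API_BASE=https://api.openai.com/v1 \
  mcp-openai-complete

# Or run with a .env file
docker run -i --rm --env-file .env mcp-openai-complete
```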
Parameters
Includes options such as `prompt`, `max_tokens`, and `temperature`, among others, to customize completions.
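As a hedged illustration of how these parameters are passed, a client using the TypeScript MCP SDK might call the `complete` tool as shown below. The command, path, and the specific values are examples; the full parameter list and defaults are defined by the server itself.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio; command and args are illustrative.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/mcp-openai-complete/dist/index.js"],
  env: { OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "" },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Call the "complete" tool with customization options.
const result = await client.callTool({
  name: "complete",
  arguments: {
    prompt: "Once upon a time",
    max_tokens: 64,
    temperature: 0.7,
  },
});

console.log(result.content);
```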