mcp-server-openai


The OpenAI MCP Server project allows users to query OpenAI models using the Model Context Protocol. It supports multiple models and provides features such as configurable message formatting and error handling, making it adaptable for various applications.

OpenAI MCP Server

  • Description: This project enables querying OpenAI models directly through the Model Context Protocol (MCP). Notably, it adds support for the o3-mini and gpt-4o-mini models with enhanced message handling.

Features

  • Integration with OpenAI's API
  • Multiple model support: o3-mini (default) and gpt-4o-mini
  • Configurable message formatting
  • Error handling and logging

Installation

  • Install via Smithery or manually by cloning the repository and installing dependencies.
  • Configure MCP settings for Claude Desktop.
  • Obtain an OpenAI API Key and restart Claude for changes to take effect.
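The MCP settings referenced above live in Claude Desktop's JSON configuration file. A minimal sketch of an entry for this server — the server key, module path, and config filename (claude_desktop_config.json) are assumptions, so adapt them to your checkout:

```json
{
  "mcpServers": {
    "openai": {
      "command": "python",
      "args": ["-m", "mcp_server_openai"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

After saving the configuration, restart Claude Desktop so the new server is picked up.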

Usage

  • Use the ask-openai tool to query OpenAI models using the use_mcp_tool command in Claude.
  • Model Comparison: o3-mini for concise answers, gpt-4o-mini for detailed explanations.
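A query might look like the following inside Claude. This is a hedged sketch of the usual use_mcp_tool call shape; the server name "openai" and the argument names ("query", "model") are assumptions based on the tool description above:

```xml
<use_mcp_tool>
<server_name>openai</server_name>
<tool_name>ask-openai</tool_name>
<arguments>
{
  "query": "Explain recursion in one paragraph",
  "model": "o3-mini"
}
</arguments>
</use_mcp_tool>
```

Swap "o3-mini" for "gpt-4o-mini" when a more detailed explanation is preferred.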

Troubleshooting

  • Server not found: Verify that PYTHONPATH includes the project directory and that Python and pip are correctly installed.
  • Authentication errors: Verify OpenAI API key.
  • Model errors: Use supported models, check query content.
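The first two checks above can be automated with a small diagnostic. This is an illustrative sketch, not part of the project's own code; the messages mirror the troubleshooting bullets:

```python
def check_env(env: dict) -> list:
    """Return a list of likely misconfigurations for the MCP server.

    Mirrors the troubleshooting checklist: a missing API key causes
    authentication errors, and a missing PYTHONPATH can make the
    server unfindable by Claude Desktop.
    """
    problems = []
    if not env.get("OPENAI_API_KEY"):
        problems.append("OPENAI_API_KEY is not set (authentication errors)")
    if not env.get("PYTHONPATH"):
        problems.append("PYTHONPATH is not set (server may not be found)")
    return problems


# Example: an empty environment reports both problems
print(check_env({}))
```

Run it with `python -c "import os; ..."` against `os.environ` to see which checks fail on your machine.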

Development

  • Install development dependencies and run tests using pytest.
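A typical workflow for the step above — the "dev" extras name is an assumption about the project's packaging:

```shell
# Editable install with development dependencies (extras name assumed)
pip install -e ".[dev]"

# Run the test suite
pytest
```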

Changes from Original

  • Added o3-mini and gpt-4o-mini models, improved formatting, and enhanced documentation.