just-prompt
Just Prompt - A lightweight MCP server for LLM providers
just-prompt is a Model Context Protocol (MCP) server offering a unified interface to multiple Large Language Model (LLM) providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. It sends text prompts, given as strings or files, to one or more models in parallel, lists the available providers and models, and supports automated decision-making workflows. Responses can be saved to files, and the server automatically corrects model names against a list of default models.
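The core idea of a unified interface can be sketched as a small routing layer: every provider exposes the same `send` method, and a `provider:model` spec picks the backend. This is a minimal illustration, not the project's actual implementation; the class and function names here are hypothetical, and the stubs stand in for real API calls.

```python
from dataclasses import dataclass
from typing import Protocol


class Provider(Protocol):
    """Common interface every backend implements."""
    def send(self, model: str, prompt: str) -> str: ...


@dataclass
class OpenAIStub:
    def send(self, model: str, prompt: str) -> str:
        # A real implementation would call the OpenAI API here.
        return f"[openai/{model}] reply to: {prompt}"


@dataclass
class OllamaStub:
    def send(self, model: str, prompt: str) -> str:
        # A real implementation would call a local Ollama instance here.
        return f"[ollama/{model}] reply to: {prompt}"


PROVIDERS: dict[str, Provider] = {"openai": OpenAIStub(), "ollama": OllamaStub()}


def prompt(model_spec: str, text: str) -> str:
    """Route a 'provider:model' spec to the matching backend."""
    provider_name, model = model_spec.split(":", 1)
    return PROVIDERS[provider_name].send(model, text)
```

Adding a new provider then only means implementing `send` and registering it in the routing table; callers never change.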
Tools
- Prompt: Send a prompt to multiple LLM models.
- Prompt from file: Send prompts from a file.
- Prompt from file to file: Save responses as markdown files.
- CEO and board: send a prompt to several "board" models, then have a designated "CEO" model weigh their responses and issue a decision.
- List providers and models: View available options.
Features
- Unified API for multiple LLM providers.
- Support for text prompts from strings or files.
- Parallel model execution.
- Automatic adjustment of model names.
- Response file saving capability.
- Easy listing of providers and models.
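The automatic model-name correction feature can be approximated with fuzzy matching against a known model list. The list and threshold below are hypothetical placeholders, assumed only for illustration; the real server ships its own defaults and matching logic.

```python
import difflib

# Hypothetical default model list; the real server maintains its own.
DEFAULT_MODELS = ["gpt-4o", "gpt-4o-mini", "claude-3-7-sonnet", "gemini-2.0-flash", "llama3"]


def correct_model_name(requested: str, known: list[str] = DEFAULT_MODELS) -> str:
    """Return the closest known model name, or the input unchanged if nothing is close."""
    if requested in known:
        return requested
    matches = difflib.get_close_matches(requested, known, n=1, cutoff=0.6)
    return matches[0] if matches else requested
```

With this sketch, a near-miss like `"gpt4o"` resolves to `"gpt-4o"`, while a completely unknown name passes through untouched so the caller can surface an error.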