just-prompt

Just Prompt is a Model Context Protocol (MCP) server that provides a unified interface for interacting with multiple Large Language Model providers. It simplifies sending and managing prompts, supports multi-model decision-making workflows, and can save model outputs to files. The project emphasizes ease of integration and flexibility with popular LLM services.

What is Just Prompt?

Just Prompt is a lightweight MCP server that provides a unified interface to various LLM providers, allowing users to send prompts and receive responses from multiple models.

Which LLM providers are supported by Just Prompt?

Just Prompt supports OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama.

How can I send a prompt to multiple models using Just Prompt?

You can use the 'prompt' tool to send a prompt to multiple LLM models, specifying the models with provider prefixes.
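As a sketch of what such a call looks like on the wire (the argument names `text` and `models_prefixed_by_provider`, and the exact provider-prefix strings, are assumptions here; check the just-prompt README for the authoritative schema), an MCP client would issue a JSON-RPC `tools/call` request:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 request for an MCP tools/call invocation."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Send one prompt to several models at once; each model name carries a
# provider prefix (the specific prefixes below are illustrative only).
call = make_tool_call(
    request_id=1,
    tool_name="prompt",
    arguments={
        "text": "Summarize the CAP theorem in two sentences.",
        "models_prefixed_by_provider": [
            "openai:gpt-4o",
            "anthropic:claude-3-5-sonnet",
        ],
    },
)
print(json.dumps(call, indent=2))
```

The server fans the single prompt out to every listed model and returns each model's response.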

Can I save the responses from the models?

Yes. The 'prompt_from_file_to_file' tool reads a prompt from a file, sends it to the specified models, and writes each model's response to its own output file.
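The behavior can be approximated with a small helper that writes one file per model. This is a sketch only: just-prompt's actual output directory layout and filename scheme may differ, and the `save_responses` helper is hypothetical.

```python
from pathlib import Path

def save_responses(responses, output_dir):
    """Write each model's response text to its own file.

    Deriving the filename from the provider-prefixed model name is an
    assumption; ':' is replaced so the name is filesystem-safe.
    """
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    paths = []
    for model, text in responses.items():
        path = out / f"{model.replace(':', '_')}.md"
        path.write_text(text, encoding="utf-8")
        paths.append(path)
    return paths

# Example: two hypothetical model responses saved side by side.
files = save_responses(
    {"openai:gpt-4o": "Response A", "anthropic:claude-3-5-sonnet": "Response B"},
    "responses",
)
print(sorted(p.name for p in files))
```

One file per model keeps the outputs easy to diff, which is the main reason to save multi-model runs to disk in the first place.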

How do I list available models for a specific provider?

You can use the 'list_models' tool to list all available models for a specific LLM provider.
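A minimal sketch of the corresponding request, assuming the tool takes a single `provider` argument (the argument name is an assumption; consult the just-prompt README for the exact schema):

```python
import json

def make_list_models_call(request_id, provider):
    """Build a JSON-RPC 2.0 tools/call request asking the server to list
    one provider's available models."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "list_models", "arguments": {"provider": provider}},
    }

# Ask for all models the 'openai' provider currently exposes.
payload = make_list_models_call(2, "openai")
print(json.dumps(payload))
```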