Image MCP Servers

This project implements an SSE MCP server that exposes tools for interacting with HuggingFace and Replicate APIs, focusing on image generation and model management. It's built using the MCP Core framework and provides a standardized interface for AI model interactions. Below is a guide to host it yourself!

Features

HuggingFace Integration

Tool                      Description
Get Model Info            Retrieve detailed information about models on HuggingFace
Get Model Sample Images   Extract sample images from model READMEs
Get Readme                Fetch the README content for a model
Search Models             Search for models on HuggingFace with filtering options
WhoAmI                    Retrieve information about the current HuggingFace API token
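
Because this is an MCP server, each tool is invoked over the protocol with a JSON-RPC tools/call request. The sketch below shows roughly what a model-search call could look like; the tool and argument names (search_models, query, limit) are illustrative assumptions, so check the server's tools/list response for the actual schema.

    {
      "jsonrpc": "2.0",
      "id": 1,
      "method": "tools/call",
      "params": {
        "name": "search_models",
        "arguments": { "query": "stable-diffusion", "limit": 5 }
      }
    }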

Replicate Integration

Tool            Description
Generate Image  Create images using Replicate's text-to-image models
Get Model Info  Retrieve detailed information about models on Replicate
Get Prediction  Check the status and retrieve outputs of a prediction
List Models     List available models on Replicate with optional filtering
WhoAmI          Retrieve information about the current Replicate API token
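
Image generation follows the same tools/call pattern. A request might look roughly like the following; the tool name (generate_image), the argument names, and the model slug are illustrative assumptions rather than the server's confirmed schema.

    {
      "jsonrpc": "2.0",
      "id": 2,
      "method": "tools/call",
      "params": {
        "name": "generate_image",
        "arguments": {
          "model": "black-forest-labs/flux-schnell",
          "prompt": "a watercolor lighthouse at dusk"
        }
      }
    }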

Getting Started

Prerequisites

  • Rust (2024 edition)
  • Docker (optional, for containerized deployment)
  • just (optional, for the docker-up/docker-down recipes shown below)

Installation

  1. Clone the repository:

    git clone https://github.com/yourusername/image-mcp-servers.git
    cd image-mcp-servers
    
  2. Create a .env file based on the .env.example:

    cp .env.example .env
    
  3. Add your API tokens to the .env file:

    HF_API_TOKEN="your_huggingface_token"
    REPLICATE_API_TOKEN="your_replicate_token"
    
  4. Build the project (see the Cargo Features section below to enable the HuggingFace and/or Replicate integrations):

    cargo build
    

Running the Server

cargo run

The server will start on the port specified in the SERVER_PORT environment variable (default: 3000).
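
To check that the server is reachable, you can open its SSE stream directly. The endpoint path below assumes the common /sse convention for SSE MCP transports; adjust it if this server exposes a different path.

# -N disables buffering so the event stream prints as it arrives
curl -N http://localhost:3000/sse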

Using Docker

The project includes Docker support for easy deployment:

# Start the service
just docker-up service

# Stop the service
just docker-down service

Configuration

The server can be configured using environment variables:

  • SERVER_NAME: Name of the server (default: "image-mcp-servers")
  • SERVER_VERSION: Version of the server (default: "0.1.0")
  • SERVER_PORT: Port to run the server on (default: 3000)
  • HF_API_TOKEN: HuggingFace API token
  • REPLICATE_API_TOKEN: Replicate API token
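
Putting it together, a complete .env might look like this (all values are placeholders):

    SERVER_NAME="image-mcp-servers"
    SERVER_VERSION="0.1.0"
    SERVER_PORT=3000
    HF_API_TOKEN="your_huggingface_token"
    REPLICATE_API_TOKEN="your_replicate_token"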

Cargo Features

The project exposes two Cargo features so that each integration can be enabled or disabled independently:

  • huggingface: Enables HuggingFace API integration
  • replicate: Enables Replicate API integration

By default, no features are enabled. You can build with specific features:

# Build with only HuggingFace
cargo build --no-default-features --features huggingface

# Build with only Replicate
cargo build --no-default-features --features replicate
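
To build with both integrations, pass both features; Cargo accepts a space- or comma-separated list:

# Build with both HuggingFace and Replicate
cargo build --no-default-features --features "huggingface replicate"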

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgements

  • MCP Core - The framework used for building this server
  • HuggingFace - For their image generation models
  • Replicate - For their image generation API