fal.ai MCP Server
A Model Context Protocol (MCP) server for interacting with fal.ai models and services. It lets MCP clients list and search models, fetch model schemas, run models directly or through the fal.ai queue, manage queued requests, and upload files to the fal.ai CDN.
Features
- List all available fal.ai models
- Search for specific models by keywords
- Get model schemas
- Generate content using any fal.ai model
- Support for both direct and queued model execution
- Queue management (status checking, getting results, cancelling requests)
- File upload to fal.ai CDN
Requirements
- Python 3.10+
- fastmcp
- httpx
- aiofiles
- A fal.ai API key
Installation
- Clone this repository:
git clone https://github.com/am0y/mcp-fal.git
cd mcp-fal
- Install the required packages:
pip install fastmcp httpx aiofiles
- Set your fal.ai API key as an environment variable:
export FAL_KEY="YOUR_FAL_API_KEY_HERE"
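The server reads the key from this FAL_KEY environment variable. If you want to confirm the key is visible before launching anything, a quick check (not part of the repository) is:

import os

# The server expects the key in the FAL_KEY environment variable exported above.
if not os.environ.get("FAL_KEY"):
    raise SystemExit("FAL_KEY is not set - export it before running the server.")
print("FAL_KEY is set.")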
Usage
Running the Server
You can run the server in development mode with:
fastmcp dev main.py
This will launch the MCP Inspector web interface where you can test the tools interactively.
Installing in Claude Desktop
To use the server with Claude Desktop:
fastmcp install main.py -e FAL_KEY="YOUR_FAL_API_KEY_HERE"
This will make the server available to Claude in the Desktop app.
Running Directly
You can also run the server directly:
python main.py
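For reference, a FastMCP server script is usually structured as in the sketch below. This is only the general fastmcp pattern, not the actual contents of main.py, and the tool body shown is illustrative.

import os

import httpx
from fastmcp import FastMCP

mcp = FastMCP("fal.ai")

@mcp.tool()
async def status(url: str) -> dict:
    """Check the status of a queued fal.ai request (illustrative body)."""
    # fal.ai HTTP APIs expect the key in an "Authorization: Key <FAL_KEY>" header.
    headers = {"Authorization": f"Key {os.environ['FAL_KEY']}"}
    async with httpx.AsyncClient() as client:
        response = await client.get(url, headers=headers)
        response.raise_for_status()
        return response.json()

if __name__ == "__main__":
    # Running the script directly serves the tools over stdio.
    mcp.run()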
API Reference
Tools
- models(page=None, total=None) - List available models with optional pagination
- search(keywords) - Search for models by keywords
- schema(model_id) - Get OpenAPI schema for a specific model
- generate(model, parameters, queue=False) - Generate content using a model
- result(url) - Get result from a queued request
- status(url) - Check status of a queued request
- cancel(url) - Cancel a queued request
- upload(path) - Upload a file to fal.ai CDN
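To call these tools programmatically (outside Claude Desktop), you can use the stdio client from the official mcp Python SDK. The sketch below assumes the server is launched as python main.py and that the mcp package is installed; it is not part of this repository.

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server as a subprocess and forward the API key to it.
    params = StdioServerParameters(
        command="python",
        args=["main.py"],
        env={"FAL_KEY": os.environ.get("FAL_KEY", "")},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the "search" tool documented above.
            result = await session.call_tool("search", {"keywords": "stable diffusion"})
            print(result)

asyncio.run(main())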