Gemini AI MCP Server
This project integrates Google's Gemini AI models with Claude CLI via the Model Context Protocol (MCP).
Core Features
- Seamless integration with Claude CLI through the standardized MCP protocol
- Compatible with multiple Gemini models, including the 2.5 Pro preview and the 1.5 series
- Structured JSON responses for reliable parsing and integration
- Robust error handling with informative error messages
- Flexible command-line interface with optional parameters
- Simple API key configuration through environment variables
Setup
- Ensure you have the Google Generative AI Python package installed.
- Set your Google API key as an environment variable.
- Make sure the server's shell script is executable (a typical setup is sketched below).
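A typical setup might look like the following. The pip package `google-generativeai` is Google's Generative AI SDK for Python; the environment variable name and the script filename below are assumptions for illustration, so use the names this repository actually defines.

```sh
# Install Google's Generative AI SDK for Python
pip install google-generativeai

# Export your API key; the exact variable name the server reads is an
# assumption here, so check the server script for the name it expects
export GOOGLE_API_KEY="your-api-key"

# Make the launcher script executable (hypothetical filename)
chmod +x gemini_mcp_server.sh
```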
Usage with Claude CLI
- Configure Claude CLI to use this MCP server by adding the directory to your MCP search path.
- Use the Gemini command through Claude CLI.
- Optionally specify a different Gemini model (see the sketch below).
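As a rough sketch, registration and invocation might look like this. The `claude mcp add` subcommand exists in newer Claude CLI releases, but the server name `gemini`, the launcher path, and the prompt wording are all assumptions about this repository and your setup; if your CLI version discovers servers via a search path instead, follow the configuration step above.

```sh
# Register the server with Claude CLI (hypothetical name and path)
claude mcp add gemini /path/to/mcp-gemini-server/gemini_mcp_server.sh

# Ask Claude to call out to Gemini, optionally naming a specific model
claude "Use the gemini tool with model gemini-1.5-flash to summarize README.md"
```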
Testing the Server Directly
You can test the server directly without Claude CLI.
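The exact flags depend on the server's command-line interface; the invocation below is a sketch that assumes a positional prompt and an optional `--model` flag, both of which are hypothetical parameter names.

```sh
# Hypothetical script name and flags; substitute the ones this repository uses
./gemini_mcp_server.sh --model gemini-1.5-flash "Explain MCP in one sentence."

# The server replies with structured JSON, along the lines of:
# {"status": "success", "model": "gemini-1.5-flash", "response": "..."}
```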
Available Models
- gemini-2.5-pro-preview-03-25 (default)
- gemini-1.5-pro
- gemini-1.5-flash
MCP Integration
This server implements the MCP protocol for seamless integration with Claude CLI. The `mcp.json` file defines the available commands and parameters.
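As a loose illustration only, a command manifest of this kind could resemble the sketch below; the real schema, command name, and parameters are whatever `mcp.json` in this repository defines, so treat every field here as an assumption.

```jsonc
// Hypothetical sketch; consult mcp.json in this repo for the real schema
{
  "name": "gemini",
  "description": "Query a Gemini model and return a structured JSON response",
  "parameters": {
    "prompt": { "type": "string", "required": true },
    "model": { "type": "string", "required": false, "default": "gemini-2.5-pro-preview-03-25" }
  }
}
```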