# Gemini-Router

## About

This project is a working implementation of the Model Context Protocol (MCP) integrated with Google's Gemini Flash 1.5 API, designed to demonstrate how modular AI services can collaborate through a centralized router.
## What is MCP and How Does It Work Here?

Model Context Protocol (MCP) is an architectural pattern designed to enable modular, context-aware communication between multiple specialized AI agents or services. This project applies MCP principles to build a scalable, intelligent AI system with dedicated microservices, all coordinated through a central router.

Each service (or "context"), whether for chat, web search, weather info, deep reasoning, or retrieval-augmented generation (RAG), performs a distinct function and communicates via lightweight requests routed intelligently through a Router/Client.
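To make "lightweight requests" concrete, here's a minimal sketch of the kind of envelope the router could pass between services; the field names are illustrative assumptions, not the project's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class RoutedRequest:
    """Hypothetical payload a Router/Client forwards to a context service."""
    query: str                     # raw user query from the frontend
    context: str = "chat"          # target service: chat | search | rag | think
    history: list[str] = field(default_factory=list)  # optional prior turns

@dataclass
class RoutedResponse:
    """Hypothetical reply a context service sends back to the router."""
    context: str
    answer: str
```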
## Workflow Overview

Here's how the system works from input to response:

1. **User Input**: The user sends a query from the frontend UI.
2. **Routing Logic**:
   - The Router/Client receives the query and analyzes the intent.
   - Based on keywords, context, or past interactions, it selects the appropriate server (chat, search, RAG, think, etc.) to handle the query (see the dispatch sketch after this list).
3. **Service Handling**:
   - **Chat Server**: If the query is conversational or casual, it's routed here for quick responses.
   - **Search Server**: For queries that need real-time or factual data (e.g., "What's the capital of Sweden?"), the router invokes Gemini APIs via this module.
   - **RAG Server**: For complex questions needing document-based or external knowledge synthesis, the RAG module uses a retrieval layer plus Gemini for answer generation.
   - **Thinking Server**: For logical reasoning or multi-step problem solving, the query is handed over here for deep thought processing.
   - **Weather (via Search)**: The Search server also integrates with the OpenWeather API to handle natural language weather queries like "What's the weather in Tokyo?"
4. **Response Aggregation**:
   - The chosen module processes the request and sends a response back to the Router.
   - The Router can optionally combine responses from multiple modules if the query requires it (e.g., a response that includes weather plus a suggestion).
5. **Frontend Output**: The final response is returned to the user interface and presented in a clean, conversational format.
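As a rough illustration of step 2, a keyword-based dispatcher could look like the sketch below; the keyword sets and function name are assumptions for illustration, not the router's actual code:

```python
# Hypothetical intent routing: simple keyword heuristics pick a context service.
WEATHER_WORDS = {"weather", "temperature", "forecast"}
RAG_WORDS = {"document", "docs", "according to", "summarize"}
THINK_WORDS = {"why", "prove", "solve", "step by step"}
SEARCH_WORDS = {"who", "what", "when", "where", "latest", "news"}

def route(query: str) -> str:
    """Return the name of the service that should handle `query`."""
    q = query.lower()
    if any(w in q for w in WEATHER_WORDS):
        return "search"  # weather queries are handled inside the search service
    if any(w in q for w in RAG_WORDS):
        return "rag"
    if any(w in q for w in THINK_WORDS):
        return "think"
    if any(w in q for w in SEARCH_WORDS):
        return "search"
    return "chat"        # default: casual conversation

print(route("What's the weather in Tokyo?"))  # -> search
print(route("Hey, how's it going?"))          # -> chat
```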
## Features

- **Modular AI Services**: Each function (chat, search, RAG, think) runs independently as a microservice.
- **Centralized Context Router**: Handles request dispatching, context management, and coordination between services.
- **Gemini Flash 1.5**: High-performance model for reasoning, generation, and information retrieval (see the call sketch after this list).
- **OpenWeather Integration**: Real-time weather data included in the search context.
- **Dockerized Backend**: Easily deploy all services in isolation or as a single stack using Docker.
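For the Gemini piece, a minimal sketch of calling Gemini Flash 1.5 with the `google-generativeai` Python package; the README doesn't pin a client library, so treat this wiring as an assumption:

```python
import os

import google.generativeai as genai

# Uses the same GEMINI_API_KEY defined in .env
genai.configure(api_key=os.environ["GEMINI_API_KEY"])

model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content("What's the capital of Sweden?")
print(response.text)
```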
## Prerequisites

- Python 3.12+
- Node.js & npm
- Docker & Docker Compose
- Environment variables configured in `.env`:

```
GEMINI_API_KEY=<your_gemini_api_key>
OPENWEATHER_API_KEY=<your_openweather_api_key>
```
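For the weather path, a sketch of how the search service might read `OPENWEATHER_API_KEY` and call OpenWeather's standard current-weather endpoint; the helper name and return format are illustrative, not the project's actual code:

```python
import os

import requests

def current_weather(city: str) -> str:
    """Fetch current conditions for `city` from the OpenWeather REST API."""
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={
            "q": city,
            "appid": os.environ["OPENWEATHER_API_KEY"],
            "units": "metric",
        },
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    description = data["weather"][0]["description"]
    temp_c = data["main"]["temp"]
    return f"{city}: {description}, {temp_c:.1f} °C"

print(current_weather("Tokyo"))
```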
## Architecture
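A rough sketch of the flow, inferred from the workflow and project structure described in this README (an approximation, not the original diagram):

```
frontend (UI)
      │
      ▼
   router/
      ├──► chat/    (casual conversation)
      ├──► search/  (web search + OpenWeather)
      ├──► rag/     (retrieval + Gemini generation)
      └──► think/   (deep reasoning)
```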
## How to Run

1. Clone the repository:

   ```bash
   git clone https://github.com/AditHash/gemini-mcp-router.git
   cd gemini-mcp-router
   ```

2. Start the backend:

   ```bash
   cd backend
   docker-compose up --build
   ```

3. Start the frontend:

   ```bash
   npm install
   npm run dev
   ```
## Project Structure

```
backend/
│
├── chat/     # Chat server (basic conversation handling)
├── search/   # Web search & weather functionality
├── rag/      # Retrieval-Augmented Generation logic
├── think/    # Deep reasoning and logic module
├── router/   # Central request dispatcher and router
│
├── requirements.txt
└── .env      # API keys for Gemini and OpenWeather
```
## Future Plans

- Integrate the frontend into the Docker Compose setup
- Improve the UI with modern components and chat UX
- Add real-time monitoring & logs for each service
- Extend MCP with dynamic memory and agent-based reasoning
- Build support for auto-scaling and horizontal load distribution
## Notes

- Ensure valid API keys are set in `.env` for the Gemini and OpenWeather services.
- Use `docker logs <container_name>` to debug any backend issues.