# OpenWeatherMap MCP Server

This project provides a Model Context Protocol (MCP) server that connects to the OpenWeatherMap API and caches weather data in Redis/Valkey. It demonstrates how to build a modular MCP server that AI models can use to access external APIs.
## Features
- Retrieve current weather conditions for any city
- Get 5-day weather forecasts
- Query weather by geographic coordinates
- Cache weather data in Redis for improved performance
- SSE (Server-Sent Events) transport for MCP communication
- stdio (standard input/output) transport for MCP communication
## Prerequisites

- Bun (the commands below use Bun) or Node.js 18+ with npm/yarn
- Redis server running locally or accessible via URL
- OpenWeatherMap API key
## Environment Variables

Create a `.env` file in the project root with:

```env
OPENWEATHER_API_KEY=your_api_key_here
REDIS_URL=redis://localhost:6379
REDIS_HOST=localhost
REDIS_PORT=6379
PORT=10203
```
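These variables can be read and validated once at startup. A minimal sketch, assuming `REDIS_URL` takes precedence over the `REDIS_HOST`/`REDIS_PORT` pair (the `loadConfig` helper and its defaults are illustrative, not code from this repo):

```typescript
// Illustrative config loader; variable names match the .env above, but this
// helper is a sketch, not part of the repository.
interface Config {
  apiKey: string;
  redisUrl: string;
  port: number;
}

export function loadConfig(env: Record<string, string | undefined>): Config {
  const apiKey = env.OPENWEATHER_API_KEY;
  if (!apiKey) {
    throw new Error("OPENWEATHER_API_KEY is required");
  }
  return {
    apiKey,
    // Assumption: a full REDIS_URL wins over the HOST/PORT pair.
    redisUrl:
      env.REDIS_URL ??
      `redis://${env.REDIS_HOST ?? "localhost"}:${env.REDIS_PORT ?? "6379"}`,
    port: Number(env.PORT ?? "10203"),
  };
}
```

In the server this would be called once as `loadConfig(process.env)` before creating the Redis client and HTTP listener.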
## Installation

```bash
# Install dependencies
bun install

# Build TypeScript
bun run build

# Start the SSE server
bun start:sse

# Start the Stdio server
bun start:stdio

# Start the MCP Inspector for debugging
bun start:stdio
```
## Project Structure

- `openweather-mcp-server.ts`: Core MCP server with weather tools and Redis caching
- `sse-server.ts`: Express server implementing SSE transport for MCP
- `mcp-client.ts`: Example client to test the MCP server
## Available Tools

The MCP server provides the following tools:

- `get-current-weather`: Get current weather for a city
  - Parameters: `city` (string)
- `get-weather-forecast`: Get the 5-day forecast for a city
  - Parameters: `city` (string)
- `get-weather-by-coordinates`: Get weather using latitude/longitude
  - Parameters: `lat` (number), `lon` (number)
- `clear-weather-cache`: Clear cached weather data
  - Parameters: `city` (string, optional); if not provided, clears all cache
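On the wire, each tool invocation is an MCP `tools/call` request carried over JSON-RPC 2.0. A sketch of the message shape for `get-current-weather` (the `id` value is arbitrary):

```typescript
// Shape of an MCP tools/call request (JSON-RPC 2.0) for get-current-weather.
// Field names follow the MCP specification; the id is chosen by the client.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get-current-weather",
    arguments: { city: "Paris" },
  },
};

console.log(JSON.stringify(request));
```

With the SSE transport, a client sends messages like this over HTTP while responses stream back on the SSE connection.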
## Integration with AI Models
This MCP server is designed to be used with AI models that support the Model Context Protocol. By connecting to this server, AI models can:
- Access real-time weather data
- Query forecasts
- Provide location-specific weather information
### Example Usage (with Claude or a similar model)

Once the server is running, you can instruct an AI model to use it:

> "Please connect to my weather MCP server at http://localhost:10203/sse and tell me the current weather in Paris."

The AI would then:

1. Connect to the MCP server
2. Call the `get-current-weather` tool with `city: "Paris"`
3. Return the formatted weather information
## Extending the Server

To add new functionality:

1. Add new tools to `openweather-mcp-server.ts` using the `mcpServer.tool()` method
2. Update the server info in `sse-server.ts` to document your new tool
3. Implement any required helper functions and caching logic
## Redis Caching

The server uses Redis to cache weather data with a 30-minute TTL (time to live). This reduces API calls to OpenWeatherMap and improves response times.

To clear the cache manually, use the `clear-weather-cache` tool.
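The pattern is cache-aside: check Redis first, fall back to the API on a miss, then store the result with a TTL. A minimal sketch against a generic client interface; the `weather:<city>` key scheme and the `Cache` interface are assumptions for illustration, while the `set(key, value, { EX })` shape mirrors node-redis v4:

```typescript
// Cache-aside sketch; the 30-minute TTL matches the README.
interface Cache {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, opts: { EX: number }): Promise<unknown>;
}

const TTL_SECONDS = 30 * 60; // 30-minute TTL

export async function getCachedWeather(
  cache: Cache,
  city: string,
  fetchWeather: (city: string) => Promise<unknown>,
): Promise<unknown> {
  const key = `weather:${city.toLowerCase()}`; // assumed key scheme
  const hit = await cache.get(key);
  if (hit !== null) return JSON.parse(hit); // cache hit: skip the API call
  const fresh = await fetchWeather(city);   // cache miss: call OpenWeatherMap
  await cache.set(key, JSON.stringify(fresh), { EX: TTL_SECONDS });
  return fresh;
}
```

Because the function only depends on the small `Cache` interface, it works with a real node-redis client or an in-memory fake in tests.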