# MCP Wolfram Alpha (Server + Client)

Seamlessly integrate Wolfram Alpha into your chat applications.
This project implements an MCP (Model Context Protocol) server designed to interface with the Wolfram Alpha API. It enables chat-based applications to perform computational queries and retrieve structured knowledge, facilitating advanced conversational capabilities.
Included is an MCP-Client example utilizing Gemini via LangChain, demonstrating how to connect large language models to the MCP server for real-time interactions with Wolfram Alpha’s knowledge engine.
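For context, the round trip the server performs against Wolfram|Alpha looks roughly like the sketch below. This is a minimal illustration using the public Wolfram|Alpha LLM API endpoint and the `requests` library; the endpoint URL and parameter names come from Wolfram's public API documentation, not from this repository's code.

```python
import os

import requests

# Minimal sketch: query the public Wolfram|Alpha LLM API directly.
# Endpoint and parameters follow Wolfram's public API docs, not this
# repository's actual implementation.
APPID = os.environ["WOLFRAM_API_KEY"]  # same variable as in .env below

def ask_wolfram(query: str) -> str:
    resp = requests.get(
        "https://www.wolframalpha.com/api/v1/llm-api",
        params={"input": query, "appid": APPID},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.text  # plain-text answer formatted for LLM consumption

print(ask_wolfram("integrate x^2 sin(x) dx"))
```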
## Features

- **Wolfram|Alpha Integration** for math, science, and data queries.
- **Modular Architecture**: easily extendable to support additional APIs and functionalities.
- **Multi-Client Support**: seamlessly handles interactions from multiple clients or interfaces.
- **MCP-Client example** using Gemini (via LangChain).
- **UI Support**: a Gradio web interface for interacting with Google AI (Gemini) and the Wolfram Alpha MCP server.
## Installation

### Clone the Repo

```bash
git clone https://github.com/ricocf/mcp-wolframalpha.git
cd mcp-wolframalpha
```
### Set Up Environment Variables

Create a `.env` file based on the example:

- `WOLFRAM_API_KEY=your_wolframalpha_appid`
- `GeminiAPI=your_google_gemini_api_key` (optional; required only for the client examples below)
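If you are wiring these variables up in your own code, a typical pattern looks like the sketch below. It assumes the `python-dotenv` package, which is a common choice but an assumption here, not a confirmed dependency of this project.

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # reads key=value pairs from .env into the environment

wolfram_key = os.getenv("WOLFRAM_API_KEY")
gemini_key = os.getenv("GeminiAPI")  # may be None if you skip the client
if not wolfram_key:
    raise RuntimeError("WOLFRAM_API_KEY is not set; see the example .env")
```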
### Install Requirements

```bash
pip install -r requirements.txt
```
## Configuration

To use with the VSCode MCP Server:

- Create a configuration file at `.vscode/mcp.json` in your project root.
- Use the example provided in `configs/vscode_mcp.json` as a template.
- For more details, refer to the VSCode MCP Server Guide.
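As a rough sketch of what that file can contain (mirroring the Claude Desktop entry below; the exact schema may differ between VSCode versions, so treat `configs/vscode_mcp.json` and the guide as authoritative):

```json
{
  "servers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": ["/path/to/src/core/server.py"]
    }
  }
}
```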
To use with Claude Desktop:

```json
{
  "mcpServers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": ["/path/to/src/core/server.py"]
    }
  }
}
```
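Note: replace the `args` path with the absolute path to `src/core/server.py` in your local clone, and make sure the `python3` on your PATH has the project's requirements installed.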
## Client Usage Example
This project includes an LLM client that communicates with the MCP server.
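To illustrate what such a client does under the hood, here is a minimal sketch using the official `mcp` Python SDK over stdio. The tool name `query_wolfram` is a hypothetical placeholder; this repository's client code is the reference implementation.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the MCP server as a subprocess and talk to it over stdio.
server = StdioServerParameters(
    command="python3",
    args=["/path/to/src/core/server.py"],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])  # discover available tools
            result = await session.call_tool(
                "query_wolfram",  # hypothetical tool name
                arguments={"query": "population of France"},
            )
            print(result.content)

asyncio.run(main())
```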
### Run with Gradio UI

- Required: `GeminiAPI`
- Provides a local web interface to interact with Google AI and Wolfram Alpha.
- To launch the web UI from the command line:

```bash
python main.py --ui
```
#### Docker

To build and run the UI client inside a Docker container:

```bash
docker build -t wolframalphaui -f .devops/ui.Dockerfile .
docker run wolframalphaui
```
#### UI

- Intuitive interface built with Gradio to interact with both Google AI (Gemini) and the Wolfram Alpha MCP server.
- Allows users to switch between Wolfram Alpha, Google AI (Gemini), and query history.
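The wiring is conceptually similar to the following minimal sketch (illustrative only, not the project's actual UI code; `answer` is a hypothetical handler that would route the message to Gemini and/or the MCP server):

```python
import gradio as gr

def answer(message: str, history: list) -> str:
    # Hypothetical handler: the real client forwards the message to
    # Gemini and/or the Wolfram Alpha MCP server and returns the reply.
    return f"(answer for: {message})"

# gr.ChatInterface wraps a chat handler in a ready-made web UI.
gr.ChatInterface(answer, title="Wolfram|Alpha MCP Client").launch()
```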
### Run as CLI Tool

- Required: `GeminiAPI`
- To run the client directly from the command line:

```bash
python main.py
```
#### Docker

To build and run the CLI client inside a Docker container:

```bash
docker build -t wolframalpha -f .devops/llm.Dockerfile .
docker run -it wolframalpha
```