Graph-of-Thought-MCP
The ASR Graph of Thoughts (GoT) Model Context Protocol (MCP) server enhances AI reasoning capabilities through graph-based reasoning. It is designed for advanced integrations and suits applications such as the Claude desktop app and API-based AI models.
ASR Graph of Thoughts (GoT) Model Context Protocol (MCP) Server
The Advanced Scientific Research (ASR) Graph of Thoughts (GoT) MCP server is an implementation of the Model Context Protocol (MCP) that enables sophisticated reasoning workflows using graph-based representations.
Project Overview
This project implements a Model Context Protocol (MCP) server architecture that leverages a Graph of Thoughts approach to enhance AI reasoning capabilities. It can be connected to AI models and applications such as the Claude desktop app, or used through API-based integrations.
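As a rough illustration of what a graph-based reasoning representation can look like, the Python sketch below models individual thoughts as nodes and their dependencies as directed edges. The names used here (ThoughtNode, ThoughtGraph, add_thought, connect) are hypothetical and are not taken from this project's codebase; this is a minimal sketch of the idea, not the server's actual data model.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# NOTE: illustrative only -- these names are hypothetical and do not
# correspond to classes in this repository.

@dataclass
class ThoughtNode:
    node_id: str
    content: str          # the intermediate reasoning step
    score: float = 0.0    # optional quality estimate for this thought

@dataclass
class ThoughtGraph:
    nodes: Dict[str, ThoughtNode] = field(default_factory=dict)
    edges: List[Tuple[str, str, str]] = field(default_factory=list)  # (src, dst, relation)

    def add_thought(self, node_id: str, content: str, score: float = 0.0) -> ThoughtNode:
        node = ThoughtNode(node_id, content, score)
        self.nodes[node_id] = node
        return node

    def connect(self, src: str, dst: str, relation: str = "supports") -> None:
        # Record a directed edge so later thoughts can build on earlier ones.
        self.edges.append((src, dst, relation))

    def successors(self, node_id: str) -> List[ThoughtNode]:
        return [self.nodes[dst] for src, dst, _ in self.edges if src == node_id]


if __name__ == "__main__":
    graph = ThoughtGraph()
    graph.add_thought("q", "What limits the throughput of the service?")
    graph.add_thought("h1", "Database connections are the bottleneck.", score=0.7)
    graph.add_thought("h2", "Serialization overhead dominates.", score=0.4)
    graph.connect("q", "h1", relation="hypothesis")
    graph.connect("q", "h2", relation="hypothesis")
    print([n.content for n in graph.successors("q")])
```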
Running the Project with Docker
This project provides a multi-container Docker setup for both the Python backend (FastAPI) and the static JavaScript client. The setup uses Docker Compose for orchestration.
Exposed Ports
- Backend (python-app): Host 8082 → Container 8082 (FastAPI server)
- Client (js-client): Host 80 → Container 80 (nginx static server)
Build and Run Instructions
- Build and start all services: run docker compose up --build.
- Access the services (a quick connectivity check is sketched below):
  - Backend API: http://localhost:8082
  - Static Client: http://localhost/
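Once the containers are up, you can check that the backend port answers from Python. The snippet below is a minimal sketch that assumes the FastAPI app responds on the root path; the actual routes are not listed in this README, so adjust the URL to a documented endpoint.

```python
import urllib.error
import urllib.request

# Minimal connectivity check against the backend container.
# Assumption: the FastAPI app answers on the root path; the real
# routes may differ, so point this at a documented endpoint.
BACKEND_URL = "http://localhost:8082/"

try:
    with urllib.request.urlopen(BACKEND_URL, timeout=5) as response:
        print(f"Backend reachable, HTTP status {response.status}")
except urllib.error.HTTPError as exc:
    # An HTTP error status still means the server answered.
    print(f"Backend reachable, HTTP status {exc.code}")
except urllib.error.URLError as exc:
    print(f"Backend not reachable yet: {exc.reason}")
```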
Integration with AI Models
This MCP server can be integrated with the Claude desktop application, with AI models through API-based integrations, and with other MCP-compatible clients.
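As a sketch of what an API-based integration could look like, the example below posts a reasoning query to the server over HTTP and prints the response. The endpoint path (/reason), the request field (query), and the response shape are assumptions made for illustration and are not taken from this server's actual API.

```python
import json
import urllib.request

# Hypothetical API-based integration: send a reasoning task to the
# server and inspect what it returns. The endpoint name, request body,
# and response shape are assumptions, not this project's documented API.
SERVER_URL = "http://localhost:8082/reason"  # assumed endpoint

payload = json.dumps(
    {"query": "Compare two caching strategies for the API layer."}
).encode("utf-8")

request = urllib.request.Request(
    SERVER_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request, timeout=30) as response:
    result = json.loads(response.read().decode("utf-8"))

# A graph-of-thoughts result would typically contain nodes and edges;
# print whatever the server returned.
print(json.dumps(result, indent=2))
```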