trino_mcp

Trino MCP Server

A Model Context Protocol (MCP) server for Trino, giving AI models structured access to Trino's distributed SQL query engine. It supports Docker and standalone deployment options, with STDIO as the reliable transport.
Features
- Exposes Trino resources through the MCP protocol
- Enables AI tools to query and analyze data in Trino
- Offers STDIO and SSE transports (STDIO is the reliable option)
- Runs as a Docker container or as a standalone Python API server
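MCP's STDIO transport exchanges JSON-RPC 2.0 messages over standard input/output. A minimal sketch of building a `tools/call` request is below; the tool name `execute_query` and its argument shape are assumptions for illustration, not this server's documented interface.

```python
import json


def make_mcp_request(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 message of the kind MCP's STDIO transport carries."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


# Hypothetical tool name; list the server's tools to find the real one.
msg = make_mcp_request(1, "execute_query", {"sql": "SELECT 1"})
print(msg)
```

In practice a client writes such messages to the server process's stdin and reads responses from its stdout, one JSON object per line.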
Quick Start
Start the Trino MCP server with Docker or as a standalone Python API server. Once running, it lets LLMs execute SQL queries through simple command-line tools and REST endpoints.
LLM Integration
LLMs can query Trino instances directly through the command-line interface or the REST API, including complex SQL for analyzing and manipulating data.
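A sketch of constructing a query request for the REST API with only the standard library. The host, port, route (`/api/query`), and payload shape are assumptions; consult the running server for the actual endpoint.

```python
import json
import urllib.request

# Assumed endpoint and payload shape for illustration only.
payload = {"query": "SELECT * FROM tpch.tiny.nation LIMIT 5"}
req = urllib.request.Request(
    "http://localhost:8000/api/query",   # assumed host/port/path
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Sending it would be: urllib.request.urlopen(req) against a live server.
print(req.get_method(), req.full_url)
```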
Usage
Initialize the server with Docker; the bundled scripts generate test data, run queries, and validate the API.
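Validating the API amounts to checking that query responses have a consistent shape. A minimal helper in that spirit, assuming a response dict with `columns` and `rows` keys (the real response format may differ):

```python
def validate_query_response(resp: dict) -> list[str]:
    """Return a list of problems found in a query response dict.

    The columns/rows shape checked here is an assumed format.
    """
    problems = []
    if "columns" not in resp:
        problems.append("missing 'columns'")
    if "rows" not in resp:
        problems.append("missing 'rows'")
    elif "columns" in resp:
        width = len(resp["columns"])
        for i, row in enumerate(resp["rows"]):
            if len(row) != width:
                problems.append(f"row {i} has {len(row)} values, expected {width}")
    return problems


sample = {"columns": ["nationkey", "name"], "rows": [[0, "ALGERIA"], [1]]}
print(validate_query_response(sample))  # → ["row 1 has 1 values, expected 2"]
```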
Known Issues and Fixes
- Fixed: API initialization issues in the Docker container
- Known issue: the SSE transport can crash the server (use STDIO instead)
- Planned: improvements to catalog handling
Project Structure
The repository is organized into directories for source code, examples, scripts, tools, and tests, with top-level entry points for running and testing the server and its APIs.
Future Work
Planned improvements include integration with newer MCP versions, support for additional connectors, advanced features, authentication, and performance enhancements.