# Turbular
Turbular is an open-source Model Context Protocol (MCP) server that enables seamless database connectivity for Language Models (LLMs). It provides a unified API interface to interact with various database types, making it perfect for AI applications that need to work with multiple data sources.
## Features
- **Multi-Database Support**: Connect to various database types through a single API
- **Schema Normalization**: Automatically normalize database schemas into naming conventions that LLMs handle well
- **Secure Connections**: Support for SSL and various authentication methods
- **High Performance**: Optimizes your LLM-generated queries
- **Query Transformation**: Let the LLM generate queries against the normalized schema and transform them back into their original, unnormalized form
- **Docker Support**: Easy deployment with Docker and Docker Compose
- **Easy to Extend**: New database providers can be added by extending the existing connector base class
## Supported Databases
| Database Type | Status |
| --- | --- |
| PostgreSQL | ✅ |
| MySQL | ✅ |
| SQLite | ✅ |
| BigQuery | ✅ |
| Oracle | ✅ |
| MS SQL | ✅ |
| Redshift | ✅ |
## Quick Start

### Using Docker (Recommended)
1. Clone the repository:

   ```bash
   git clone https://github.com/raeudigerRaeffi/turbular.git
   cd turbular
   ```

2. Start the development environment:

   ```bash
   docker-compose -f docker-compose.dev.yml up --build
   ```

3. Test the connection:

   ```bash
   ./scripts/test_connection.py
   ```
### Manual Installation
1. Install Python 3.11 or higher

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the server:

   ```bash
   uvicorn app.main:app --reload
   ```
## API Reference

### Database Operations

#### Get Database Schema

```
POST /get_schema
```
Retrieve the schema of a connected database for your LLM agent.
Parameters:

- `db_info`: Database connection arguments
- `return_normalize_schema` (optional): Return the schema in an LLM-friendly format
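A minimal sketch of calling this endpoint from Python, assuming the server runs on `localhost:8000` and accepts the parameters above as a JSON body (the exact request schema is not verified here; check the Swagger UI at `/docs`):

```python
import requests

# Hypothetical payload shape based on the parameter list above; verify against /docs.
payload = {
    "db_info": {
        "database_type": "PostgreSQL",
        "username": "user",
        "password": "password",
        "host": "localhost",
        "port": 5432,
        "database_name": "mydb",
        "ssl": False,
    },
    "return_normalize_schema": True,
}

response = requests.post("http://localhost:8000/get_schema", json=payload)
response.raise_for_status()
print(response.json())  # schema description returned for the LLM agent
```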
#### Execute Query

```
POST /execute_query
```

Optimizes the query and then executes it on the connected database.
Parameters:

- `db_info`: Database connection arguments
- `query`: SQL query string
- `normalized_query`: Boolean indicating whether the query is written against the normalized schema
- `max_rows`: Maximum number of rows to return
- `autocommit`: Boolean for autocommit mode
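As a rough sketch (same assumptions as above: server on `localhost:8000`, parameters sent as a JSON body; confirm field names against the interactive docs):

```python
import requests

# Hypothetical payload; the real request model may nest or name fields differently.
payload = {
    "db_info": {
        "database_type": "PostgreSQL",
        "username": "user",
        "password": "password",
        "host": "localhost",
        "port": 5432,
        "database_name": "mydb",
        "ssl": False,
    },
    "query": "SELECT id, name FROM users LIMIT 10",
    "normalized_query": True,   # query was written against the normalized schema
    "max_rows": 100,
    "autocommit": False,
}

response = requests.post("http://localhost:8000/execute_query", json=payload)
print(response.json())  # query results, truncated to max_rows
```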
### File Management

#### Upload BigQuery Key

```
POST /upload-bigquery-key
```
Upload a BigQuery service account key file.
Parameters:

- `project_id`: BigQuery project ID
- `key_file`: JSON key file
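A hedged example of uploading the key as multipart form data; whether `project_id` travels as a form field or a query parameter is an assumption, so check `/docs`:

```python
import requests

# Sketch only: assumes a multipart upload with project_id as a form field.
with open("/path/to/credentials.json", "rb") as key:
    response = requests.post(
        "http://localhost:8000/upload-bigquery-key",
        data={"project_id": "my-project"},
        files={"key_file": ("credentials.json", key, "application/json")},
    )
print(response.status_code, response.text)
```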
#### Upload SQLite Database

```
POST /upload-sqlite-file
```
Upload a SQLite database file.
Parameters:

- `database_name`: Name to identify the database
- `db_file`: SQLite database file (`.db` or `.sqlite`)
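An analogous sketch for uploading a SQLite file (same caveat about the exact form-field names):

```python
import requests

# Sketch only: assumes a multipart upload with database_name as a form field.
with open("my_database.db", "rb") as db:
    response = requests.post(
        "http://localhost:8000/upload-sqlite-file",
        data={"database_name": "my_database"},
        files={"db_file": ("my_database.db", db, "application/octet-stream")},
    )
print(response.status_code, response.text)
```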
### Utility Endpoints

#### Health Check

```
GET /health
```

Verify that the API is running.
#### List Supported Databases

```
GET /supported-databases
```
Get a list of all supported database types.
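Both utility endpoints are plain GETs; a quick check from Python might look like this (assuming the default `localhost:8000` address and JSON responses):

```python
import requests

BASE_URL = "http://localhost:8000"  # adjust if the server runs elsewhere

print(requests.get(f"{BASE_URL}/health").json())               # e.g. a status payload
print(requests.get(f"{BASE_URL}/supported-databases").json())  # list of database types
```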
## Development Setup
1. Fork and clone the repository

2. Create a development environment:

   ```bash
   docker-compose -f docker-compose.dev.yml up --build
   ```

3. The development server includes:
   - FastAPI server with hot reload
   - PostgreSQL test database
   - Pre-configured test data

4. Access the API documentation:
   - Swagger UI: http://localhost:8000/docs
   - ReDoc: http://localhost:8000/redoc
## Contributing
We welcome contributions! Here's how you can help:
- Check out our contribution guidelines
- Look for open issues
- Submit pull requests with improvements
- Help with documentation
- Share your feedback
### Development Guidelines
- Follow PEP 8 style guide
- Write tests for new features
- Update documentation as needed
- Use meaningful commit messages
### Roadmap
- Add more testing, formatting, and commit hooks
- Add SSH support for database connections
- Add APIs as data sources using Steampipe
- Enable local schema saving for databases to which the server has already connected
- Add more data sources (Snowflake, MongoDB, Excel, etc.)
- Add authentication protection to routes
## Testing
Run the test suite:

```bash
pytest
```

For development tests against the included PostgreSQL test database:

```bash
./scripts/test_connection.py
```
## Documentation

### Connection Examples
#### PostgreSQL

```python
connection_info = {
    "database_type": "PostgreSQL",
    "username": "user",
    "password": "password",
    "host": "localhost",
    "port": 5432,
    "database_name": "mydb",
    "ssl": False
}
```
#### BigQuery

```python
connection_info = {
    "database_type": "BigQuery",
    "path_cred": "/path/to/credentials.json",
    "project_id": "my-project",
    "dataset_id": "my_dataset"
}
```
#### SQLite

```python
connection_info = {
    "type": "SQLite",
    "database_name": "my_database"
}
```
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- FastAPI for the amazing framework
- SQLAlchemy for database support
- @henryclickclack (Henry Albert Jupiter Hommel) as co-developer
- All our contributors and users
## Support
- Create an issue
- Email: