MCP2Lambda
MCP2Lambda is an MCP server that lets AI models call AWS Lambda functions as tools, giving them access to Lambda's compute and data-access capabilities. It supports several configuration modes and strategies for exposing Lambda functions, providing a flexible way to integrate AI functionality with AWS services.
Overview
MCP2Lambda allows Large Language Models (LLMs) to use AWS Lambda functions as tools, without requiring changes to the functions' code. It acts as a bridge that lets generative AI models access private resources, execute custom code, and perform real-time data processing through AWS Lambda. Key features include autodiscovery of Lambda functions and invocation with the required parameters; functions can be exposed using either the Pre-Discovery or the Generic Mode strategy. It requires Python 3.12+, an AWS account with configured credentials, and an MCP-compatible client.
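The sketch below illustrates, under stated assumptions, the underlying AWS calls this pattern relies on: listing Lambda functions (discovery) and invoking one by name with JSON parameters. It uses only standard boto3 APIs; the "mcp2lambda-" name prefix is a placeholder assumption, not the project's actual convention.

```python
import json
import boto3

# Sketch of the AWS calls behind tool discovery and invocation.
# The "mcp2lambda-" prefix below is an illustrative assumption.
lambda_client = boto3.client("lambda")

def list_candidate_functions(prefix: str = "mcp2lambda-") -> list[str]:
    """Return names of Lambda functions that could be exposed as tools."""
    names = []
    paginator = lambda_client.get_paginator("list_functions")
    for page in paginator.paginate():
        for fn in page["Functions"]:
            if fn["FunctionName"].startswith(prefix):
                names.append(fn["FunctionName"])
    return names

def invoke_as_tool(function_name: str, arguments: dict) -> dict:
    """Invoke a Lambda function synchronously and return its JSON result."""
    response = lambda_client.invoke(
        FunctionName=function_name,
        InvocationType="RequestResponse",
        Payload=json.dumps(arguments),
    )
    return json.loads(response["Payload"].read())

if __name__ == "__main__":
    print(list_candidate_functions())
```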
Features:
- Access to real-time and private data.
- Execution of custom code in a sandbox environment.
- Interaction with external services.
Prerequisites:
- Python 3.12 or higher
- AWS account with configured credentials
Installation:
- Install via Smithery or manually by cloning the repository and configuring AWS credentials.
- Deploy sample Lambda functions using AWS SAM CLI.
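Before deploying the sample functions, it can help to confirm that the configured AWS credentials actually resolve. This quick check is not part of MCP2Lambda itself, just a hedged boto3 snippet for verification.

```python
import boto3

# Sanity check (not part of the project): confirm which AWS identity
# and default region the locally configured credentials resolve to.
sts = boto3.client("sts")
identity = sts.get_caller_identity()
print(f"Deploying as account {identity['Account']} ({identity['Arn']})")
print(f"Default region: {boto3.session.Session().region_name}")
```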
Sample Lambda Functions
- CustomerIdFromEmail: Retrieves a customer ID from an email address.
- CustomerInfoFromId: Retrieves detailed customer information using a customer ID.
- RunPythonCode: Executes Python code in a Lambda sandbox.
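The actual sample functions live in the repository; the sketch below only illustrates the general shape a handler such as CustomerIdFromEmail might take. The in-memory lookup table is a placeholder assumption standing in for a real data source.

```python
# Illustrative handler sketch; the in-memory table is a stand-in
# for whatever data source the real sample function queries.
FAKE_CUSTOMERS = {
    "jane@example.com": "CUST-001",
    "john@example.com": "CUST-002",
}

def lambda_handler(event, context):
    """Return the customer ID for the email address passed by the model."""
    email = event.get("email", "")
    customer_id = FAKE_CUSTOMERS.get(email)
    if customer_id is None:
        return {"status": "not_found", "email": email}
    return {"status": "ok", "customerId": customer_id}
```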
Using with Amazon Bedrock
MCP2Lambda can interact with Amazon Bedrock's Converse API and supported models such as Claude. This requires access to Amazon Bedrock, the appropriate configuration, and prerequisites such as Boto3.
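As a rough sketch of that flow, the snippet below calls the Bedrock Converse API with a tool definition mirroring one of the sample functions. The model ID and tool schema are assumptions for illustration; the returned toolUse input would then be forwarded to the matching Lambda function.

```python
import boto3

# Hedged sketch: ask a Bedrock model a question while advertising a
# Lambda-backed tool. Model ID and schema are illustrative assumptions.
bedrock = boto3.client("bedrock-runtime")

tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "CustomerIdFromEmail",
                "description": "Look up a customer ID from an email address.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {"email": {"type": "string"}},
                        "required": ["email"],
                    }
                },
            }
        }
    ]
}

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[
        {"role": "user",
         "content": [{"text": "What is the customer ID for jane@example.com?"}]}
    ],
    toolConfig=tool_config,
)

# If the model chooses to use the tool, its input can be passed on to Lambda.
for block in response["output"]["message"]["content"]:
    if "toolUse" in block:
        print(block["toolUse"]["name"], block["toolUse"]["input"])
```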
Starting the MCP Server
Start the server locally by running `uv run main.py` in the `mcp2lambda` directory.
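Once the server is running, an MCP-compatible client can connect to it over stdio. The sketch below uses the MCP Python SDK; the available tool names depend on the Lambda functions discovered in your account, so treat it as an assumption-laden example rather than the project's prescribed client setup.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hedged sketch: connect to the locally started MCP2Lambda server over
# stdio and list the tools it exposes.
server = StdioServerParameters(command="uv", args=["run", "main.py"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

asyncio.run(main())
```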