mcp-mem0
MCP-Mem0 is a template MCP server that gives AI agents long-term memory through Mem0. It provides persistent storage, retrieval, and semantic search of memories, and serves as a basis for building your own MCP servers.
Overview
This project demonstrates how to build an MCP server that enables AI agents to store, retrieve, and search memories using semantic search. It uses Mem0 as the memory layer and serves as a practical template for creating your own MCP servers.
Features
- Store any information in long-term memory with semantic indexing
- Retrieve all stored memories for comprehensive context
- Find relevant memories using semantic search
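The three features above correspond to operations on Mem0's Memory client. The following is a minimal sketch of that mapping, assuming the mem0ai Python package with its default configuration; the template itself wires the LLM provider and Postgres/Supabase vector store through environment variables instead.

```python
from mem0 import Memory

# Default configuration here; the MCP server configures an LLM provider and a
# Postgres/Supabase-backed vector store via environment variables instead.
memory = Memory()

# Store information in long-term memory with semantic indexing
memory.add("The user prefers concise answers.", user_id="agent")

# Retrieve all stored memories for comprehensive context
all_memories = memory.get_all(user_id="agent")

# Find relevant memories using semantic search
hits = memory.search("How does the user like responses phrased?", user_id="agent")
```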
Prerequisites
- Python 3.12+
- Supabase or any PostgreSQL database
- API keys for your chosen LLM provider
- Docker if running the MCP server as a container
Installation
Using uv
- Install uv
- Clone this repository
- Install dependencies
- Create a .env file based on .env.example
- Configure your environment variables in the .env file (see the sketch below)
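An illustrative .env sketch follows; apart from TRANSPORT, which the run instructions below reference, the variable names here are assumptions, so use the names defined in this repository's .env.example:

```env
# Illustrative values only; variable names other than TRANSPORT are assumptions
TRANSPORT=sse
LLM_API_KEY=your-llm-provider-key
DATABASE_URL=postgresql://user:password@host:5432/postgres
```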
Using Docker (Recommended)
- Build the Docker image (see the example below)
- Create a .env file and configure your environment variables
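For example, assuming the Dockerfile sits at the repository root (the mcp/mem0 tag matches the docker run command used below):
docker build -t mcp/mem0 .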
Running the Server
Using uv
- Set TRANSPORT=sse in .env, then run:
uv run src/main.py
Using Docker
- Run:
docker run --env-file .env -p 8050:8050 mcp/mem0
Integration with MCP Clients
- Use an SSE configuration to connect, with url: "http://localhost:8050/sse" (see the example below)
Building Your Own Server
- Add your own tools by creating methods
- Create your own lifespan function (see the sketch at the end of this section)
- Modify the utils.py file
- Add prompts and resources if needed
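As a rough sketch of the lifespan-plus-tool pattern, assuming the server is built on FastMCP from the official MCP Python SDK and the mem0 Memory client (the Mem0Context name and save_memory tool are illustrative, not taken verbatim from this repo):

```python
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager
from dataclasses import dataclass

from mcp.server.fastmcp import Context, FastMCP
from mem0 import Memory


@dataclass
class Mem0Context:
    """Objects shared with every tool call for the lifetime of the server."""
    mem0_client: Memory


@asynccontextmanager
async def mem0_lifespan(server: FastMCP) -> AsyncIterator[Mem0Context]:
    # Build the Mem0 client once at startup and expose it to tools via the context.
    client = Memory()
    try:
        yield Mem0Context(mem0_client=client)
    finally:
        pass  # close database connections or other resources here if needed


mcp = FastMCP("mcp-mem0", lifespan=mem0_lifespan)


@mcp.tool()
async def save_memory(ctx: Context, text: str) -> str:
    """Store a piece of information in long-term memory."""
    client = ctx.request_context.lifespan_context.mem0_client
    client.add(text, user_id="user")
    return f"Saved memory: {text}"
```

Tools registered this way reuse the single Mem0 client created at startup instead of reconnecting on every call.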