memory-mcp-server

The Memory MCP Server provides long-term memory capabilities for Large Language Models (LLMs) using the Model Context Protocol (MCP).

Overview

The Memory MCP Server enhances the context-awareness of Large Language Models (LLMs) by giving them long-term memory. It acts as a bridge that lets LLMs retain and retrieve information across extended interactions, which is essential for building context-aware AI applications. By adhering to the Model Context Protocol (MCP), the server integrates cleanly with a variety of LLM architectures, and its simple HTTP API makes it easy for developers to add memory functionality to their applications. The server is also designed to scale, handling concurrent requests as application demand grows.

Features

  • Long-term Memory: Store and retrieve context for LLMs, enhancing their ability to provide relevant responses.
  • Model Context Protocol: Adheres to MCP standards for seamless integration with various LLM architectures.
  • User-Friendly API: Easy-to-use API for developers to integrate memory functionalities into their applications.
  • Scalability: Designed to handle multiple requests and scale with your needs.

Usage with Different Platforms

Python

```bash
# Clone the repository
git clone https://github.com/Sinhan88/memory-mcp-server.git

# Navigate to the project directory
cd memory-mcp-server

# Install dependencies
pip install -r requirements.txt

# Run the server
python app.py
```

curl

```bash
# Store a memory
curl -X POST http://localhost:5000/store \
  -H "Content-Type: application/json" \
  -d '{"context": "I love programming.", "model_id": "model_1"}'

# Retrieve memories (quote the URL so the shell does not interpret the "?")
curl "http://localhost:5000/retrieve?model_id=model_1"
```