MCP Server with FAISS for RAG
This project provides a proof-of-concept implementation of a Machine Conversation Protocol (MCP) server that enables an AI agent to query a vector database and retrieve relevant documents for Retrieval-Augmented Generation (RAG).
Features
- FastAPI server with MCP endpoints
- FAISS vector database integration
- Document chunking and embedding
- GitHub Move file extraction and processing
- LLM integration for a complete RAG workflow
- Simple client example
- Sample documents available
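The chunking step listed above can be sketched as a simple sliding window with overlap, so that sentences cut at a chunk boundary still appear whole in the neighboring chunk. The 200-character size and 50-character overlap below are illustrative defaults, not the project's actual settings:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks that overlap by `overlap` characters."""
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_size, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        # Step back by `overlap` so context spanning a boundary is preserved.
        start = end - overlap
    return chunks
```

Each chunk would then be embedded and added to the FAISS index; overlapping chunks trade a little index size for better recall on boundary-spanning queries.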
Installation
Using pipx (Recommended)
- Install pipx using Homebrew on macOS or pip on Windows.
- Install the MCP Server package directly from the project directory using pipx.
Manual Installation
- Clone the repository and install dependencies.
Usage with pipx
- Commands are available for downloading and indexing Move files from GitHub, querying the vector database, and running RAG with LLM integration.
Running the Server
- Start the server with the mcp-server command, or customize settings such as the host and port.
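At its core, the query endpoint the server exposes performs a top-k similarity search over the stored vectors. Below is a framework-free sketch of that search; FAISS and FastAPI are omitted, and the cosine scoring plus the `(doc_id, vector)` index layout are illustrative assumptions rather than the project's actual internals:

```python
import math


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def search(index: list[tuple[str, list[float]]], query_vec: list[float], k: int = 3) -> list[str]:
    """Return the ids of the k documents whose vectors best match the query."""
    ranked = sorted(index, key=lambda item: cosine(item[1], query_vec), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]
```

FAISS replaces this exhaustive scan with optimized (and optionally approximate) index structures, but the contract the endpoint fulfills is the same: query vector in, ranked document ids out.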
Complete RAG Pipeline
- Search Query: The user submits a question.
- Retrieval: The system searches the vector database for relevant documents.
- Context Formation: Retrieved documents are formatted into a prompt.
- LLM Generation: The prompt is sent to an LLM with the retrieved context.
- Enhanced Response: The LLM provides an answer based on the retrieved information.
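The five steps above can be sketched end to end as follows. Here `retrieve` and `llm` are stand-in callables, not the project's actual interfaces, and the prompt template is an illustrative example:

```python
from typing import Callable


def format_context(question: str, docs: list[str]) -> str:
    """Context Formation: fold retrieved documents into a single prompt."""
    context = "\n\n".join(f"[{i + 1}] {doc}" for i, doc in enumerate(docs))
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )


def rag_answer(
    question: str,
    retrieve: Callable[[str], list[str]],  # Retrieval: vector-database search
    llm: Callable[[str], str],             # LLM Generation: model call
) -> str:
    docs = retrieve(question)              # search the vector database
    prompt = format_context(question, docs)
    return llm(prompt)                     # Enhanced Response
```

Swapping in the real pieces means pointing `retrieve` at the FAISS index and `llm` at whichever model API the deployment uses; the pipeline shape stays the same.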
Extending the Project
- Suggestions for extending the project include adding authentication, richer document processing, support for more document types, monitoring and logging, and more advanced Move language parsing.
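As a starting point for more advanced Move parsing, module headers can be pulled out of source files with a regular expression. This is a simplistic sketch, not a real Move parser; the pattern assumes the common `module <address>::<name> { ... }` header shape and will miss anything more exotic:

```python
import re

# Matches `module 0x1::coin` or `module std::vector` style headers.
# A deliberate simplification: comments, attributes, and unusual
# formatting are not handled.
MODULE_RE = re.compile(r"\bmodule\s+(\w+::[A-Za-z_]\w*)")


def extract_module_names(source: str) -> list[str]:
    """Return every `address::name` module header found in Move source text."""
    return MODULE_RE.findall(source)
```

A production version would likely build on a proper Move grammar so that structs, functions, and doc comments can be indexed as separate chunks.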