mcp-vertexai-search
This project is an MCP server that provides document search over Vertex AI, using Gemini with grounding for improved results. It supports Docker, is configured through a YAML file, and offers both SSE and stdio server transports.
MCP Server for Vertex AI Search
This project is an MCP server designed to search documents using Vertex AI. It leverages Gemini with Vertex AI grounding to enhance search results with your private data stored in Vertex AI Datastore. The server can integrate with one or more Vertex AI data stores. You can run it with Docker, or install the Python package from the repository; in either case it requires a configuration file derived from the provided template.
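If you go the Docker route, running the server typically comes down to building the image and mounting your config file into the container. The image name, config filename, and mount path below are assumptions for illustration, not values published by this project:

```sh
# Assumes a Dockerfile at the repository root; image name and paths are placeholders.
docker build -t mcp-vertexai-search .
docker run --rm \
  -v "$(pwd)/config.yml:/app/config.yml" \
  mcp-vertexai-search
```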
Architecture
- Uses Gemini with Vertex AI grounding for document search.
- Integrates with one or more Vertex AI data stores.
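The grounding technique itself comes from the Vertex AI SDK rather than anything project-specific. A minimal sketch, assuming placeholder project, location, data store, and model names (this is not the project's own code):

```python
# Minimal sketch of Gemini grounded on a Vertex AI Search data store.
# The project ID, location, data store ID, and model name are placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

vertexai.init(project="your-project-id", location="us-central1")

# Point the grounding tool at a Vertex AI Search data store.
datastore = (
    "projects/your-project-id/locations/global/"
    "collections/default_collection/dataStores/your-datastore-id"
)
tool = Tool.from_retrieval(
    grounding.Retrieval(grounding.VertexAISearch(datastore=datastore))
)

# Ask Gemini a question; answers are grounded in the data store's documents.
model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "What does our internal handbook say about expense reports?",
    tools=[tool],
)
print(response.text)
```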
How to Use
- Clone the repository and create a virtual environment.
- Install the Python package and configure using a YAML file.
- Supports SSE and stdio server transports.
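A sketch of those steps, assuming uv for environment management; the repository URL and config template name are placeholders, and the exact run command and transport flags come from the project's own documentation:

```sh
# Placeholders throughout; substitute the real repository URL and template name.
git clone <repository-url> mcp-vertexai-search
cd mcp-vertexai-search

# Create a virtual environment and install the package with uv.
uv venv
source .venv/bin/activate
uv pip install -e .

# Copy the provided config template and fill in your Vertex AI settings,
# then launch the server with your chosen transport (SSE or stdio)
# following the project's instructions.
cp <config-template>.yml config.yml
```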
Development
- Prerequisites: uv and a Vertex AI data store.
- Set up local development with virtual environments.
Config File
- The YAML config specifies server settings and model details.
- Supports multiple data stores, each with its own settings.
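For illustration only, a config of roughly this shape could hold the server, model, and per-data-store settings; every key and value below is a placeholder, and the authoritative key names come from the template shipped with the project:

```yaml
# Hypothetical layout; the real keys are defined by the project's config template.
server:
  name: mcp-vertexai-search

model:
  project_id: your-project-id
  location: us-central1
  model_name: gemini-1.5-flash

data_stores:
  - project_id: your-project-id
    location: global
    datastore_id: your-datastore-id
  - project_id: your-project-id
    location: global
    datastore_id: another-datastore-id
```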