mcp-vertexai-search

This project is an MCP server that provides document search over Vertex AI data stores, using Gemini with grounding to improve the quality of search results. It runs natively or in Docker, is configured through a YAML file, and supports both SSE and stdio transports.

What is the purpose of grounding in Vertex AI?

Grounding in Vertex AI improves the quality of search results by anchoring responses in your private data, ensuring more relevant and accurate results.
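For background, the snippet below is a minimal sketch of what grounding a Gemini response in a Vertex AI Search data store looks like with the Vertex AI Python SDK. The project ID, location, data store path, and model name are placeholders, and this is illustrative only, not this MCP server's own implementation.

```python
# Minimal sketch: ground a Gemini answer in a Vertex AI Search data store.
# Project, location, data store path, and model name are placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

vertexai.init(project="your-project-id", location="us-central1")

# Point the retrieval tool at a Vertex AI Search data store.
search_tool = Tool.from_retrieval(
    grounding.Retrieval(
        grounding.VertexAISearch(
            datastore="projects/your-project-id/locations/global/"
                      "collections/default_collection/dataStores/your-datastore-id"
        )
    )
)

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "Summarize our refund policy.",
    tools=[search_tool],
)
print(response.text)
```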

Can the MCP server integrate with multiple data stores?

Yes, the MCP server can integrate with multiple Vertex AI data stores, allowing for a comprehensive search across various datasets.
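As an illustration only, a multi-data-store setup in the YAML configuration might look like the sketch below. The key names here are assumptions, not the project's actual schema; consult config.yml.template for the real structure.

```yaml
# Hypothetical sketch -- key names are assumptions; see config.yml.template
# for the actual schema.
model:
  model_name: "gemini-1.5-flash"
  project_id: "your-project-id"
  location: "us-central1"
data_stores:
  - project_id: "your-project-id"
    location: "global"
    datastore_id: "product-docs-datastore"
  - project_id: "your-project-id"
    location: "global"
    datastore_id: "support-tickets-datastore"
```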

What transport methods does the MCP server support?

The MCP server supports both SSE (Server-Sent Events) and stdio (standard input/output) transports; the transport is selected with the --transport flag.
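For example, selecting the transport at startup might look like the commands below. The --transport flag is the one described above, while the serve subcommand and --config flag are assumptions for illustration.

```shell
# Run over stdio (typical for local MCP clients); the "serve" subcommand
# and --config flag are assumed here.
mcp-vertexai-search serve --config config.yml --transport stdio

# Run as an SSE server instead.
mcp-vertexai-search serve --config config.yml --transport sse
```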

Is there a configuration template available?

Yes, a configuration template, config.yml.template, is provided; copy it and modify the copy to suit your environment.
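A typical workflow is to copy the template and then edit the local copy, for example:

```shell
# Create a local config from the provided template, then edit it for your project.
cp config.yml.template config.yml
```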

How can I test the Vertex AI Search without running the MCP server?

You can test the Vertex AI Search using the mcp-vertexai-search search command with the appropriate configuration and query parameters.
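A hedged example of such a test invocation is shown below; the --config and --query flag names are assumptions based on the description above, not confirmed CLI options.

```shell
# Query the configured Vertex AI Search data stores directly, without starting the server.
# The --config and --query flag names are assumptions.
mcp-vertexai-search search --config config.yml --query "How do I reset my password?"
```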