by ubie-oss
mcp-vertexai-search is a Model Context Protocol (MCP) server that enables document search over your private data using Google Cloud's Vertex AI Search. It uses Gemini models with Vertex AI grounding, anchoring responses in documents stored in Vertex AI Datastore so that search results are accurate and relevant to your specific datasets.
There are two primary ways to use this MCP server: run it from source, or install the Python package directly from GitHub. To run from source:
# Clone the repository
git clone git@github.com:ubie-oss/mcp-vertexai-search.git
# Create a virtual environment
uv venv
# Install the dependencies
uv sync --all-extras
# Check the command
uv run mcp-vertexai-search
While not yet on PyPI, you can also install the package directly from the repository. A config.yml file (derived from config.yml.template) is required to run the server.
# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git
# Check the command
mcp-vertexai-search --help
To run the MCP server, use the following command, specifying the configuration file and transport method (stdio or sse):
uv run mcp-vertexai-search serve \
--config config.yml \
--transport <stdio|sse>
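Once the server runs over stdio, an MCP client can spawn it as a subprocess. As one hedged example, the Claude Code CLI registers MCP servers with its claude mcp add command; the server name below is arbitrary, and the command must be run from a directory containing config.yml:
# Register the stdio server with the Claude Code CLI
# (everything after -- is the command the client will spawn)
claude mcp add vertexai-search -- uv run mcp-vertexai-search serve --config config.yml --transport stdio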
You can also test the Vertex AI Search directly without the MCP server:
uv run mcp-vertexai-search search \
--config config.yml \
--query <your-query>
The server is configured through a YAML file (config.yml) for server settings, model parameters, and data store details.

Q: What is Vertex AI Grounding?
A: Vertex AI Grounding is a feature that enhances the quality of search results by using your private data stored in Vertex AI Datastore to ground the responses generated by a Gemini model. This ensures that the search results are more relevant and accurate to your specific context.
Q: How do I configure the MCP server?
A: The MCP server is configured using a YAML file, typically named config.yml. A template, config.yml.template, is provided, which outlines parameters for the server name, Vertex AI model details (name, project ID, location, service account), and a list of Vertex AI data stores (project ID, location, datastore ID, tool name, description).
Q: Can I use multiple Vertex AI data stores?
A: Yes, the solution is designed to integrate with one or multiple Vertex AI data stores, allowing you to search across various datasets.
Q: Is the Python package available on PyPI?
A: Currently, the package is not published to PyPI. You can install it directly from the GitHub repository.
This is an MCP server to search documents using Vertex AI.
This solution uses Gemini with Vertex AI grounding to search documents using your private data. Grounding improves the quality of search results by anchoring Gemini's responses in your data stored in Vertex AI Datastore, and one or multiple Vertex AI data stores can be integrated with the MCP server. For more details on grounding, refer to the Vertex AI Grounding Documentation.
There are two ways to use this MCP server. If you want to run this on Docker, the first approach is a good fit, as a Dockerfile is provided in the project.
# Clone the repository
git clone git@github.com:ubie-oss/mcp-vertexai-search.git
# Create a virtual environment
uv venv
# Install the dependencies
uv sync --all-extras
# Check the command
uv run mcp-vertexai-search
The package isn't published to PyPI yet, but we can install it from the repository. A config file derived from config.yml.template is required to run the MCP server, because the Python package doesn't include the config template. Please refer to Appendix A: Config file for the details of the config file.
# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git
# Check the command
mcp-vertexai-search --help
For development, set up a local environment as follows:
# Optional: Install uv
python -m pip install -r requirements.setup.txt
# Create a virtual environment
uv venv
# Install the dependencies
uv sync --all-extras
The server supports two transports: SSE (Server-Sent Events) and stdio (standard input/output). We can select the transport with the --transport flag.
We can configure the MCP server with a YAML file. config.yml.template is a template for the config file. Please modify the config file to fit your needs.
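Before starting the server, make sure the environment can authenticate to Google Cloud, since requests go through Vertex AI. The project's exact auth flow isn't documented here, but Google Cloud client libraries typically pick up Application Default Credentials, so a setup like the following usually suffices:
# Create Application Default Credentials for Vertex AI calls
gcloud auth application-default login
# Optional: set the default project
gcloud config set project your-gcp-project
Then start the server: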
uv run mcp-vertexai-search serve \
--config config.yml \
--transport <stdio|sse>
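With --transport sse the server listens over HTTP. As a quick smoke test, you can probe the event stream with curl; the port and the /sse path below are assumptions based on common MCP SSE defaults, so check the server's startup log for the actual address:
# Probe the SSE endpoint (port and path are assumptions; verify in the startup log)
curl -N http://localhost:8000/sse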
We can test the Vertex AI Search by using the mcp-vertexai-search search command without the MCP server.
uv run mcp-vertexai-search search \
--config config.yml \
--query <your-query>
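For example, with a hypothetical multi-word query, quoted so the shell passes it through as a single argument:
uv run mcp-vertexai-search search \
  --config config.yml \
  --query "What is our incident response process?"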
Appendix A: Config file

config.yml.template is a template for the config file.

- server
  - server.name: The name of the MCP server
- model
  - model.model_name: The name of the Vertex AI model
  - model.project_id: The project ID of the Vertex AI model
  - model.location: The location of the model (e.g. us-central1)
  - model.impersonate_service_account: The service account to impersonate
  - model.generate_content_config: The configuration for the generate content API
- data_stores: The list of Vertex AI data stores
  - data_stores.project_id: The project ID of the Vertex AI data store
  - data_stores.location: The location of the Vertex AI data store (e.g. us)
  - data_stores.datastore_id: The ID of the Vertex AI data store
  - data_stores.tool_name: The name of the tool
  - data_stores.description: The description of the Vertex AI data store
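Putting these fields together, a config might look like the sketch below. This is a hypothetical example inferred from the field list above, not the contents of config.yml.template: every value is a placeholder, and the exact nesting (in particular under generate_content_config) should be cross-checked against the template.
# Write a minimal, hypothetical config.yml (all values are placeholders)
cat > config.yml <<'EOF'
server:
  name: vertexai-search
model:
  model_name: gemini-1.5-flash-002
  project_id: your-gcp-project
  location: us-central1
  impersonate_service_account: vertex-sa@your-gcp-project.iam.gserviceaccount.com
  generate_content_config:
    temperature: 0.0
data_stores:
  - project_id: your-gcp-project
    location: us
    datastore_id: your-datastore-id
    tool_name: corporate_docs_search
    description: Searches the corporate documentation data store
EOF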