by yashshingvi
Databricks Genie MCP Server connects Large Language Models (LLMs) with the Databricks Genie API, enabling LLMs to interact with Databricks conversational agents, run SQL queries, and ask natural language questions directly within the Databricks environment.
To use Databricks Genie MCP Server, you need Python 3.7+, a Databricks workspace with the Genie API enabled, a personal access token, and permissions to access Genie spaces and run queries. After cloning the repository, set up a virtual environment, install dependencies, and configure a .env file with your Databricks host and token. You will also need to manually add Genie space IDs and their titles in the get_genie_space_id() function in main.py, as the API does not currently provide a public endpoint for this. The server can be tested with the MCP Inspector (npx @modelcontextprotocol/inspector python main.py) or by building and running the Docker image, and it can be integrated with Claude Desktop by installing the MCP server and adding it as a resource in Claude.

Four MCP tools are exposed: get_genie_space_id(), get_space_info(space_id: str), ask_genie(space_id: str, question: str), and follow_up(space_id: str, conversation_id: str, question: str).

For configuration, the host is your Databricks instance URL without the https:// prefix. To generate a token, open your Databricks workspace, click your username -> User Settings -> Developer tab -> Manage under "Access tokens". If something goes wrong, check that the host does not include https://, verify your personal access token is valid, check for timeouts, and ensure your query is valid for the selected space. Keep your .env file secure, use minimal-scope tokens with expiration, and avoid exposing the server in public-facing environments unless it is authenticated.
Clone this repository
Create and activate a virtual environment (recommended):
python -m venv .venv
source .venv/bin/activate
Install dependencies:
pip install -r requirements.txt
Create a .env file in the root directory with the following variables:
DATABRICKS_HOST=your-databricks-instance.cloud.databricks.com # Do not include https://
DATABRICKS_TOKEN=your-personal-access-token
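If you prefer not to depend on a library such as python-dotenv, a minimal loader like the following can read this file. The load_env helper below is a hypothetical sketch for illustration, not code from main.py:

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: KEY=value lines, inline # comments allowed.

    Hypothetical helper; main.py may load the file differently.
    """
    with open(path) as f:
        for line in f:
            line = line.split("#", 1)[0].strip()  # drop comments and blank lines
            if "=" in line:
                key, _, value = line.partition("=")
                # setdefault so real environment variables win over the file
                os.environ.setdefault(key.strip(), value.strip())
```

After calling load_env(), DATABRICKS_HOST and DATABRICKS_TOKEN are available via os.environ like any other environment variable.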
📌 Manually Adding Genie Space IDs
Note:
At this time, the Databricks Genie API does not appear to provide a public endpoint to list all available space IDs and titles.
As a workaround, you need to manually add the Genie space IDs and their titles in the get_genie_space_id() function in main.py.
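The mapping might look something like this. The space IDs and titles below are placeholders, and the exact shape of get_genie_space_id() in main.py may differ:

```python
# Hypothetical contents of the manual mapping in main.py. Replace the keys with
# your real Genie space IDs and the values with the titles you want displayed.
GENIE_SPACES = {
    "space-id-1": "Sales Analytics",
    "space-id-2": "Support Tickets",
}

def get_genie_space_id() -> dict:
    """Return the manually maintained space-id -> title mapping."""
    return GENIE_SPACES
```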
You can test the MCP server using the inspector (optional but recommended):
npx @modelcontextprotocol/inspector python main.py
OR
Alternatively, you can build and run the Docker image to test the server.
Download Claude Desktop
Install Your MCP Server: From your project directory, run:
mcp install main.py
Once the server is installed, connect in Claude:
Open Claude Desktop
Click Resources → Add Resource
Select your Genie MCP Server
Start chatting with your data using natural language! 🎯
Host: Your Databricks instance URL (e.g., your-instance.cloud.databricks.com). Do not include https://
Token
Go to your Databricks workspace
Click your username (top right) → User Settings
Under the Developer tab, click Manage under "Access tokens"
Generate a new token and copy it
python main.py
This will start the Genie MCP server over the stdio transport for LLM interaction.
The following MCP tools are available:

get_genie_space_id() - Returns the manually configured Genie space IDs and their titles
get_space_info(space_id: str) - Retrieves metadata for the given Genie space
ask_genie(space_id: str, question: str) - Starts a conversation by asking a natural language question in a space
follow_up(space_id: str, conversation_id: str, question: str) - Asks a follow-up question in an existing conversation
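As a rough sketch of what ask_genie likely wraps, the publicly documented Genie Conversation API exposes a start-conversation endpoint. The code below is an assumption-laden illustration, not the actual implementation in main.py; verify the endpoint path and response handling against the source:

```python
import json
import os
import urllib.request

def genie_url(host: str, space_id: str) -> str:
    # Path follows the publicly documented Genie start-conversation endpoint;
    # confirm it matches what main.py actually calls.
    return f"https://{host}/api/2.0/genie/spaces/{space_id}/start-conversation"

def ask_genie(space_id: str, question: str) -> dict:
    """Start a Genie conversation with a natural language question (sketch)."""
    req = urllib.request.Request(
        genie_url(os.environ["DATABRICKS_HOST"], space_id),
        data=json.dumps({"content": question}).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)
```

Note that the host is prefixed with https:// only at request time, which is why the .env value itself must not include the scheme.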
Common Issues
Invalid host: Ensure the host does not include https://
Token error: Make sure your personal access token is valid and has access to Genie
Timeout: Check if the Genie space is accessible and not idle/expired
No data returned: Ensure your query is valid for the selected space
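The first two checks can be automated before the server starts. The validate_config helper below is hypothetical and not part of the project; it simply encodes the common issues listed above:

```python
def validate_config(host: str, token: str) -> list:
    """Return human-readable problems matching the common issues above (sketch)."""
    problems = []
    if host.startswith(("http://", "https://")):
        problems.append("Invalid host: remove the https:// prefix")
    if not host:
        problems.append("Invalid host: DATABRICKS_HOST is empty")
    if not token:
        problems.append("Token error: DATABRICKS_TOKEN is empty")
    return problems
```

Running it at startup and printing any problems gives a faster failure than waiting for the first Genie request to time out.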
Keep your .env file secure and never commit it to version control
Use minimal scope tokens with expiration whenever possible
Avoid exposing this server in public-facing environments unless authenticated