by JordiNeil
mcp-databricks-server is a Model Context Protocol (MCP) server that connects Large Language Models (LLMs) with the Databricks API. It enables LLMs to interact with Databricks for tasks such as running SQL queries and managing jobs.
To use mcp-databricks-server, you need Python 3.7+ and a Databricks workspace with a personal access token, a SQL warehouse endpoint, and appropriate permissions. After cloning the repository, set up a virtual environment, install dependencies from requirements.txt, and configure a .env file with your Databricks host, token, and HTTP path. You can then run the server using python main.py and test it with the MCP inspector.
This project enables natural language interaction with your Databricks environment through LLMs.
Q: What are the prerequisites for running this server?
A: Python 3.7+, a Databricks workspace with a personal access token, a SQL warehouse endpoint, and permissions to run queries and access jobs.
Q: How do I obtain Databricks credentials?
A: Your Databricks instance URL is the host. A personal access token can be generated in your Databricks User Settings under the "Developer" tab. The HTTP Path for your SQL warehouse can be found in its connection details.
Q: What if I encounter connection issues?
A: Ensure your Databricks host is correct (without the https:// prefix), your SQL warehouse is running, and your personal access token has the necessary permissions. You can also run the test_connection.py script to diagnose the connection.
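The "host without https://" check above can be sketched as a small Python helper that strips the scheme and any trailing slash before the host value is used. The function name is illustrative, not part of mcp-databricks-server itself:

```python
from urllib.parse import urlparse

def normalize_host(raw: str) -> str:
    """Return a bare Databricks host: no scheme, no trailing slash.

    Illustrative helper, not part of this project's actual code.
    """
    raw = raw.strip()
    # urlparse only yields a netloc when a scheme is present,
    # so handle both "https://host/" and plain "host" inputs.
    if "://" in raw:
        raw = urlparse(raw).netloc
    return raw.rstrip("/")

print(normalize_host("https://your-instance.cloud.databricks.com/"))
# prints: your-instance.cloud.databricks.com
```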
Q: What are the security considerations?
A: Your Databricks personal access token provides direct access. Secure your .env file and never commit it to version control. Consider using tokens with appropriate permission scopes and running the server in a secure environment.
A Model Context Protocol (MCP) server that connects to the Databricks API, allowing LLMs to run SQL queries, list jobs, and get job status.
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
pip install -r requirements.txt
Create a .env file in the root directory with the following variables:
DATABRICKS_HOST=your-databricks-instance.cloud.databricks.com
DATABRICKS_TOKEN=your-personal-access-token
DATABRICKS_HTTP_PATH=/sql/1.0/warehouses/your-warehouse-id
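As a sketch of how a KEY=VALUE .env file like the one above can be read with only the standard library (the project itself may well use a package such as python-dotenv; this parser is a stand-in assumption):

```python
import os
import tempfile

def load_env(path: str) -> dict:
    """Parse KEY=VALUE lines from a .env file, skipping blanks and comments."""
    env = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

# Demo with a temporary .env mirroring the variables above.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as fh:
    fh.write(
        "DATABRICKS_HOST=your-databricks-instance.cloud.databricks.com\n"
        "DATABRICKS_TOKEN=your-personal-access-token\n"
        "DATABRICKS_HTTP_PATH=/sql/1.0/warehouses/your-warehouse-id\n"
    )
    path = fh.name

cfg = load_env(path)
os.unlink(path)
print(cfg["DATABRICKS_HOST"])
# prints: your-databricks-instance.cloud.databricks.com
```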
python test_connection.py
Start the MCP server:
python main.py
You can test the MCP server using the MCP Inspector by running:
npx @modelcontextprotocol/inspector python3 main.py
MCP tools are provided for running SQL queries, listing jobs, and retrieving job status.
When used with LLMs that support the MCP protocol, this server enables natural language interaction with your Databricks environment:
Discover more MCP servers with similar functionality and use cases
by danny-avila
Provides a customizable ChatGPT‑like web UI that integrates dozens of AI models, agents, code execution, image generation, web search, speech capabilities, and secure multi‑user authentication, all open‑source and ready for self‑hosting.
by ahujasid
BlenderMCP integrates Blender with Claude AI via the Model Context Protocol (MCP), enabling AI-driven 3D scene creation, modeling, and manipulation. This project allows users to control Blender directly through natural language prompts, streamlining the 3D design workflow.
by pydantic
Enables building production‑grade generative AI applications using Pydantic validation, offering a FastAPI‑like developer experience.
by GLips
Figma-Context-MCP is a Model Context Protocol (MCP) server that provides Figma layout information to AI coding agents. It bridges design and development by enabling AI tools to directly access and interpret Figma design data for more accurate and efficient code generation.
by mcp-use
Easily create and interact with MCP servers using custom agents, supporting any LLM with tool calling and offering multi‑server, sandboxed, and streaming capabilities.
by sonnylazuardi
This project implements a Model Context Protocol (MCP) integration between Cursor AI and Figma, allowing Cursor to communicate with Figma for reading designs and modifying them programmatically.
by lharries
WhatsApp MCP Server is a Model Context Protocol (MCP) server for WhatsApp that allows users to search, read, and send WhatsApp messages (including media) through AI models like Claude. It connects directly to your personal WhatsApp account via the WhatsApp web multi-device API and stores messages locally in a SQLite database.
by idosal
GitMCP is a free, open-source remote Model Context Protocol (MCP) server that transforms any GitHub project into a documentation hub, enabling AI tools to access up-to-date documentation and code directly from the source to eliminate "code hallucinations."
by Klavis-AI
Klavis AI provides open-source Model Context Protocol (MCP) integrations and a hosted API for AI applications. It simplifies connecting AI to various third-party services by managing secure MCP servers and authentication.