by syucream
An MCP (Model Context Protocol) server that provides access to Lightdash, enabling AI assistants to interact with Lightdash data.
lightdash-mcp-server is a Model Context Protocol (MCP) server designed to provide AI assistants with standardized access to Lightdash's API. This allows AI models to interact with your Lightdash data, enabling them to retrieve information, list projects, charts, dashboards, and more, all through a unified interface.
lightdash-mcp-server can be installed and configured in two primary ways: via Smithery for automatic installation with Claude Desktop, or manually via npm. It supports both Stdio and HTTP transport modes.
npx -y @smithery/cli install lightdash-mcp-server --client claude
npm install lightdash-mcp-server
Set the following environment variables:

- LIGHTDASH_API_KEY: Your Lightdash Personal Access Token (PAT)
- LIGHTDASH_API_URL: Your Lightdash API base URL

Stdio Transport (Default):

npx lightdash-mcp-server

Configure your MCP client's command and args fields to execute the server.

HTTP Transport (Streamable HTTP):

npx lightdash-mcp-server -port 8080

Configure your MCP client's url field, pointing to http://localhost:8080/mcp. For programmatic access, use the client transport from @modelcontextprotocol/sdk/client/streamableHttp.js.
lightdash-mcp-server provides a set of tools that allow AI assistants to interact with Lightdash data:

- list_projects: List all projects in the Lightdash organization.
- get_project: Get details of a specific project.
- list_spaces: List all spaces in a project.
- list_charts: List all charts in a project.
- list_dashboards: List all dashboards in a project.
- get_custom_metrics: Get custom metrics for a project.
- get_catalog: Get the catalog for a project.
- get_metrics_catalog: Get the metrics catalog for a project.
- get_charts_as_code: Get charts as code for a project.
- get_dashboards_as_code: Get dashboards as code for a project.

Q: What is MCP (Model Context Protocol)? A: MCP is a protocol that allows AI models to interact with external tools and data sources in a standardized way.
Q: Can I use lightdash-mcp-server with other AI clients besides Claude Desktop? A: Yes, any MCP-compatible client can be configured to work with lightdash-mcp-server.
Q: How do I get my Lightdash API Key? A: You can generate a Personal Access Token (PAT) from your Lightdash account settings.
Q: What is the difference between Stdio and HTTP transport modes? A: Stdio (Standard I/O) mode communicates over the command line, while HTTP mode exposes an API endpoint for communication. HTTP mode is suitable for programmatic access and streaming.
Q: Where can I find examples for programmatic access?
A: Refer to examples/list_spaces_http.ts
in the project repository for a complete example of connecting to the HTTP server programmatically.
An MCP (Model Context Protocol) server that provides access to Lightdash.
This server provides MCP-compatible access to Lightdash's API, allowing AI assistants to interact with your Lightdash data through a standardized interface.
Available tools:
- list_projects - List all projects in the Lightdash organization
- get_project - Get details of a specific project
- list_spaces - List all spaces in a project
- list_charts - List all charts in a project
- list_dashboards - List all dashboards in a project
- get_custom_metrics - Get custom metrics for a project
- get_catalog - Get the catalog for a project
- get_metrics_catalog - Get the metrics catalog for a project
- get_charts_as_code - Get charts as code for a project
- get_dashboards_as_code - Get dashboards as code for a project

To install Lightdash MCP Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install lightdash-mcp-server --client claude
npm install lightdash-mcp-server
- LIGHTDASH_API_KEY: Your Lightdash PAT
- LIGHTDASH_API_URL: The API base URL

The lightdash-mcp-server supports two transport modes: Stdio (default) and HTTP.
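Both variables must be present in the server's environment before it starts. A minimal standalone pre-flight check might look like this (a sketch for illustration only; this helper is not part of lightdash-mcp-server):

```typescript
// Sketch: verify the required Lightdash environment variables are set
// before launching the server (hypothetical helper, not shipped with the package).
const required = ['LIGHTDASH_API_KEY', 'LIGHTDASH_API_URL'];
const missing = required.filter((name) => !process.env[name]);

if (missing.length > 0) {
  console.error(`Missing environment variables: ${missing.join(', ')}`);
} else {
  console.log('Lightdash environment looks configured');
}
```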
npx lightdash-mcp-server
...
"lightdash": {
"command": "npx",
"args": [
"-y",
"lightdash-mcp-server"
],
"env": {
"LIGHTDASH_API_KEY": "<your PAT>",
"LIGHTDASH_API_URL": "https://<your base url>"
}
},
...
npx lightdash-mcp-server -port 8080
This starts the server using StreamableHTTPServerTransport, making it accessible via HTTP at http://localhost:8080/mcp.
For Claude Desktop and other MCP clients:
Edit your MCP configuration JSON to use the url field instead of command and args:
...
"lightdash": {
"url": "http://localhost:8080/mcp"
},
...
For programmatic access:
Use the streamable HTTP client transport:
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
const client = new Client({
name: 'my-client',
version: '1.0.0'
}, {
capabilities: {}
});
const transport = new StreamableHTTPClientTransport(
new URL('http://localhost:8080/mcp')
);
await client.connect(transport);
Note: When using HTTP mode, ensure the environment variables LIGHTDASH_API_KEY
and LIGHTDASH_API_URL
are set in the environment where the server is running, as they cannot be passed through MCP client configuration.
See examples/list_spaces_http.ts
for a complete example of connecting to the HTTP server programmatically.
- npm run dev - Start the server in development mode with hot reloading (stdio transport)
- npm run dev:http - Start the server in development mode with HTTP transport on port 8080
- npm run build - Build the project for production
- npm run start - Start the production server
- npm run lint - Run linting checks (ESLint and Prettier)
- npm run fix - Automatically fix linting issues
- npm run examples - Run the example scripts