by asusevski
A Model Context Protocol (MCP) server implementation for accessing OpenDota API data, enabling LLMs and AI assistants to retrieve real-time Dota 2 statistics.
OpenDota MCP Server is a Model Context Protocol (MCP) server designed to provide large language models (LLMs) and AI assistants with access to real-time Dota 2 game data via the OpenDota API. It acts as an intermediary, translating requests from AI models into OpenDota API calls and returning structured game information.
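Internally, each OpenDota capability is exposed to the model as an MCP tool. As a rough illustration only (the tool name get_player_by_id, the FastMCP helper, and the httpx wiring below are assumptions, not necessarily how this repository implements it), a single tool wrapping OpenDota's players endpoint could look like this:

# Illustrative sketch of one MCP tool wrapping an OpenDota endpoint.
# Names and wiring are assumptions, not this repository's actual code.
import os
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("opendota")
OPENDOTA_BASE = "https://api.opendota.com/api"

@mcp.tool()
async def get_player_by_id(account_id: int) -> dict:
    """Fetch a player's OpenDota profile by numeric account ID."""
    params = {}
    api_key = os.environ.get("OPENDOTA_API_KEY")
    if api_key:
        params["api_key"] = api_key  # OpenDota accepts the key as a query parameter
    async with httpx.AsyncClient() as client:
        resp = await client.get(f"{OPENDOTA_BASE}/players/{account_id}", params=params)
        resp.raise_for_status()
        return resp.json()

if __name__ == "__main__":
    mcp.run(transport="stdio")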
Installation:
git clone https://github.com/asusevski/opendota-mcp-server.git
cd opendota-mcp-server
./scripts/setup_env.sh (automated setup), or install manually with uv add pyproject.toml and uv pip install -e ".[dev]" for development dependencies.

API Key Setup:
export OPENDOTA_API_KEY=your_api_key_here

Running the Server:
python -m src.opendota_server.server
Configure claude_desktop_config.json with the appropriate command for your environment (e.g., WSL).

Using the Client:
python -m src.client

OpenDota MCP Server offers a wide range of functionalities for accessing Dota 2 data, including player statistics, match details, hero information, team data, and more.
Q: Do I need an OpenDota API key to use this server? A: An API key is optional, but obtaining one is highly recommended for optimal usage and to avoid running into OpenDota's rate limits.
Q: Can I use this with other LLMs besides Claude Desktop? A: Yes, as an MCP server, it is designed to work with any LLM or AI assistant that supports the Model Context Protocol.
Q: What kind of Dota 2 data can I access? A: You can access a comprehensive range of data including player stats, match details, hero information, team data, and more, as outlined in the "Specific tools included" section of the README.
A Model Context Protocol (MCP) server implementation for accessing OpenDota API data. This server enables LLMs and AI assistants to retrieve real-time Dota 2 statistics, match data, player information, and more through a standard interface.
# Clone the repository
git clone https://github.com/asusevski/opendota-mcp-server.git
cd opendota-mcp-server
# Option 1: Automated setup (works with bash, zsh, and other shells)
./scripts/setup_env.sh
# Option 2: Manual installation with uv
uv add pyproject.toml
# For development dependencies
uv pip install -e ".[dev]"
export OPENDOTA_API_KEY=your_api_key_here
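To sanity-check the environment variable before starting the server, a small standalone script (not part of this repository; the /heroes endpoint and the api_key query parameter are OpenDota conventions) can confirm the API is reachable and that the key, if set, is picked up; an invalid key would typically surface as an HTTP error here:

# Standalone smoke test against the public OpenDota API (illustrative only).
import os
import httpx

params = {}
api_key = os.environ.get("OPENDOTA_API_KEY")
if api_key:
    params["api_key"] = api_key  # same query parameter the server would use

resp = httpx.get("https://api.opendota.com/api/heroes", params=params, timeout=10)
resp.raise_for_status()
print(f"OpenDota reachable; {len(resp.json())} heroes returned")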
python -m src.opendota_server.server
Follow this: https://modelcontextprotocol.io/quickstart/user
If you use WSL, and assuming you have already cloned the repo and set up the Python environment, this is how I wrote the claude_desktop_config.json:
{
  "mcpServers": {
    "opendota": {
      "command": "wsl.exe",
      "args": [
        "--",
        "bash",
        "-c",
        "cd ~/opendota-mcp-server && source .venv/bin/activate && python src/opendota_server/server.py"
      ]
    }
  }
}
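This WSL entry has Claude Desktop launch wsl.exe with a single bash -c command that changes into the cloned repo, activates the project's .venv, and starts the server over stdio. On native Linux or macOS the same idea applies: point "command" at your Python interpreter (for example the one inside .venv) and pass the path to src/opendota_server/server.py in "args".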
python -m src.client
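The bundled client can be started as above; alternatively, any MCP-capable client can talk to the server over stdio. As a rough sketch using the MCP Python SDK (this is a generic session, not the repository's own client, and the listed tool names depend on what the server actually registers):

# Generic MCP stdio client session (illustrative, not the repo's bundled client).
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="python",
    args=["-m", "src.opendota_server.server"],
)

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

asyncio.run(main())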
MIT