by GongRzhe
A2A-MCP-Server is a Python-based Model Context Protocol (MCP) server that bridges MCP with the Agent-to-Agent (A2A) protocol. It enables MCP-compatible AI assistants like Claude to seamlessly interact with various Google A2A agents, extending their capabilities.
Via Smithery (for Claude Desktop):
npx -y @smithery/cli install @GongRzhe/A2A-MCP-Server --client claude
From PyPI:
pip install a2a-mcp-server
Local Installation:
git clone https://github.com/GongRzhe/A2A-MCP-Server.git && cd A2A-MCP-Server
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
Configure the server using environment variables such as MCP_TRANSPORT (stdio, streamable-http, sse), MCP_HOST, and MCP_PORT.
From Command Line:
uvx a2a-mcp-server
# Or with HTTP transport
MCP_TRANSPORT=streamable-http MCP_HOST=127.0.0.1 MCP_PORT=8080 uvx a2a-mcp-server
Modify claude_desktop_config.json (located in %APPDATA%\Claude, ~/Library/Application Support/Claude, or ~/.config/Claude) to include the A2A-MCP-Server configuration. A config_creator.py script is provided to assist with this.
Start the A2A-MCP-Server (often with streamable-http transport for web clients) and configure your MCP client to connect to the server's MCP URL (e.g., http://127.0.0.1:8000/mcp).
Supports the stdio, streamable-http, and sse transport types, and exposes register_agent, list_agents, send_message, get_task_result, and other tools for LLM integration.
Q: How do I troubleshoot agent registration issues?
A: Verify the agent URL is correct and accessible, and check for a proper /.well-known/agent.json file.
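To check an agent card by hand, a minimal sketch in Python (the /.well-known/agent.json path comes from the A2A protocol; the helper names here are ours, not part of this project):

```python
import json
from urllib.request import urlopen

def agent_card_url(base_url: str) -> str:
    """Build the well-known agent card URL for an A2A agent."""
    return base_url.rstrip("/") + "/.well-known/agent.json"

def fetch_agent_card(base_url: str) -> dict:
    """Fetch and parse the agent card; raises if unreachable or not valid JSON."""
    with urlopen(agent_card_url(base_url), timeout=5) as resp:
        return json.load(resp)

print(agent_card_url("http://localhost:41242"))
# → http://localhost:41242/.well-known/agent.json
```

If fetch_agent_card raises, the agent is either not running or not serving a valid agent card, which is the most common registration failure.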
Q: What if messages aren't being delivered?
A: Ensure the agent is registered (use list_agents) and that it is running and accessible.
Q: Why can't I retrieve a task result?
A: Make sure you are using the correct task_id. Be aware that some agents might discard old tasks.
Q: I'm having issues with a specific transport type.
A: For stdio, ensure input/output streams are not redirected. For streamable-http, check port availability and firewall. For sse, verify client support for Server-Sent Events.
Q: Claude Desktop isn't starting the server. What should I do?
A: Check paths in claude_desktop_config.json, ensure Python is in your PATH, verify PYTHONPATH for local installations, confirm MCP_TRANSPORT is set to stdio, and try running the command manually. The config_creator.py script can help.
An MCP server that bridges the Model Context Protocol (MCP) with the Agent-to-Agent (A2A) protocol, enabling MCP-compatible AI assistants (like Claude) to seamlessly interact with A2A agents.
This project serves as an integration layer between two cutting-edge AI agent protocols:
Model Context Protocol (MCP): Developed by Anthropic, MCP allows AI assistants to connect to external tools and data sources. It standardizes how AI applications and large language models connect to external resources in a secure, composable way.
Agent-to-Agent Protocol (A2A): Developed by Google, A2A enables communication and interoperability between different AI agents through a standardized JSON-RPC interface.
By bridging these protocols, this server allows MCP clients (like Claude) to discover, register, communicate with, and manage tasks on A2A agents through a unified interface.
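The standardized JSON-RPC interface mentioned above can be sketched as a plain request object. The field layout below is illustrative of the A2A message shape; treat the exact method name and params structure as assumptions to be checked against the A2A specification:

```python
import json
import uuid

# Sketch of a JSON-RPC 2.0 request of the kind A2A agents exchange.
# "tasks/send" and the params layout are illustrative, not authoritative.
request = {
    "jsonrpc": "2.0",
    "id": str(uuid.uuid4()),
    "method": "tasks/send",
    "params": {
        "id": str(uuid.uuid4()),  # task id assigned by the caller
        "message": {
            "role": "user",
            "parts": [{"type": "text", "text": "What can you do?"}],
        },
    },
}
print(json.dumps(request, indent=2))
```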

Cloud-deployed agents are also supported.


Agent Management
Communication
Task Management
Transport Support
To install A2A Bridge Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @GongRzhe/A2A-MCP-Server --client claude
pip install a2a-mcp-server
Clone the repository:
git clone https://github.com/GongRzhe/A2A-MCP-Server.git
cd A2A-MCP-Server
Set up a virtual environment:
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
Install dependencies:
pip install -r requirements.txt
Configure how the MCP server runs using these environment variables:
# Transport type: stdio, streamable-http, or sse
export MCP_TRANSPORT="streamable-http"
# Host for the MCP server
export MCP_HOST="0.0.0.0"
# Port for the MCP server (when using HTTP transports)
export MCP_PORT="8000"
# Path for the MCP server endpoint (when using HTTP transports)
export MCP_PATH="/mcp"
# Path for SSE endpoint (when using SSE transport)
export MCP_SSE_PATH="/sse"
# Enable debug logging
export MCP_DEBUG="true"
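Inside the server, these variables can be read with fallbacks; a minimal sketch (whether these defaults match the ones a2a_mcp_server.py actually uses is an assumption):

```python
import os

# Read the documented transport settings with assumed defaults.
transport = os.environ.get("MCP_TRANSPORT", "stdio")
host = os.environ.get("MCP_HOST", "0.0.0.0")
port = int(os.environ.get("MCP_PORT", "8000"))
path = os.environ.get("MCP_PATH", "/mcp")
sse_path = os.environ.get("MCP_SSE_PATH", "/sse")
debug = os.environ.get("MCP_DEBUG", "false").lower() == "true"

print(f"transport={transport} host={host} port={port} path={path} debug={debug}")
```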
The A2A MCP Server supports multiple transport types:
stdio (default): Uses standard input/output for communication
streamable-http (recommended for web clients): HTTP transport with streaming support
sse: Server-Sent Events transport
To specify the transport type:
# Using environment variable
export MCP_TRANSPORT="streamable-http"
uvx a2a-mcp-server
# Or directly in the command
MCP_TRANSPORT=streamable-http uvx a2a-mcp-server
# Using default settings (stdio transport)
uvx a2a-mcp-server
# Using HTTP transport on specific host and port
MCP_TRANSPORT=streamable-http MCP_HOST=127.0.0.1 MCP_PORT=8080 uvx a2a-mcp-server
Claude Desktop allows you to configure MCP servers in the claude_desktop_config.json file. This file is typically located at:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
Add the following to the mcpServers section of your claude_desktop_config.json:
"a2a": {
"command": "uvx",
"args": [
"a2a-mcp-server"
]
}
Note that for Claude Desktop, you must use "MCP_TRANSPORT": "stdio" since Claude requires stdio communication with MCP servers.
If you've cloned the repository and want to run the server from your local installation:
"a2a": {
"command": "C:\\path\\to\\python.exe",
"args": [
"C:\\path\\to\\A2A-MCP-Server\\a2a_mcp_server.py"
],
"env": {
"MCP_TRANSPORT": "stdio",
"PYTHONPATH": "C:\\path\\to\\A2A-MCP-Server"
}
}
Replace C:\\path\\to\\ with the actual paths on your system.
This repository includes a config_creator.py script to help you generate the configuration:
# If using local installation
python config_creator.py
The script will automatically detect paths on your system and generate the proper configuration for you.
Here's an example of a complete claude_desktop_config.json file with the A2A-MCP-Server configured:
{
"mcpServers": {
"a2a": {
"command": "uvx",
"args": [
"a2a-mcp-server"
]
}
}
}
Claude can use A2A agents through the MCP tools provided by this server. Here's how to set it up:
For Claude Web: Start the MCP server with the streamable-http transport:
MCP_TRANSPORT=streamable-http MCP_HOST=127.0.0.1 MCP_PORT=8000 uvx a2a-mcp-server
For Claude Web: In Claude web interface, enable the MCP URL connection in your Tools menu.
Use the MCP URL: http://127.0.0.1:8000/mcp
For Claude Desktop: Add the configuration to your claude_desktop_config.json file as described above. The easiest way is to use the provided config_creator.py script, which will automatically detect paths and create the proper configuration.
In Claude, you can now use the following functions:
Register an A2A agent:
I need to register a new agent. Can you help me with that?
(Agent URL: http://localhost:41242)
Send message to an agent:
Ask the agent at http://localhost:41242 what it can do.
Retrieve task results:
Can you get the results for task ID: 550e8400-e29b-41d4-a716-446655440000?
Cursor IDE can connect to MCP servers to add tools to its AI assistant:
Run your A2A MCP server with the streamable-http transport:
MCP_TRANSPORT=streamable-http MCP_HOST=127.0.0.1 MCP_PORT=8000 uvx a2a-mcp-server
In Cursor IDE, go to Settings > AI > MCP Servers
Add the server URL: http://127.0.0.1:8000/mcp
Now you can use the A2A tools from within Cursor's AI assistant.
Windsurf is an AI-powered editor with built-in MCP support:
Run your A2A MCP server with the streamable-http transport:
MCP_TRANSPORT=streamable-http MCP_HOST=127.0.0.1 MCP_PORT=8000 uvx a2a-mcp-server
In Windsurf, go to Settings > MCP Connections
Add the server URL: http://127.0.0.1:8000/mcp
You can now use A2A tools from within Windsurf's AI assistant.
The server exposes the following MCP tools for integration with LLMs like Claude:
register_agent: Register an A2A agent with the bridge server
{
"name": "register_agent",
"arguments": {
"url": "http://localhost:41242"
}
}
list_agents: Get a list of all registered agents
{
"name": "list_agents",
"arguments": {}
}
unregister_agent: Remove an A2A agent from the bridge server
{
"name": "unregister_agent",
"arguments": {
"url": "http://localhost:41242"
}
}
send_message: Send a message to an agent and get a task_id for the response
{
"name": "send_message",
"arguments": {
"agent_url": "http://localhost:41242",
"message": "What's the exchange rate from USD to EUR?",
"session_id": "optional-session-id"
}
}
send_message_stream: Send a message and stream the response
{
"name": "send_message_stream",
"arguments": {
"agent_url": "http://localhost:41242",
"message": "Tell me a story about AI agents.",
"session_id": "optional-session-id"
}
}
get_task_result: Retrieve a task's result using its ID
{
"name": "get_task_result",
"arguments": {
"task_id": "b30f3297-e7ab-4dd9-8ff1-877bd7cfb6b1",
"history_length": null
}
}
cancel_task: Cancel a running task
{
"name": "cancel_task",
"arguments": {
"task_id": "b30f3297-e7ab-4dd9-8ff1-877bd7cfb6b1"
}
}
1. Client registers an A2A agent
↓
2. Client sends a message to the agent (gets task_id)
↓
3. Client retrieves the task result using task_id
User: Register an agent at http://localhost:41242
Claude uses: register_agent(url="http://localhost:41242")
Claude: Successfully registered agent: ReimbursementAgent
User: Ask the agent what it can do
Claude uses: send_message(agent_url="http://localhost:41242", message="What can you do?")
Claude: I've sent your message. Here's the task_id: b30f3297-e7ab-4dd9-8ff1-877bd7cfb6b1
User: Get the answer to my question
Claude uses: get_task_result(task_id="b30f3297-e7ab-4dd9-8ff1-877bd7cfb6b1")
Claude: The agent replied: "I can help you process reimbursement requests. Just tell me what you need to be reimbursed for, including the date, amount, and purpose."
The A2A MCP server consists of several key components:
MCP Client → FastMCP Server → A2A Client → A2A Agent
(Responses flow back along the same path to the MCP client.)
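The send/retrieve bookkeeping described below can be sketched as follows. task_agent_mapping is named after the server's own dictionary; the in-memory result store and fake echo reply are our stand-ins for the real A2A round trip:

```python
import uuid

# Minimal stand-in for the server's task bookkeeping: each outgoing
# message gets a task_id, and task_agent_mapping records which agent
# owns that task so get_task_result can route the lookup later.
task_agent_mapping: dict[str, str] = {}
task_results: dict[str, str] = {}  # stand-in for agent-side results

def send_message(agent_url: str, message: str) -> str:
    """Assign a task_id, record the owning agent, return the id to the client."""
    task_id = str(uuid.uuid4())
    task_agent_mapping[task_id] = agent_url
    # A real server forwards the message over A2A; we fake an instant reply.
    task_results[task_id] = f"echo: {message}"
    return task_id

def get_task_result(task_id: str) -> str:
    """Look up the task by id; a KeyError means an unknown or expired task."""
    _agent_url = task_agent_mapping[task_id]
    return task_results[task_id]

tid = send_message("http://localhost:41242", "What can you do?")
print(get_task_result(tid))
# → echo: What can you do?
```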
When sending a message to an A2A agent, the server:
When sending a message to an A2A agent, the server generates a task_id, records it in its task_agent_mapping dictionary, and returns the task_id to the MCP client.
The server provides detailed error messages for common issues:
If an agent can't be registered: verify the agent URL and check for a valid /.well-known/agent.json file.
If messages aren't being delivered: confirm the agent is registered (use list_agents) and that it is running and reachable.
If you can't retrieve a task result: double-check the task_id; some agents discard old tasks.
If you have issues with a specific transport type: see the transport troubleshooting notes above for stdio, streamable-http, and sse.
If Claude Desktop isn't starting your A2A-MCP-Server:
Verify that the paths in claude_desktop_config.json are correct
For local installations, use "command": "python" and set PYTHONPATH in the env section
Make sure MCP_TRANSPORT is set to "stdio" in the env section
Try the config_creator.py script for automatic path detection and configuration
To add new capabilities to the server, add methods decorated with @mcp.tool() in the a2a_mcp_server.py file.
The server uses a custom A2AServerTaskManager class that extends InMemoryTaskManager. You can customize its behavior by modifying this class.
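The @mcp.tool() registration pattern looks roughly like the sketch below. To keep it self-contained we use a tiny stand-in registry rather than FastMCP itself; the real decorator additionally derives the tool's schema from the function signature, and ping_agent is a hypothetical example tool, not part of this project:

```python
# Stand-in illustrating the @mcp.tool() registration pattern used by
# FastMCP-style servers. ToolRegistry is our simplification, not
# FastMCP's actual implementation.
class ToolRegistry:
    def __init__(self):
        self.tools = {}

    def tool(self):
        def register(fn):
            # Register the function under its own name, as the real
            # decorator does, and return it unchanged.
            self.tools[fn.__name__] = fn
            return fn
        return register

mcp = ToolRegistry()

@mcp.tool()
def ping_agent(url: str) -> str:
    """Hypothetical new capability: report which agent URL would be pinged."""
    return f"would ping {url}"

print(sorted(mcp.tools))
# → ['ping_agent']
```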
a2a-mcp-server/
├── a2a_mcp_server.py # Main server implementation
├── common/ # A2A protocol code (from google/A2A)
│ ├── client/ # A2A client implementation
│ ├── server/ # A2A server implementation
│ ├── types.py # Common type definitions
│ └── utils/ # Utility functions
├── config_creator.py # Script to help create Claude Desktop configuration
├── .gitignore # Git ignore file
├── pyproject.toml # Project metadata and dependencies
├── README.md # This file
└── requirements.txt # Project dependencies
This project is licensed under the Apache License, Version 2.0 - see the LICENSE file for details.
The code in the common/ directory is from the Google A2A project and is also licensed under the Apache License, Version 2.0.