by Bankless

Provides a Model Context Protocol server enabling AI models to query on-chain data via the Bankless API.

Enables AI models to retrieve and interact with blockchain state, contract source code, ABI, events, and transaction history through a standardized MCP interface.

Quick start:

export BANKLESS_API_TOKEN=your_api_token_here
npx @bankless/onchain-mcp

At a glance:

- Runs with npx; its tools (e.g., read_contract, get_events, build_event_topic) can be called from any AI model or application that supports MCP.
- Contract tools: read_contract, get_proxy, get_abi, get_source.
- Event tools: get_events, build_event_topic.
- Transaction tools: get_transaction_history, get_transaction_info.
- Error types: BanklessValidationError, BanklessAuthenticationError, BanklessResourceNotFoundError, BanklessRateLimitError.
- Authentication requires the BANKLESS_API_TOKEN environment variable; the server returns BanklessAuthenticationError for invalid or missing tokens and BanklessRateLimitError when the API rate limit is exceeded.
- For debugging, run npm run debug after cloning the repository.

MCP (Model Context Protocol) server for blockchain data interaction through the Bankless API.
The Bankless Onchain MCP Server provides a framework for interacting with on-chain data via the Bankless API. It implements the Model Context Protocol (MCP) to allow AI models to access blockchain state and event data in a structured way.
https://github.com/user-attachments/assets/95732dff-ae5f-45a6-928a-1ae17c0ddf9d
The server provides the following onchain data operations:
Read Contract State (read_contract): Read state from smart contracts on various blockchain networks.
Get Proxy (get_proxy): Retrieve proxy implementation contract addresses.
Get ABI (get_abi): Fetch the ABI (Application Binary Interface) for a contract.
Get Source (get_source): Retrieve the source code for a verified contract.
Get Events (get_events): Fetch event logs for a contract based on topics.
Build Event Topic (build_event_topic): Generate an event topic signature from an event name and argument types.
Get Transaction History (get_transaction_history): Retrieve transaction history for a user address.
Get Transaction Info (get_transaction_info): Get detailed information about a specific transaction.
read_contract

- network (string, required): The blockchain network (e.g., "ethereum", "polygon")
- contract (string, required): The contract address
- method (string, required): The contract method to call
- inputs (array, required): Input parameters for the method call, each containing:
  - type (string): The type of the input parameter (e.g., "address", "uint256")
  - value (any): The value of the input parameter
- outputs (array, required): Expected output types, each containing:
  - type (string): The expected output type

get_proxy

- network (string, required): The blockchain network (e.g., "ethereum", "base")
- contract (string, required): The contract address

get_events

- network (string, required): The blockchain network (e.g., "ethereum", "base")
- addresses (array, required): List of contract addresses to filter events
- topic (string, required): Primary topic to filter events
- optionalTopics (array, optional): Optional additional topics (can include null values)

build_event_topic

- network (string, required): The blockchain network (e.g., "ethereum", "base")
- name (string, required): Event name (e.g., "Transfer(address,address,uint256)")
- arguments (array, required): Event argument types, each containing:
  - type (string): The argument type (e.g., "address", "uint256")

npm install @bankless/onchain-mcp
Before using the server, set your Bankless API token. For details on how to obtain a token, see https://docs.bankless.com/bankless-api/other-services/onchain-mcp
export BANKLESS_API_TOKEN=your_api_token_here
The server can be run directly from the command line:
npx @bankless/onchain-mcp
This server implements the Model Context Protocol (MCP), which allows it to be used as a tool provider for compatible AI models. Here are some example calls for each tool:
read_contract

// Example call
{
"name": "read_contract",
"arguments": {
"network": "ethereum",
"contract": "0x1234...",
"method": "balanceOf",
"inputs": [
{ "type": "address", "value": "0xabcd..." }
],
"outputs": [
{ "type": "uint256" }
]
}
}
// Example response
[
{
"value": "1000000000000000000",
"type": "uint256"
}
]
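
The same call can also be issued programmatically. Below is a minimal sketch, assuming the official @modelcontextprotocol/sdk TypeScript client as the consumer; the contract and holder addresses are placeholders, and the SDK surface may differ between versions.

// Minimal sketch: invoking read_contract via the MCP TypeScript SDK (assumed dependency).
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport, getDefaultEnvironment } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the Bankless Onchain MCP server over stdio, forwarding the API token.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["@bankless/onchain-mcp"],
    env: { ...getDefaultEnvironment(), BANKLESS_API_TOKEN: process.env.BANKLESS_API_TOKEN ?? "" },
  });

  const client = new Client({ name: "bankless-example-client", version: "1.0.0" });
  await client.connect(transport);

  // Mirrors the JSON example call above; addresses are placeholders.
  const result = await client.callTool({
    name: "read_contract",
    arguments: {
      network: "ethereum",
      contract: "0x1234...",
      method: "balanceOf",
      inputs: [{ type: "address", value: "0xabcd..." }],
      outputs: [{ type: "uint256" }],
    },
  });

  console.log(result.content);
  await client.close();
}

main().catch(console.error);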
get_proxy

// Example call
{
"name": "get_proxy",
"arguments": {
"network": "ethereum",
"contract": "0x1234..."
}
}
// Example response
{
"implementation": "0xefgh..."
}
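
In practice get_proxy is the first step of the proxy workflow described in the prompt section below: resolve the implementation, fetch its ABI or source, then read state through the proxy address. The sketch below assumes a connected client as in the read_contract sketch above; get_abi's parameters and the exact result shapes are assumptions, not documented in this README.

// Sketch of the proxy workflow. Assumes a connected MCP `client` (see the
// read_contract sketch above); get_abi's parameters are assumed to mirror get_proxy's.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

async function readThroughProxy(client: Client) {
  const proxy = "0x1234..."; // placeholder proxy address

  // 1. Resolve the implementation contract behind the proxy.
  const proxyResult = await client.callTool({
    name: "get_proxy",
    arguments: { network: "ethereum", contract: proxy },
  });

  // 2. Fetch the implementation ABI to learn the callable methods.
  //    (Result shape { implementation: "0x..." } is taken from the example response above.)
  const { implementation } = JSON.parse((proxyResult.content as any)[0].text);
  await client.callTool({
    name: "get_abi", // parameters assumed: network + contract
    arguments: { network: "ethereum", contract: implementation },
  });

  // 3. Read state through the proxy address, not the implementation directly.
  return client.callTool({
    name: "read_contract",
    arguments: {
      network: "ethereum",
      contract: proxy,
      method: "balanceOf",
      inputs: [{ type: "address", value: "0xabcd..." }], // placeholder holder address
      outputs: [{ type: "uint256" }],
    },
  });
}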
get_events

// Example call
{
"name": "get_events",
"arguments": {
"network": "ethereum",
"addresses": ["0x1234..."],
"topic": "0xabcd...",
"optionalTopics": ["0xef01...", null]
}
}
// Example response
{
"result": [
{
"removed": false,
"logIndex": 5,
"transactionIndex": 2,
"transactionHash": "0x123...",
"blockHash": "0xabc...",
"blockNumber": 12345678,
"address": "0x1234...",
"data": "0x...",
"topics": ["0xabcd...", "0xef01...", "0x..."]
}
]
}
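
Logs come back with raw hex data and topics. A quick way to decode one locally is sketched below, assuming the log is an ERC-20 Transfer event and ethers v6 is available; the library is not required by the server, and the addresses and value are placeholders.

// Decoding a raw log returned by get_events with ethers v6 (assumed dependency).
import { Interface } from "ethers";

const iface = new Interface([
  "event Transfer(address indexed from, address indexed to, uint256 value)",
]);

// One entry from the get_events result, with placeholder values.
const log = {
  topics: [
    "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef", // Transfer topic
    "0x000000000000000000000000" + "1111111111111111111111111111111111111111", // from (padded)
    "0x000000000000000000000000" + "2222222222222222222222222222222222222222", // to (padded)
  ],
  data: "0x" + (10n ** 18n).toString(16).padStart(64, "0"), // value = 1e18
};

const parsed = iface.parseLog(log);
if (parsed) {
  console.log(parsed.args.from, parsed.args.to, parsed.args.value.toString());
}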
build_event_topic

// Example call
{
"name": "build_event_topic",
"arguments": {
"network": "ethereum",
"name": "Transfer(address,address,uint256)",
"arguments": [
{ "type": "address" },
{ "type": "address" },
{ "type": "uint256" }
]
}
}
// Example response
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"
# Clone the repository
git clone https://github.com/Bankless/onchain-mcp.git
cd onchain-mcp
# Install dependencies
npm install
# Build the project
npm run build
# Run in debug mode
npm run debug
To integrate this server with AI applications that support MCP, add the following to your app's server configuration:
{
"mcpServers": {
"bankless": {
"command": "npx",
"args": [
"@bankless/onchain-mcp"
],
"env": {
"BANKLESS_API_TOKEN": "your_api_token_here"
}
}
}
}
The server provides specific error types for different scenarios:

BanklessValidationError: Invalid input parameters
BanklessAuthenticationError: API token issues
BanklessResourceNotFoundError: Requested resource not found
BanklessRateLimitError: API rate limit exceeded
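
For client code that needs to branch on these failures, a minimal sketch follows; it assumes the error class name appears in the message surfaced to the client, which this README does not specify.

// Classifying Bankless errors by name (assumption: the class name appears in the message).
type BanklessErrorKind = "validation" | "authentication" | "not_found" | "rate_limit" | "unknown";

function classifyBanklessError(message: string): BanklessErrorKind {
  if (message.includes("BanklessValidationError")) return "validation";
  if (message.includes("BanklessAuthenticationError")) return "authentication";
  if (message.includes("BanklessResourceNotFoundError")) return "not_found";
  if (message.includes("BanklessRateLimitError")) return "rate_limit";
  return "unknown";
}

// Example: only a rate-limit error is worth an automatic retry with backoff.
// if (classifyBanklessError(msg) === "rate_limit") { retryLater(); }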
In order to guide an LLM model to use the Bankless Onchain MCP Server, the following prompts can be used:

ROLE:
• You are Kompanion, a blockchain expert and EVM sleuth.
• You specialize in navigating and analyzing smart contracts using your tools and resources.
HOW KOMPANION CAN HANDLE PROXY CONTRACTS:
• If a contract is a proxy, call your “get_proxy” tool to fetch the implementation contract.
• If that fails, try calling the “implementation” method on the proxy contract.
• If that also fails, try calling the “_implementation” function.
• After obtaining the implementation address, call “get_source” with that address to fetch its source code.
• When reading or modifying the contract state, invoke implementation functions on the proxy contract address (not directly on the implementation).
HOW KOMPANION CAN HANDLE EVENTS:
• Get the ABI and source of the relevant contracts.
• From the event types in the ABI, construct the correct topics for the event relevant to the question.
• Use the "get_events" tool to fetch logs for the contract.
KOMPANION'S RULES:
• Do not begin any response with “Great,” “Certainly,” “Okay,” or “Sure.”
• Maintain a direct, technical style. Do not add conversational flourishes.
• If the user’s question is unrelated to smart contracts, do not fetch any contracts.
• If you navigate contracts, explain each step in bullet points.
• Solve tasks iteratively, breaking them into steps.
• Use bullet points for lists of steps.
• Never assume a contract’s functionality. Always verify with examples using your tools to read the contract state.
• Before responding, consider which tools might help you gather better information.
• Include as much relevant information as possible in your final answer, depending on your findings.
HOW KOMPANION CAN USE TOOLS:
• You can fetch contract source codes, ABIs, and read contract data by using your tools and functions.
• Always verify the source or ABI to understand the contract rather than making assumptions.
• If you need to read contract state, fetch its ABI (especially if the source is lengthy).
FINAL INSTRUCTION:
• Provide the best possible, concise answer to the user’s request. If it's not an immediate question but an instruction, follow it directly.
• Use your tools to gather any necessary clarifications or data.
• Offer a clear, direct response and add a summary of what you did (how you navigated the contracts) at the end.
License: MIT