by HyperbolicLabs
Provides agents and large language models with direct access to Hyperbolic's GPU cloud. It exposes tools for discovering available GPUs, renting and terminating instances, and establishing SSH connections for remote command execution, enabling GPU-powered workloads to be managed from within Claude or other AI agents.
Quick start:

git clone <repo-url>
cd hyperbolic-mcp
npm install
npm run build
npm start

Set `HYPERBOLIC_API_TOKEN` and `SSH_PRIVATE_KEY_PATH` either in the `env` block of your MCP client config or in a `.env` file. Then use the tools (e.g., `list-available-gpus`, `rent-gpu-instance`) within a conversation to manage GPU resources, including running commands such as `nvidia-smi` on rented instances. Credentials stay in your local config or `.env` file and are never transmitted externally.
https://github.com/user-attachments/assets/814d0327-ce5e-4c1b-90bc-7f3712aa1c68
Register for a Hyperbolic account:
Deposit funds into your account:
Generate an API token:
Add your SSH public key:
Clone this repository:
git clone <your-repo-url>
cd hyperbolic-mcp
Install dependencies:
npm install
Build the TypeScript files:
npm run build
To run the server:
npm start
{
"mcpServers": {
"hyperbolic-gpu": {
"command": "node",
"args": ["/path/to/hyperbolic-mcp-server/build/index.js"],
"env": {
"HYPERBOLIC_API_TOKEN": "your-hyperbolic-api-token",
"SSH_PRIVATE_KEY_PATH": "/path/to/your/privatekey"
}
}
}
}
Restart Claude for Desktop.
Start a new conversation and interact with the server.
Note: You can provide environment variables either through the Claude Desktop config as shown above, or by creating a `.env` file in the project root. The `.env` file is only needed if you're not providing the variables through the config.
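As a minimal sketch, a `.env` file in the project root would contain the same two variables used in the config above (the values here are placeholders):

```
HYPERBOLIC_API_TOKEN=your-hyperbolic-api-token
SSH_PRIVATE_KEY_PATH=/path/to/your/privatekey
```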
The server provides the following tools:
Lists all available GPUs on the Hyperbolic network.
Example query: "Show me all available GPUs on Hyperbolic."
Rents a GPU instance from a specific cluster.
Parameters:
`cluster_name`: The name of the cluster to rent (e.g., "extrasmall-chamomile-duck")
`node_name`: The name of the node (e.g., "prd-acl-msi-02.fen.intra")
`gpu_count`: Number of GPUs to rent
Example query: "I want to rent 4 GPUs from the extrasmall-chamomile-duck cluster."
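For orientation, an MCP client invoking this tool would pass arguments shaped roughly like the following; the cluster and node names are the examples above, and the exact request envelope depends on the client:

```json
{
  "name": "rent-gpu-instance",
  "arguments": {
    "cluster_name": "extrasmall-chamomile-duck",
    "node_name": "prd-acl-msi-02.fen.intra",
    "gpu_count": 4
  }
}
```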
Terminates a GPU instance that you have rented.
Parameters:
`instance_id`: The ID of the instance to terminate
Example query: "Terminate my GPU instance with ID abc123."
Lists all active GPU instances that you have rented.
Example query: "Show me all my active GPU instances."
Gets detailed information about a specific cluster.
Parameters:
`cluster_name`: The name of the cluster to get details for
Example query: "Tell me more about the cluster called extrasmall-chamomile-duck."
Establishes an SSH connection to a remote server.
Parameters:
`host`: Hostname or IP address of the remote server
`username`: SSH username for authentication
`password`: (Optional) SSH password for authentication
`private_key_path`: (Optional) Path to private key file
`port`: (Optional) SSH port number (default: 22)
Example query: "Connect to my GPU instance at 192.168.1.100 as user admin."
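Using the example values above, this tool's argument object might look like the following sketch; the `private_key_path` is a placeholder, and password-based authentication would supply `password` instead:

```json
{
  "host": "192.168.1.100",
  "username": "admin",
  "private_key_path": "/path/to/your/privatekey",
  "port": 22
}
```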
Executes a command on the connected remote server.
Parameters:
`command`: Command to execute on the remote server
Example query: "Run 'nvidia-smi' on the connected server."
Checks the current SSH connection status.
Example query: "What's the status of my SSH connection?"
Closes the active SSH connection.
Example query: "Disconnect from the SSH server."
If you encounter issues, verify that `HYPERBOLIC_API_TOKEN` and `SSH_PRIVATE_KEY_PATH` are set correctly in your Claude Desktop config or `.env` file.