by GongRzhe
APIWeaver is a FastMCP server designed to dynamically create MCP (Model Context Protocol) servers from web API configurations. This enables seamless integration of any REST API, GraphQL endpoint, or web service into MCP-compatible tools, particularly for use with AI assistants like Claude.
Install dependencies: pip install -r requirements.txt
Run the server: apiweaver run for the default STDIO transport, or apiweaver run --transport streamable-http --host 127.0.0.1 --port 8000 for the recommended Streamable HTTP transport.
Connect your MCP client to the endpoint (http://127.0.0.1:8000/mcp for HTTP transports).
Use the register_api tool to define web APIs by providing a JSON configuration. This configuration includes name, base_url, description, auth details (bearer token, API key, basic auth, custom headers), and endpoints with their name, description, method, path, and params.
Built-in tools: register_api, list_apis, unregister_api, test_api_connection, call_api, and get_api_schema.
Q: What transport type should I use?
A: Streamable HTTP is recommended for modern web deployments and cloud environments. STDIO is suitable for local tools and command-line usage. SSE is a legacy option.
Q: How do I test an API connection?
A: Use the test_api_connection built-in tool after registering an API.
Q: What if I encounter a 401 Unauthorized error?
A: This typically means your authentication credentials are incorrect. Double-check your API keys, tokens, or username/password.
Q: Can I integrate any web API?
A: Yes, APIWeaver is designed to integrate any REST API, GraphQL endpoint, or web service that can be configured with its flexible parameter and authentication options.
Q: How do I provide API configuration?
A: API configurations are provided in JSON format, specifying details like base_url, auth, and endpoints.
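As a quick illustration of that JSON shape, a minimal configuration might look like the sketch below. The field values are placeholders rather than a real service; complete, working examples appear later in this document.
{
  "name": "example_api",
  "base_url": "https://api.example.com",
  "description": "Example API",
  "auth": {
    "type": "bearer",
    "bearer_token": "your-token-here"
  },
  "endpoints": [
    {
      "name": "get_status",
      "description": "Check service status",
      "method": "GET",
      "path": "/status",
      "params": []
    }
  ]
}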
A FastMCP server that dynamically creates MCP (Model Context Protocol) servers from web API configurations. This allows you to easily integrate any REST API, GraphQL endpoint, or web service into an MCP-compatible tool that can be used by AI assistants like Claude.
APIWeaver supports three different transport types to accommodate various deployment scenarios:
STDIO (default): apiweaver run or apiweaver run --transport stdio
SSE (legacy): apiweaver run --transport sse --host 127.0.0.1 --port 8000, MCP endpoint at http://host:port/mcp
Streamable HTTP (recommended): apiweaver run --transport streamable-http --host 127.0.0.1 --port 8000, MCP endpoint at http://host:port/mcp
# Clone or download this repository
cd ~/Desktop/APIWeaver
# Install dependencies
pip install -r requirements.txt
{
"mcpServers": {
"apiweaver": {
"command": "uvx",
"args": ["apiweaver", "run"]
}
}
}
There are several ways to run the APIWeaver server with different transport types:
1. After installation (recommended):
If you have installed the package (e.g., using pip install . from the project root after installing requirements):
# Default STDIO transport
apiweaver run
# Streamable HTTP transport (recommended for web deployments)
apiweaver run --transport streamable-http --host 127.0.0.1 --port 8000
# SSE transport (legacy compatibility)
apiweaver run --transport sse --host 127.0.0.1 --port 8000
2. Directly from the repository (for development):
# From the root of the repository
python -m apiweaver.cli run [OPTIONS]
Transport Options:
--transport: Choose from stdio (default), sse, or streamable-http
--host: Host address for HTTP transports (default: 127.0.0.1)
--port: Port for HTTP transports (default: 8000)
--path: URL path for MCP endpoint (default: /mcp)
Run apiweaver run --help for all available options.
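For example, these options can be combined in a single invocation; the port and path below are illustrative values, not defaults:
apiweaver run --transport streamable-http --host 127.0.0.1 --port 9000 --path /apiweaver-mcp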
APIWeaver is designed to expose web APIs as tools for AI assistants that support the Model Context Protocol (MCP). Here's how to use it:
Start the APIWeaver Server:
For modern MCP clients (recommended):
apiweaver run --transport streamable-http --host 127.0.0.1 --port 8000
For legacy compatibility:
apiweaver run --transport sse --host 127.0.0.1 --port 8000
For local desktop applications:
apiweaver run # Uses STDIO transport
Configure Your AI Assistant: The MCP endpoint will be available at:
Streamable HTTP: http://127.0.0.1:8000/mcp
SSE: http://127.0.0.1:8000/mcp
Register APIs and Use Tools:
Once connected, use the built-in register_api tool to define web APIs, then use the generated endpoint tools.
The server provides these built-in tools: register_api, list_apis, unregister_api, test_api_connection, call_api, and get_api_schema.
Example register_api configuration:
{
"name": "my_api",
"base_url": "https://api.example.com",
"description": "Example API integration",
"auth": {
"type": "bearer",
"bearer_token": "your-token-here"
},
"headers": {
"Accept": "application/json"
},
"endpoints": [
{
"name": "list_users",
"description": "Get all users",
"method": "GET",
"path": "/users",
"params": [
{
"name": "limit",
"type": "integer",
"location": "query",
"required": false,
"default": 10,
"description": "Number of users to return"
}
]
}
]
}
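Once this configuration is registered, each endpoint is exposed as a callable tool. The exact tool name and argument envelope depend on how your MCP client surfaces APIWeaver's generated tools, so treat the sketch below as an assumption rather than a fixed schema:
{
  "tool": "list_users",
  "arguments": {
    "limit": 25
  }
}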
{
"name": "weather",
"base_url": "https://api.openweathermap.org/data/2.5",
"description": "OpenWeatherMap API",
"auth": {
"type": "api_key",
"api_key": "your-api-key",
"api_key_param": "appid"
},
"endpoints": [
{
"name": "get_current_weather",
"description": "Get current weather for a city",
"method": "GET",
"path": "/weather",
"params": [
{
"name": "q",
"type": "string",
"location": "query",
"required": true,
"description": "City name"
},
{
"name": "units",
"type": "string",
"location": "query",
"required": false,
"default": "metric",
"enum": ["metric", "imperial", "kelvin"]
}
]
}
]
}
{
"name": "github",
"base_url": "https://api.github.com",
"description": "GitHub REST API",
"auth": {
"type": "bearer",
"bearer_token": "ghp_your_token_here"
},
"headers": {
"Accept": "application/vnd.github.v3+json"
},
"endpoints": [
{
"name": "get_user",
"description": "Get a GitHub user's information",
"method": "GET",
"path": "/users/{username}",
"params": [
{
"name": "username",
"type": "string",
"location": "path",
"required": true,
"description": "GitHub username"
}
]
}
]
}
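Path parameters such as {username} are substituted directly into the request URL, so invoking the get_user endpoint with username "octocat" would call https://api.github.com/users/octocat. A hedged sketch of the arguments, with the same caveat as above about how generated tools are named:
{
  "tool": "get_user",
  "arguments": {
    "username": "octocat"
  }
}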
{
"auth": {
"type": "bearer",
"bearer_token": "your-token-here"
}
}
{
"auth": {
"type": "api_key",
"api_key": "your-key-here",
"api_key_header": "X-API-Key"
}
}
{
"auth": {
"type": "api_key",
"api_key": "your-key-here",
"api_key_param": "api_key"
}
}
{
"auth": {
"type": "basic",
"username": "your-username",
"password": "your-password"
}
}
{
"auth": {
"type": "custom",
"custom_headers": {
"X-Custom-Auth": "custom-value",
"X-Client-ID": "client-123"
}
}
}
Parameter locations: query parameters are appended to the URL (?param=value), while path parameters are substituted into the endpoint path (/users/{id}).
A request timeout (in seconds) can also be configured:
{
  "timeout": 60.0
}
{
"name": "status",
"type": "string",
"enum": ["active", "inactive", "pending"]
}
{
"name": "page",
"type": "integer",
"default": 1
}
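Putting these pieces together, a single parameter definition can combine the fields shown above; the values here are purely illustrative:
{
  "name": "sort",
  "type": "string",
  "location": "query",
  "required": false,
  "default": "name",
  "enum": ["name", "created", "updated"],
  "description": "Field to sort results by"
}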
{
"mcpServers": {
"apiweaver": {
"command": "apiweaver",
"args": ["run", "--transport", "streamable-http", "--host", "127.0.0.1", "--port", "8000"]
}
}
}
{
"mcpServers": {
"apiweaver": {
"command": "apiweaver",
"args": ["run"]
}
}
}
The server provides detailed error messages when API calls fail.
Troubleshooting tips:
Use streamable-http for modern deployments and stdio for local tools.
Run test_api_connection after registering an API.
Run with verbose logging (if installed):
apiweaver run --verbose
Feel free to extend this server with additional features.
MIT License - feel free to use and modify as needed.