by yepcode
Enables AI platforms to run YepCode processes, execute code, manage environment variables, and interact with storage through the Model Context Protocol.
Provides an MCP server that connects AI assistants to YepCode's infrastructure, allowing real‑time execution of scripts, environment management, file storage operations, and direct invocation of YepCode processes.
Capabilities include invoking YepCode tools (run_code, set_env_var, list_files, run_ycp_<process_slug>, etc.) from the AI assistant, turning any process tagged with mcp-tool into an MCP tool (run_ycp_<process_slug>), and disabling the run_code tool or keeping the executed source code via YEPCODE_MCP_OPTIONS.
FAQ
Q: Do I need to install Docker if I use NPX?
A: No. NPX runs the server directly with Node.js; Docker is an alternative for containerized deployments.
Q: Which environment variable holds the API token?
A: YEPCODE_API_TOKEN (or pass it as an Authorization: Bearer <token> header for HTTP endpoints).
Q: Can I disable the run_code tool?
A: Yes, add disableRunCodeTool to YEPCODE_MCP_OPTIONS.
Q: How do I keep the executed source code for audit purposes?
A: Include runCodeCleanup in YEPCODE_MCP_OPTIONS to prevent automatic cleanup.
Q: What languages are supported for run_code?
A: JavaScript is the default; other languages can be specified via the language option if supported by YepCode.
An MCP (Model Context Protocol) server that enables AI platforms to interact with YepCode's infrastructure. Run LLM generated scripts and turn your YepCode processes into powerful tools that AI assistants can use directly.
YepCode MCP server can be integrated with AI platforms like Cursor or Claude Desktop using either a remote approach (we offer a hosted version of the MCP server) or a local approach (NPX or Docker installation is required).
For both approaches, you need to get your YepCode API credentials:
Go to Settings > API credentials to create a new API token.
For the remote (hosted) approach, you can either embed the token in the server URL or send it as an Authorization header:
{
"mcpServers": {
"yepcode-mcp-server": {
"url": "https://cloud.yepcode.io/mcp/sk-c2E....RD/sse"
}
}
}
{
"mcpServers": {
"yepcode-mcp-server": {
"url": "https://cloud.yepcode.io/mcp/sse",
"headers": {
"Authorization": "Bearer <sk-c2E....RD>"
}
}
}
}
For a local installation with NPX, make sure you have Node.js installed (version 18 or higher), and use a configuration similar to the following:
{
"mcpServers": {
"yepcode-mcp-server": {
"command": "npx",
"args": ["-y", "@yepcode/mcp-server"],
"env": {
"YEPCODE_API_TOKEN": "your_api_token_here"
}
}
}
}
For a local installation with Docker, build the image first:
docker build -t yepcode/mcp-server .
Then reference the image in your MCP configuration:
{
"mcpServers": {
"yepcode-mcp-server": {
"command": "docker",
"args": [
"run",
"-d",
"-e",
"YEPCODE_API_TOKEN=your_api_token_here",
"yepcode/mcp-server"
]
}
}
}
Debugging MCP servers can be tricky since they communicate over stdio. To make this easier, we recommend using the MCP Inspector, which you can run with the following command:
npm run inspector
This will start a server where you can access debugging tools directly in your browser.
The MCP server provides several tools to interact with YepCode's infrastructure:
Executes code in YepCode's secure environment.
// Input
{
code: string; // The code to execute
options?: {
language?: string; // Programming language (default: 'javascript')
comment?: string; // Execution context
settings?: Record<string, unknown>; // Runtime settings
}
}
// Response
{
returnValue?: unknown; // Execution result
logs?: string[]; // Console output
error?: string; // Error message if execution failed
}
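As a rough illustration (the payload values below are made up, and whether the code should be a plain script or a wrapped function follows YepCode's own conventions), a call could look like this:
// Example input (illustrative values only)
{
  "code": "const sum = [1, 2, 3].reduce((a, b) => a + b, 0); console.log(`sum is ${sum}`);",
  "options": {
    "language": "javascript",
    "comment": "Quick aggregation check"
  }
}
// Example response for a successful run
{
  "logs": ["sum is 6"]
}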
MCP Options
YepCode MCP server supports the following options:
disableRunCodeTool: removes the run_code tool. For example, use this if you want the MCP server to act only as a provider for the existing tools in your YepCode account.
runCodeCleanup: keeps the executed source code after execution instead of cleaning it up automatically.
Options can be passed as a comma-separated list in the YEPCODE_MCP_OPTIONS environment variable or as a query parameter in the MCP server URL.
// SSE server configuration
{
"mcpServers": {
"yepcode-mcp-server": {
"url": "https://cloud.yepcode.io/mcp/sk-c2E....RD/sse?mcpOptions=disableRunCodeTool,runCodeCleanup"
}
}
}
// NPX configuration
{
"mcpServers": {
"yepcode-mcp-server": {
"command": "npx",
"args": ["-y", "@yepcode/mcp-server"],
"env": {
"YEPCODE_API_TOKEN": "your_api_token_here",
"YEPCODE_MCP_OPTIONS": "disableRunCodeTool,runCodeCleanup"
}
}
}
}
Sets an environment variable in the YepCode workspace.
// Input
{
key: string; // Variable name
value: string; // Variable value
isSensitive?: boolean; // Whether to mask the value in logs (default: true)
}
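For example, to store an API key as a masked value (the key name and value here are illustrative, not real credentials):
// Example input
{
  "key": "OPENAI_API_KEY",
  "value": "sk-example-not-a-real-key",
  "isSensitive": true
}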
Removes an environment variable from the YepCode workspace.
// Input
{
key: string; // Name of the variable to remove
}
YepCode provides a built-in storage system that allows you to upload, list, download, and delete files. These files can be accessed from your code executions using the yepcode.storage
helper methods.
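As a loose sketch of how executed code might use that helper (the method names and signatures below are assumptions inferred from the operations listed above, not taken from YepCode's documentation; check their docs for the exact API):
// Hypothetical sketch only: assumes yepcode.storage exposes list/upload helpers
// that mirror the storage tools described below.
async function main() {
  const files = await yepcode.storage.list();                  // assumed: returns file descriptors
  console.log("Stored files:", files.map((f) => f.filename));  // log what is currently stored
  await yepcode.storage.upload("reports/summary.txt", "total: 42"); // assumed signature
  return { fileCount: files.length };
}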
Lists all files in your YepCode storage.
// Input
{
prefix?: string; // Optional prefix to filter files
}
// Response
{
files: Array<{
filename: string; // File name or path
size: number; // File size in bytes
lastModified: string; // Last modification date
}>;
}
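For instance, to list only the files under a given folder (paths, sizes, and dates are illustrative):
// Example input
{
  "prefix": "invoices/"
}
// Example response
{
  "files": [
    { "filename": "invoices/2024-01.pdf", "size": 48213, "lastModified": "2024-02-01T09:30:00Z" },
    { "filename": "invoices/2024-02.pdf", "size": 50110, "lastModified": "2024-03-01T09:30:00Z" }
  ]
}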
Uploads a file to YepCode storage.
// Input
{
filename: string; // File path (e.g., 'file.txt' or 'folder/file.txt')
content: string | { // File content
data: string; // Base64 encoded content for binary files
encoding: "base64";
};
}
// Response
{
success: boolean; // Upload success status
filename: string; // Uploaded file path
}
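Text content can be passed as a plain string, while binary content should use the base64 object form. Both examples below use made-up filenames and content:
// Example: upload a text file
{
  "filename": "notes/hello.txt",
  "content": "Hello from the MCP server"
}
// Example: upload content provided as base64 (here, the bytes of "hello world")
{
  "filename": "data/hello.bin",
  "content": {
    "data": "aGVsbG8gd29ybGQ=",
    "encoding": "base64"
  }
}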
Downloads a file from YepCode storage.
// Input
{
filename: string; // File path to download
}
// Response
{
filename: string; // File path
content: string; // File content (base64 for binary files)
encoding?: string; // Encoding type if binary
}
Deletes a file from YepCode storage.
// Input
{
filename: string; // File path to delete
}
// Response
{
success: boolean; // Deletion success status
filename: string; // Deleted file path
}
The MCP server can expose your YepCode Processes as individual MCP tools, making them directly accessible to AI assistants. This feature is enabled simply by adding the mcp-tool tag to your process (see our docs to learn more about process tags).
There will be a tool for each exposed process: run_ycp_<process_slug> (or run_ycp_<process_id> if the tool name would be longer than 60 characters).
// Input
{
parameters?: any; // This should match the input parameters specified in the process
options?: {
tag?: string; // Process version to execute
comment?: string; // Execution context
};
synchronousExecution?: boolean; // Whether to wait for completion (default: true)
}
// Response (synchronous execution)
{
executionId: string; // Unique execution identifier
logs: string[]; // Process execution logs
returnValue?: unknown; // Process output
error?: string; // Error message if execution failed
}
// Response (asynchronous execution)
{
executionId: string; // Unique execution identifier
}
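For instance, a synchronous call to one of these process tools might look like the following; the parameter names are entirely hypothetical and must match whatever input parameters your process defines:
// Example input for a run_ycp_<process_slug> tool (parameters depend on your process)
{
  "parameters": {
    "customerId": "12345",
    "sendEmail": true
  },
  "options": {
    "comment": "Triggered from the AI assistant"
  }
}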
Retrieves the result of a process execution.
// Input
{
executionId: string; // ID of the execution to retrieve
}
// Response
{
executionId: string; // Unique execution identifier
logs: string[]; // Process execution logs
returnValue?: unknown; // Process output
error?: string; // Error message if execution failed
}
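This tool pairs naturally with asynchronous process executions: run the process tool with synchronousExecution set to false, keep the returned executionId, and fetch the outcome later (all values below are illustrative):
// 1. Launch the process without waiting for it to finish
{
  "parameters": { "customerId": "12345" },
  "synchronousExecution": false
}
// ...which returns only the execution identifier
{
  "executionId": "a1b2c3d4-e5f6-4a7b-8c9d-0e1f2a3b4c5d"
}
// 2. Later, retrieve the result with this tool
{
  "executionId": "a1b2c3d4-e5f6-4a7b-8c9d-0e1f2a3b4c5d"
}
// Example response once the execution has finished
{
  "executionId": "a1b2c3d4-e5f6-4a7b-8c9d-0e1f2a3b4c5d",
  "logs": ["Process started", "Process finished"],
  "returnValue": { "status": "ok" }
}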
This project is licensed under the MIT License - see the LICENSE file for details.