by oatpp
Oatpp Mcp implements Anthropic's Model Context Protocol for the Oat++ web framework. It lets developers expose prompts, resources, and tool definitions directly from their Oat++ ApiController, so large language models can discover, query, and invoke C++ API functionality automatically.
```sh
mkdir build && cd build
cmake ..
make install
```
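After installation, a consuming project can link the module from CMake. A minimal sketch, assuming the package follows the usual Oat++ export convention; the `oatpp::oatpp-mcp` target name and the `my-mcp-server` project are assumptions, not taken from this page:

```cmake
# Hypothetical consumer CMakeLists.txt.
# Target names assume the standard Oat++ packaging convention
# (oatpp::oatpp-<module>) -- verify against your installed package.
find_package(oatpp     REQUIRED)
find_package(oatpp-mcp REQUIRED)

add_executable(my-mcp-server src/main.cpp)
target_link_libraries(my-mcp-server
        oatpp::oatpp
        oatpp::oatpp-mcp
)
```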
Usage:
- Create an oatpp::mcp::Server instance in your application.
- Register capabilities with addPrompt, addResource, and addTool.
- Serve over STDIO with server.stdioListen() (redirect Oat++ logging to another stream first).
- Or serve over HTTP by adding server.getSseController() to an HTTP router.
- Tools can also be generated automatically from ApiController definitions.

The library ships example components such as a CodeReview prompt and a File resource.

Q: Do I need to write any protocol handling code? A: No. The library handles serialization, transport, and the MCP spec; you only provide prompts, resources, and tools.
Q: Can I use a custom logger? A: Yes. Redirect Oat++ logging to a file or your own logger before starting the server.
Q: Is there an example project? A: See the example‑crud branch of the Oat++ example repository and the test ServerTest.cpp under test/oatpp-mcp/app/.
Q: Which Oat++ version is required? A: The library depends on the main Oat++ module; any recent stable release works.
Anthropic’s Model Context Protocol implementation for Oat++
Read more:
:tada: oatpp-mcp can automatically generate tools from ApiController, so you can query your API with an LLM. :tada:
```sh
mkdir build && cd build
cmake ..
make install
```
Find a working example in the tests: /test/oatpp-mcp/app/ServerTest.cpp

Note: make sure to redirect Oat++ logging to a different stream, for example to a file, by providing a custom Logger.
```cpp
/* Create MCP server */
oatpp::mcp::Server server;

/* Add prompts */
server.addPrompt(std::make_shared<prompts::CodeReview>());

/* Add resource */
server.addResource(std::make_shared<resource::File>());

/* Add tools */
server.addTool(std::make_shared<tools::Logger>());

/* Run server */
server.stdioListen();
```
```cpp
/* Create MCP server */
oatpp::mcp::Server server;

/* Add prompts */
server.addPrompt(std::make_shared<prompts::CodeReview>());

/* Add resource */
server.addResource(std::make_shared<resource::File>());

/* Add tools */
server.addTool(std::make_shared<tools::Logger>());

/* Add SSE controller to your HTTP server router */
router->addController(server.getSseController());
```