by kiwamizamurai
MCP server implementation for Kibela API integration, enabling LLMs to interact with Kibela content.
mcp-kibela-server is a Model Context Protocol (MCP) server that facilitates the integration of Large Language Models (LLMs) with Kibela, a collaborative note-taking and knowledge-sharing platform. It acts as a bridge, allowing LLMs to access and interact with content stored in Kibela.
To use mcp-kibela-server, you need to configure it with your Kibela team name and API token. This is done via the environment variables KIBELA_TEAM and KIBELA_TOKEN. The server can be run directly using npx or as a Docker container. For integration with tools like Cursor, you can add a configuration to your ~/.cursor/mcp.json file, specifying the command to run the server (e.g., npx @kiwamizamurai/mcp-kibela-server or a Docker command).
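For a quick smoke test from a terminal, the sketch below starts the server over stdio. It assumes a POSIX shell and the published package name shown in the configuration examples further down; an MCP client such as Cursor would normally launch this command for you.

# Placeholder credentials; substitute your real team name and API token
export KIBELA_TEAM="YOUR_TEAM_NAME"
export KIBELA_TOKEN="YOUR_TOKEN"

# Run the server over stdio (Ctrl+C to stop)
npx -y @kiwamizamurai/mcp-kibela-server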
Q: What environment variables are required to run mcp-kibela-server?
A: You need to set KIBELA_TEAM (your Kibela team name) and KIBELA_TOKEN (your Kibela API token).
Q: Can I run mcp-kibela-server using Docker?
A: Yes, you can build and run the server as a Docker container. Example Docker commands and mcp.json configurations are provided in the README.
Q: How do I integrate mcp-kibela-server with Cursor?
A: You can add a configuration to your ~/.cursor/mcp.json file, specifying the command to run the server, either via npx or Docker.
Q: What kind of information can I retrieve about notes?
A: You can get note ID, title, URL, author, groups, full HTML content, comments, attachments, and more, depending on the specific tool used.
Q: Can I manage likes on notes using this server?
A: Yes, there are tools available to like and unlike notes, which will update the likers list.
Environment variables:
KIBELA_TEAM: Your Kibela team name (required)
KIBELA_TOKEN: Your Kibela API token (required)

Add to your ~/.cursor/mcp.json:
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["-y", "@kiwamizamurai/mcp-kibela-server"],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
If you want to use Docker instead:
{
  "mcpServers": {
    "kibela": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "KIBELA_TEAM",
        "-e",
        "KIBELA_TOKEN",
        "ghcr.io/kiwamizamurai/mcp-kibela-server:latest"
      ],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
Available tools (see the usage sketch after this list):

Search Kibela notes with given query
  query (string): Search query
  coediting (boolean, optional): Filter by co-editing status
  isArchived (boolean, optional): Filter by archive status
  sortBy (string, optional): Sort by (RELEVANT, CONTENT_UPDATED_AT)
  userIds (string[], optional): Filter by user IDs
  folderIds (string[], optional): Filter by folder IDs

Get your latest notes from Kibela
  limit (number, optional): Number of notes to fetch (default: 15)

Get content and comments of a specific note
  id (string): Note ID
  include_image_data (boolean, optional): Whether to include image data URLs in the response (default: false)

Get list of accessible groups

Get folders in a group
  groupId (string): Group ID
  parentFolderId (string, optional): Parent folder ID for nested folders

Get notes in a group that are not attached to any folder
  groupId (string): Group ID

Get notes in a folder
  folderId (string): Folder ID
  limit (number, optional): Number of notes to fetch (default: 100)

Get list of users

Like a note
  noteId (string): Note ID

Unlike a note
  noteId (string): Note ID

Get your recently viewed notes
  limit (number, optional): Number of notes to fetch (max 15)

Get note content by its path or URL
  path (string): Note path (e.g. '/group/folder/note') or full Kibela URL (e.g. 'https://team.kibe.la/notes/123')
  include_image_data (boolean, optional): Whether to include image data URLs in the response (default: false)
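The listing above gives tool descriptions and inputs but not the registered tool identifiers, so the sketch below first lists the tools and then calls the search tool under an assumed name ("kibela_search_notes"). It uses the official MCP TypeScript SDK and the npx launch command from the configuration above; treat it as a minimal client-side sketch rather than the project's documented API.

// Hypothetical MCP client for mcp-kibela-server.
// Assumptions not confirmed by the listing above: the tool identifier
// "kibela_search_notes" and that @modelcontextprotocol/sdk is installed.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server over stdio with the required credentials.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@kiwamizamurai/mcp-kibela-server"],
    env: { KIBELA_TEAM: "YOUR_TEAM_NAME", KIBELA_TOKEN: "YOUR_TOKEN" },
  });

  const client = new Client({ name: "kibela-example-client", version: "0.1.0" });
  await client.connect(transport);

  // Discover the tool names the server actually exposes.
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  // Call the search tool; adjust the name to match the listTools() output.
  const result = await client.callTool({
    name: "kibela_search_notes",
    arguments: { query: "meeting notes", sortBy: "RELEVANT" },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch((error) => {
  console.error(error);
  process.exit(1);
});

If the server registers its tools under different names, the listTools() output is the source of truth.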
For local development, install dependencies:

npm install

Then update your ~/.cursor/mcp.json:
{
  "mcpServers": {
    "kibela": {
      "command": "node",
      "args": ["path/to/mcp-kibela-server/dist/src/index.js"],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
To inspect the server, run:

npx @modelcontextprotocol/inspector node ./dist/src/index.js

and set the KIBELA_TEAM and KIBELA_TOKEN environment variables, as in the sketch below.
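A minimal shell sketch of that step (POSIX shell assumed, placeholder values):

# Export the required credentials before launching the inspector
export KIBELA_TEAM="YOUR_TEAM_NAME"
export KIBELA_TOKEN="YOUR_TOKEN"

# Start the MCP Inspector against the locally built server
npx @modelcontextprotocol/inspector node ./dist/src/index.js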
Build and run the Docker image locally:
docker build -t mcp-kibela-server .
Then use this configuration:
{
  "mcpServers": {
    "kibela": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "KIBELA_TEAM",
        "-e",
        "KIBELA_TOKEN",
        "mcp-kibela-server"
      ],
      "env": {
        "KIBELA_TEAM": "YOUR_TEAM_NAME",
        "KIBELA_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
For SSE transport, ensure the server URL is set to: http://localhost:3000/sse
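A minimal mcp.json sketch for that case; the URL-based server entry below follows Cursor's convention for SSE servers and is an assumption here, since the text above only gives the endpoint:

{
  "mcpServers": {
    "kibela": {
      "url": "http://localhost:3000/sse"
    }
  }
}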