by amornpan
A Model Context Protocol server implementation in Python that provides access to LINE Bot messages, enabling Language Models to read and analyze LINE conversations through a standardized interface.
py-mcp-line is a Python-based Model Context Protocol (MCP) server designed to integrate with LINE Bot. It allows Language Models (LMs) to access and analyze LINE conversations by providing a standardized interface to LINE messages. This project aims to bridge the gap between LINE communication and AI-driven language processing.
To use py-mcp-line, first clone the repository and install the required dependencies:
git clone https://github.com/amornpan/py-mcp-line.git
cd py-mcp-line
pip install -r requirements.txt
Next, configure your environment by creating a .env file with your LINE Channel Secret, Access Token, server port, and messages file path:
LINE_CHANNEL_SECRET=your_channel_secret
LINE_ACCESS_TOKEN=your_access_token
SERVER_PORT=8000
MESSAGES_FILE=data/messages.json
Finally, you can integrate it with applications like Claude Desktop by adding the server configuration to your Claude Desktop configuration file, allowing Claude to interact with your LINE messages.
The server uses asyncio for efficient handling of concurrent operations and python-dotenv for environment configuration.

Q: What is the Model Context Protocol (MCP)?
A: The Model Context Protocol is a standardized interface that allows Language Models to interact with and analyze various forms of contextual data, in this case, LINE conversations.
Q: What Python versions are supported? A: py-mcp-line requires Python 3.8 or higher.
Q: How do I configure the LINE API credentials?
A: You need to create a .env file in the project root and provide your LINE_CHANNEL_SECRET and LINE_ACCESS_TOKEN obtained from the LINE Developers console.
Q: Can I filter messages when reading resources?
A: Yes, the read_resource API supports filtering by date, user, or content; see the sketch after this FAQ for an illustration.
Q: What kind of messages does it support? A: It supports text, sticker, and image messages from LINE.
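The exact filter parameters are defined by the server itself. As a purely illustrative sketch (assuming messages.json holds a list of records with user, timestamp, and text fields, which may not match the project's real schema), the filtering logic could look like this:

from datetime import datetime
from typing import Dict, List, Optional

def filter_messages(
    messages: List[Dict],
    user: Optional[str] = None,
    since: Optional[datetime] = None,
    contains: Optional[str] = None,
) -> List[Dict]:
    # Keep only messages that match every supplied criterion (field names are hypothetical)
    selected = []
    for message in messages:
        if user is not None and message.get("user") != user:
            continue
        if since is not None and datetime.fromisoformat(message.get("timestamp", "1970-01-01")) < since:
            continue
        if contains is not None and contains not in message.get("text", ""):
            continue
        selected.append(message)
    return selected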
A Model Context Protocol server implementation in Python that provides access to LINE Bot messages. This server enables Language Models to read and analyze LINE conversations through a standardized interface.
Requirements: asyncio, python-dotenv

Installation:

git clone https://github.com/amornpan/py-mcp-line.git
cd py-mcp-line
pip install -r requirements.txt
PY-MCP-LINE/
├── src/
│   └── line/
│       ├── __init__.py
│       └── server.py
├── data/
│   └── messages.json
├── tests/
│   ├── __init__.py
│   └── test_line.py
├── .env
├── .env.example
├── .gitignore
├── README.md
├── Dockerfile
└── requirements.txt
src/line/ - Main source code directory
__init__.py - Package initialization
server.py - Main server implementation
data/ - Data storage directory
messages.json - Stored LINE messages
tests/ - Test files directory
__init__.py - Test package initialization
test_line.py - LINE functionality tests
.env - Environment configuration file (not in git)
.env.example - Example environment configuration
.gitignore - Git ignore rules
README.md - Project documentation
Dockerfile - Docker configuration
requirements.txt - Project dependencies

Create a .env file in the project root:
LINE_CHANNEL_SECRET=your_channel_secret
LINE_ACCESS_TOKEN=your_access_token
SERVER_PORT=8000
MESSAGES_FILE=data/messages.json
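As a minimal sketch of how the server might load these settings at startup (assuming python-dotenv, which is listed in the dependencies; the actual loading code in server.py may differ):

import os
from dotenv import load_dotenv

# Read the .env file from the project root into the process environment
load_dotenv()

LINE_CHANNEL_SECRET = os.getenv("LINE_CHANNEL_SECRET", "")
LINE_ACCESS_TOKEN = os.getenv("LINE_ACCESS_TOKEN", "")
SERVER_PORT = int(os.getenv("SERVER_PORT", "8000"))
MESSAGES_FILE = os.getenv("MESSAGES_FILE", "data/messages.json")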
The server exposes LINE messages as MCP resources.

List resources:

@app.list_resources()
async def list_resources() -> list[Resource]

Returns the available message resources, using URIs of the form line://<message_type>/data.

Read resource:

@app.read_resource()
async def read_resource(uri: AnyUrl) -> str

Returns the stored messages for a given line://<message_type>/data URI as a string.
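A minimal sketch of how these handlers might be implemented, assuming the mcp package's Server class and the messages.json file configured above (the resource names and the JSON record structure are illustrative, not the project's actual code):

import json
from pathlib import Path

from mcp.server import Server
from mcp.types import Resource
from pydantic import AnyUrl

MESSAGES_FILE = Path("data/messages.json")  # assumed location, as configured in .env
MESSAGE_TYPES = ("text", "sticker", "image")  # message types mentioned in the FAQ

app = Server("line")

@app.list_resources()
async def list_resources() -> list[Resource]:
    # One resource per supported message type, following the line://<message_type>/data scheme
    return [
        Resource(
            uri=f"line://{message_type}/data",
            name=f"LINE {message_type} messages",
            mimeType="application/json",
        )
        for message_type in MESSAGE_TYPES
    ]

@app.read_resource()
async def read_resource(uri: AnyUrl) -> str:
    # Illustrative: messages.json is assumed to be a list of {"type": ..., ...} records
    message_type = uri.host  # host part of line://<message_type>/data
    messages = json.loads(MESSAGES_FILE.read_text(encoding="utf-8"))
    matching = [m for m in messages if m.get("type") == message_type]
    return json.dumps(matching, ensure_ascii=False, indent=2)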
Add to your Claude Desktop configuration:
On MacOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "line": {
      "command": "python",
      "args": [
        "server.py"
      ],
      "env": {
        "LINE_CHANNEL_SECRET": "your_channel_secret",
        "LINE_ACCESS_TOKEN": "your_access_token",
        "SERVER_PORT": "8000",
        "MESSAGES_FILE": "data/messages.json"
      }
    }
  }
}
The server implements comprehensive error handling; all errors are logged and returned with appropriate error messages.
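As an illustrative sketch of this pattern (the actual error categories and messages live in server.py; the helper below is hypothetical):

import json
import logging

logger = logging.getLogger("line-mcp")

def load_messages(path: str) -> list:
    # Hypothetical helper: log failures and surface them as readable error messages
    try:
        with open(path, encoding="utf-8") as f:
            return json.load(f)
    except FileNotFoundError:
        logger.error("Messages file not found: %s", path)
        raise ValueError(f"Messages file not found: {path}")
    except json.JSONDecodeError as exc:
        logger.error("Invalid JSON in %s: %s", path, exc)
        raise ValueError(f"Messages file is not valid JSON: {path}")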
Feel free to reach out to me if you have any questions about this project or would like to collaborate!
Made with ❤️ by Amornpan Phornchaicharoen
Create a requirements.txt file with:
fastapi>=0.104.1
pydantic>=2.10.6
uvicorn>=0.34.0
python-dotenv>=1.0.1
line-bot-sdk>=3.5.0
anyio>=4.5.0
mcp==1.2.0
These versions have been tested and verified to work together. The key components are:
fastapi and uvicorn for the API server
pydantic for data validation
line-bot-sdk for LINE Bot integration
mcp for Model Context Protocol implementation
python-dotenv for environment configuration
anyio for asynchronous I/O support