by 66julienmartin
MCP-server-Qwen_Max is a Model Context Protocol (MCP) server for the Qwen series of large language models (Qwen-Max, Qwen-Plus, Qwen-Turbo). It lets applications that support MCP, such as Claude Desktop, interact with Qwen models for text generation and other AI tasks.
There are two ways to install MCP-server-Qwen_Max: automatically via Smithery, or manually by cloning the repository; both are covered in the installation instructions below.
The server supports the max_tokens and temperature parameters for fine-grained control over text generation.
Q: Why is Node.js used for this implementation?
A: Node.js/TypeScript currently provides the most stable and reliable integration with MCP servers, offering better type safety, error handling, and compatibility with Claude Desktop than other languages.
Q: What are the prerequisites for running MCP-server-Qwen_Max?
A: You need Node.js (v18 or higher), npm, Claude Desktop, and a Dashscope API key.
Q: How can I change the Qwen model used by the server?
A: You can modify the model parameter in src/index.ts to qwen-max, qwen-plus, or qwen-turbo.
Q: What does the temperature parameter do?
A: The temperature parameter controls the randomness of the model's output. Lower values (0.0-0.7) result in more focused output, while higher values (0.7-1.0) lead to more creative and varied output.
Q: Where can I find more information about Qwen models?
A: See the Alibaba Cloud Model Documentation: https://www.alibabacloud.com/help/en/model-studio/getting-started/models?spm=a3c0i.23458820.2359477120.1.446c7d3f9LT0FY
A Model Context Protocol (MCP) server implementation for the Qwen Max language model.
Why Node.js? This implementation uses Node.js/TypeScript as it currently provides the most stable and reliable integration with MCP servers compared to other languages like Python. The Node.js SDK for MCP offers better type safety, error handling, and compatibility with Claude Desktop.
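For orientation, here is a minimal sketch of how an MCP server of this shape is typically wired up in TypeScript. It assumes the official @modelcontextprotocol/sdk package; the tool name and input schema mirror the example call shown later on this page, but the code is illustrative, not the project's actual source:

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { CallToolRequestSchema, ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";

// Hypothetical helper; a DashScope-backed version is sketched later on this page.
async function callQwen(prompt: string): Promise<string> {
  return `echo: ${prompt}`; // placeholder
}

// Declare the server and the capabilities it exposes (tools only).
const server = new Server(
  { name: "qwen_max", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise a single text-generation tool to MCP clients.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [{
    name: "qwen_max",
    description: "Generate text with the Qwen-Max model",
    inputSchema: {
      type: "object",
      properties: {
        prompt: { type: "string" },
        max_tokens: { type: "number" },
        temperature: { type: "number" },
      },
      required: ["prompt"],
    },
  }],
}));

// Forward tool calls to the model and return the generated text.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { prompt } = request.params.arguments as { prompt: string };
  return { content: [{ type: "text", text: await callQwen(prompt) }] };
});

// Claude Desktop launches the server as a child process and talks to it over stdio.
await server.connect(new StdioServerTransport());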
To install Qwen Max MCP Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @66julienmartin/mcp-server-qwen_max --client claude
git clone https://github.com/66julienmartin/mcp-server-qwen-max.git
cd mcp-server-qwen-max
npm install
By default, this server uses the Qwen-Max model. The Qwen series offers several commercial models with different capabilities:
Qwen-Max: provides the best inference performance, especially for complex and multi-step tasks. Context window: 32,768 tokens.
Qwen-Plus: a balanced combination of performance, speed, and cost, ideal for moderately complex tasks. Context window: 131,072 tokens.
Qwen-Turbo: fast and low cost, suitable for simple tasks.
To modify the model, update the model name in src/index.ts:
// For Qwen-Max (default)
model: "qwen-max"
// For Qwen-Plus
model: "qwen-plus"
// For Qwen-Turbo
model: "qwen-turbo"
For more detailed information about available models, visit the Alibaba Cloud Model Documentation https://www.alibabacloud.com/help/en/model-studio/getting-started/models?spm=a3c0i.23458820.2359477120.1.446c7d3f9LT0FY.
qwen-max-mcp/
├── src/
│ ├── index.ts # Main server implementation
├── build/ # Compiled files
│ ├── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json
Create a .env file in the project root:
DASHSCOPE_API_KEY=your-api-key-here
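In a typical Node.js setup the server loads this file with the dotenv package and fails fast when the key is missing; a minimal sketch, assuming dotenv is installed:

import "dotenv/config"; // copies .env entries from the project root into process.env

const apiKey = process.env.DASHSCOPE_API_KEY;
if (!apiKey) {
  throw new Error("DASHSCOPE_API_KEY is not set; add it to .env or to the MCP server's env block");
}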
Then add the server to your Claude Desktop configuration:
{
  "mcpServers": {
    "qwen_max": {
      "command": "node",
      "args": ["/path/to/Qwen_Max/build/index.js"],
      "env": {
        "DASHSCOPE_API_KEY": "your-api-key-here"
      }
    }
  }
}
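On macOS the Claude Desktop configuration file is typically found at ~/Library/Application Support/Claude/claude_desktop_config.json, and on Windows at %APPDATA%\Claude\claude_desktop_config.json. Replace /path/to/Qwen_Max with the absolute path to your cloned and built repository.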
npm run dev # Watch mode
npm run build # Build
npm run start # Start server
// Example tool call
{
  "name": "qwen_max",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192,
    "temperature": 0.7
  }
}
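Inside the server, a call like this maps naturally onto DashScope's OpenAI-compatible chat completions endpoint. The following is a sketch rather than the project's actual source; it assumes the openai npm package and DashScope's compatible-mode base URL:

import OpenAI from "openai";

// DashScope exposes an OpenAI-compatible endpoint for the Qwen models.
const client = new OpenAI({
  apiKey: process.env.DASHSCOPE_API_KEY,
  baseURL: "https://dashscope.aliyuncs.com/compatible-mode/v1",
});

const completion = await client.chat.completions.create({
  model: "qwen-max",
  messages: [{ role: "user", content: "Your prompt here" }],
  max_tokens: 8192,
  temperature: 0.7,
});

console.log(completion.choices[0].message.content);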
The temperature parameter controls the randomness of the model's output:
Lower values (0.0-0.7): more focused and deterministic outputs
Higher values (0.7-1.0): more creative and varied outputs
Recommended temperature settings by task:
Code generation: 0.0-0.3
Technical writing: 0.3-0.5
General tasks: 0.7 (default)
Creative writing: 0.8-1.0
The server provides detailed error messages for common issues (a sketch of how such failures can be surfaced to the client follows this list):
API authentication errors
Invalid parameters
Rate limiting
Network issues
Token limit exceeded
Model availability issues
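One way to surface such failures to the client is to wrap the upstream call and return an error-flagged tool result instead of crashing the process; a hedged sketch (the project's actual handling may differ):

// Sketch: convert upstream API failures into MCP tool error results.
async function safeGenerate(prompt: string) {
  try {
    const text = await callQwen(prompt); // hypothetical helper from the earlier sketch
    return { content: [{ type: "text", text }] };
  } catch (err) {
    const message = err instanceof Error ? err.message : String(err);
    // isError tells the MCP client that the tool ran but failed (auth, rate limit, etc.).
    return { content: [{ type: "text", text: `Qwen API error: ${message}` }], isError: true };
  }
}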
Contributions are welcome! Please feel free to submit a Pull Request.
MIT
Discover more MCP servers with similar functionality and use cases
by danny-avila
Provides a customizable ChatGPT‑like web UI that integrates dozens of AI models, agents, code execution, image generation, web search, speech capabilities, and secure multi‑user authentication, all open‑source and ready for self‑hosting.
by ahujasid
BlenderMCP integrates Blender with Claude AI via the Model Context Protocol (MCP), enabling AI-driven 3D scene creation, modeling, and manipulation. This project allows users to control Blender directly through natural language prompts, streamlining the 3D design workflow.
by pydantic
Enables building production‑grade generative AI applications using Pydantic validation, offering a FastAPI‑like developer experience.
by GLips
Figma-Context-MCP is a Model Context Protocol (MCP) server that provides Figma layout information to AI coding agents. It bridges design and development by enabling AI tools to directly access and interpret Figma design data for more accurate and efficient code generation.
by mcp-use
Easily create and interact with MCP servers using custom agents, supporting any LLM with tool calling and offering multi‑server, sandboxed, and streaming capabilities.
by sonnylazuardi
This project implements a Model Context Protocol (MCP) integration between Cursor AI and Figma, allowing Cursor to communicate with Figma for reading designs and modifying them programmatically.
by lharries
WhatsApp MCP Server is a Model Context Protocol (MCP) server for WhatsApp that allows users to search, read, and send WhatsApp messages (including media) through AI models like Claude. It connects directly to your personal WhatsApp account via the WhatsApp web multi-device API and stores messages locally in a SQLite database.
by idosal
GitMCP is a free, open-source remote Model Context Protocol (MCP) server that transforms any GitHub project into a documentation hub, enabling AI tools to access up-to-date documentation and code directly from the source to eliminate "code hallucinations."
by Klavis-AI
Klavis AI provides open-source Model Context Protocol (MCP) integrations and a hosted API for AI applications. It simplifies connecting AI to various third-party services by managing secure MCP servers and authentication.