by ruixingshi
Provides Deepseek model reasoning content to MCP-enabled AI clients, supporting both OpenAI API and local Ollama deployments.
Delivers the thought process and reasoning outputs of the Deepseek model to any MCP‑enabled AI client such as Claude Desktop. It can operate against the official Deepseek API or a locally hosted Ollama instance.
How it works:
- Configure a mode: set API_KEY (your Deepseek/OpenAI key) and BASE_URL (the API endpoint), or set USE_OLLAMA=true for a local Ollama server.
- Register the server in claude_desktop_config.json (or equivalent) using the provided command, args, and env sections.
- Start the server (npx -y deepseek-thinker-mcp) and invoke the get-deepseek-thinker tool from the client.
- Setup requires only npx and straightforward JSON configuration for AI clients.

Do I need both API_KEY and BASE_URL?
For OpenAI API mode both are required; BASE_URL points to the Deepseek endpoint. In Ollama mode only USE_OLLAMA=true is needed.

How do I start the server?
Run npx -y deepseek-thinker-mcp as shown in the configuration examples.

What parameters does the get-deepseek-thinker tool expect?
A single string parameter, originPrompt, containing the user's original query.

An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. It supports access to Deepseek's thought processes from the Deepseek API service or from a local Ollama server.
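As a rough illustration, a client-side call to get-deepseek-thinker carries the user's question in the originPrompt argument. The JSON below follows the generic MCP tools/call request shape; the sample prompt is arbitrary and the surrounding JSON-RPC envelope is handled by your MCP client, so treat this as a sketch rather than a documented payload.

{
  "method": "tools/call",
  "params": {
    "name": "get-deepseek-thinker",
    "arguments": {
      "originPrompt": "Why is the sky blue?"
    }
  }
}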
🤖 Dual Mode Support
🎯 Focused Reasoning

Tool: get-deepseek-thinker
- originPrompt (string): User's original prompt

For OpenAI API mode, set the following environment variables:
API_KEY=<Your OpenAI API Key>
BASE_URL=<API Base URL>
For Ollama mode, set the following environment variable:
USE_OLLAMA=true
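Ollama mode assumes an Ollama instance is already running locally with a Deepseek reasoning model available. The model tag below (deepseek-r1) is an assumption for illustration, not something this server's configuration specifies; substitute whatever Deepseek model your Ollama installation uses.

# Assumed prerequisite for Ollama mode (model tag is illustrative)
ollama pull deepseek-r1
# Start the Ollama server if it is not already running as a service
ollama serve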
Add the following configuration to your claude_desktop_config.json:

Using OpenAI API mode:
{
"mcpServers": {
"deepseek-thinker": {
"command": "npx",
"args": [
"-y",
"deepseek-thinker-mcp"
],
"env": {
"API_KEY": "<Your API Key>",
"BASE_URL": "<Your Base URL>"
}
}
}
}
Using Ollama mode:

{
"mcpServers": {
"deepseek-thinker": {
"command": "npx",
"args": [
"-y",
"deepseek-thinker-mcp"
],
"env": {
"USE_OLLAMA": "true"
}
}
}
}
Using a self-hosted local build:

{
"mcpServers": {
"deepseek-thinker": {
"command": "node",
"args": [
"/your-path/deepseek-thinker-mcp/build/index.js"
],
"env": {
"API_KEY": "<Your API Key>",
"BASE_URL": "<Your Base URL>"
}
}
}
}
To build and run the server locally:

# Install dependencies
npm install
# Build project
npm run build
# Run service
node build/index.js
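As an optional sanity check after building, you can run the compiled entry point under the MCP Inspector, a generic debugging tool for stdio MCP servers. This invocation is a common pattern rather than something documented by this project, and the placeholder environment values must be replaced with real ones.

# Hypothetical smoke test with the MCP Inspector (placeholders must be filled in)
API_KEY=<Your API Key> BASE_URL=<Your Base URL> \
  npx @modelcontextprotocol/inspector node build/index.js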
Request timeouts occur when the Deepseek API responds too slowly or when the reasoning content output is very long, causing the MCP request to time out.
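If timeouts occur in OpenAI API mode, one way to narrow down the cause is to time a request against the configured endpoint directly. The chat-completions path and the deepseek-reasoner model name below are assumptions based on Deepseek's OpenAI-compatible API, not values taken from this server's documentation; adjust them to match your deployment.

# Hypothetical latency check (endpoint path and model name are assumptions)
time curl -s "$BASE_URL/chat/completions" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-reasoner", "messages": [{"role": "user", "content": "hello"}]}'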
This project is licensed under the MIT License. See the LICENSE file for details.
{ "mcpServers": { "deepseek-thinker": { "command": "npx", "args": [ "-y", "deepseek-thinker-mcp" ], "env": { "API_KEY": "<YOUR_API_KEY>", "BASE_URL": "<YOUR_BASE_URL>", "USE_OLLAMA": "false" } } } }
Discover more MCP servers with similar functionality and use cases
by danny-avila
Provides a customizable ChatGPT‑like web UI that integrates dozens of AI models, agents, code execution, image generation, web search, speech capabilities, and secure multi‑user authentication, all open‑source and ready for self‑hosting.
by ahujasid
BlenderMCP integrates Blender with Claude AI via the Model Context Protocol (MCP), enabling AI-driven 3D scene creation, modeling, and manipulation. This project allows users to control Blender directly through natural language prompts, streamlining the 3D design workflow.
by pydantic
Enables building production‑grade generative AI applications using Pydantic validation, offering a FastAPI‑like developer experience.
by GLips
Figma-Context-MCP is a Model Context Protocol (MCP) server that provides Figma layout information to AI coding agents. It bridges design and development by enabling AI tools to directly access and interpret Figma design data for more accurate and efficient code generation.
by mcp-use
Easily create and interact with MCP servers using custom agents, supporting any LLM with tool calling and offering multi‑server, sandboxed, and streaming capabilities.
by sonnylazuardi
This project implements a Model Context Protocol (MCP) integration between Cursor AI and Figma, allowing Cursor to communicate with Figma for reading designs and modifying them programmatically.
by lharries
WhatsApp MCP Server is a Model Context Protocol (MCP) server for WhatsApp that allows users to search, read, and send WhatsApp messages (including media) through AI models like Claude. It connects directly to your personal WhatsApp account via the WhatsApp web multi-device API and stores messages locally in a SQLite database.
by idosal
GitMCP is a free, open-source remote Model Context Protocol (MCP) server that transforms any GitHub project into a documentation hub, enabling AI tools to access up-to-date documentation and code directly from the source to eliminate "code hallucinations."
by Klavis-AI
Klavis AI provides open-source Model Context Protocol (MCP) integrations and a hosted API for AI applications. It simplifies connecting AI to various third-party services by managing secure MCP servers and authentication.