rising/deepseek-thinker-mcp
Server Summary
OpenAI API mode support
Ollama local mode support
Focused reasoning output
Access to Deepseek's reasoning process
An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients such as Claude Desktop. It supports access to Deepseek's thought process either through the Deepseek API service or through a local Ollama server.
🤖 Dual Mode Support: works with the Deepseek API service or a local Ollama server
🎯 Focused Reasoning: returns Deepseek's reasoning process as focused output for the client
Tool input parameter:
originPrompt (string): the user's original prompt
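For programmatic testing outside Claude Desktop, the tool can be invoked through the MCP TypeScript SDK. The sketch below is an illustration only: the tool name get-deepseek-thinker and the prompt are assumptions, so adjust them to whatever the server actually registers (listTools() will tell you).

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server over stdio, the same way Claude Desktop does.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "deepseek-thinker-mcp"],
    env: {
      API_KEY: process.env.API_KEY ?? "",   // your Deepseek-compatible API key
      BASE_URL: process.env.BASE_URL ?? "", // the API provider's base URL
    },
  });

  const client = new Client({ name: "thinker-smoke-test", version: "0.1.0" });
  await client.connect(transport);

  // List the tools the server exposes, then call the reasoning tool.
  const { tools } = await client.listTools();
  console.log("Available tools:", tools.map((t) => t.name));

  // "get-deepseek-thinker" is an assumed name; use whatever listTools() reports.
  const result = await client.callTool({
    name: "get-deepseek-thinker",
    arguments: { originPrompt: "Why is the sky blue?" },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch(console.error);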
OpenAI API Mode
Set the following environment variables:
API_KEY=
BASE_URL=
Ollama Mode
Set the following environment variable:
USE_OLLAMA=true
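In Ollama mode the reasoning has to be separated from the final answer, because local deepseek-r1 models typically emit it inline inside <think>…</think> tags. The following is a rough sketch of that extraction step, assuming a local Ollama instance at http://localhost:11434 with a deepseek-r1 model pulled; it is an illustration of the idea, not the project's actual code.

// Sketch: query a local Ollama deepseek-r1 model and extract the <think> block.
// Assumes Ollama is running locally and `ollama pull deepseek-r1` has been done.
async function getLocalReasoning(originPrompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "deepseek-r1",
      messages: [{ role: "user", content: originPrompt }],
      stream: false,
    }),
  });
  const data = (await res.json()) as { message?: { content?: string } };
  const content = data.message?.content ?? "";

  // deepseek-r1 wraps its chain of thought in <think>...</think> tags.
  const match = content.match(/<think>([\s\S]*?)<\/think>/);
  return match ? match[1].trim() : content;
}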
Add the following configuration to your claude_desktop_config.json. For OpenAI API mode:
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "API_KEY": "",
        "BASE_URL": ""
      }
    }
  }
}
For Ollama mode:
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": [
        "-y",
        "deepseek-thinker-mcp"
      ],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}
For a locally built server:
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": [
        "/your-path/deepseek-thinker-mcp/build/index.js"
      ],
      "env": {
        "API_KEY": "",
        "BASE_URL": ""
      }
    }
  }
}
To use the locally built server, install the dependencies and build the project first:
# Install dependencies
npm install
# Build project
npm run build
# Run service
node build/index.js
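For context on what the built server does in API mode: Deepseek's deepseek-reasoner responses carry the model's chain of thought in a reasoning_content field. The sketch below shows a minimal tool handler along those lines, using the MCP TypeScript SDK and the OpenAI client pointed at BASE_URL. The tool name and the overall structure are assumptions for illustration, not the project's exact implementation.

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import OpenAI from "openai";
import { z } from "zod";

// OpenAI-compatible client pointed at the Deepseek endpoint from the env vars.
const deepseek = new OpenAI({
  apiKey: process.env.API_KEY,
  baseURL: process.env.BASE_URL,
});

const server = new McpServer({ name: "deepseek-thinker", version: "0.0.0" });

// Register a tool that returns only the reasoning, not the final answer.
server.tool(
  "get-deepseek-thinker",        // assumed tool name
  { originPrompt: z.string() },  // matches the documented input parameter
  async ({ originPrompt }) => {
    const completion = await deepseek.chat.completions.create({
      model: "deepseek-reasoner",
      messages: [{ role: "user", content: originPrompt }],
    });
    // reasoning_content is a Deepseek-specific field, so it is not part of the
    // OpenAI SDK's types and needs a cast.
    const message = completion.choices[0].message as unknown as {
      reasoning_content?: string;
    };
    return {
      content: [{ type: "text", text: message.reasoning_content ?? "" }],
    };
  }
);

await server.connect(new StdioServerTransport());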
Request timeouts: this error occurs when the Deepseek API responds too slowly or when the reasoning content is very long, causing the MCP request to time out.
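Claude Desktop's own timeout is not configurable from here, but if you drive the server programmatically, recent versions of the MCP TypeScript SDK accept a per-request timeout. A hedged sketch, reusing the assumed tool name from the earlier client example:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { CallToolResultSchema } from "@modelcontextprotocol/sdk/types.js";

// Give long reasoning runs more time than the SDK's default request timeout.
async function callWithLongTimeout(client: Client, originPrompt: string) {
  return client.callTool(
    { name: "get-deepseek-thinker", arguments: { originPrompt } }, // assumed tool name
    CallToolResultSchema,
    { timeout: 300_000 } // five minutes, in milliseconds
  );
}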
This project is licensed under the MIT License. See the LICENSE file for details.