by cswkim
Discogs MCP Server is a Model Context Protocol (MCP) server designed to interact with the Discogs API. It enables users to manage their music collection, perform searches, and execute various music-related operations through an MCP client like Claude Desktop.
There are several ways to use the Discogs MCP Server:

- Run it directly from the npm registry: npx -y discogs-mcp-server
- Clone the repository, install dependencies (pnpm install), and run the server locally using Node.js
- Build and run the Docker image

Before running, ensure you have Node.js (or Docker) and your Discogs personal access token. The server can be configured with MCP clients like Claude Desktop by adding specific JSON configurations to claude_desktop_config.json.

What is the default per_page value for Discogs API requests?
This project sets discogs.config.defaultPerPage to 5 to avoid overwhelming clients, though the Discogs API default is 50.

MCP Server for the Discogs API, enabling music catalog operations, search functionality, and more.
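As a rough illustration of this pagination default, here is a self-contained TypeScript sketch; the helper name and structure are hypothetical and do not reflect the server's actual internals:

```typescript
// Illustrative only: shows how a conservative per_page default (5)
// might be applied to outgoing Discogs API query parameters unless
// the caller explicitly asks for more results.
const DEFAULT_PER_PAGE = 5; // project default; the Discogs API itself defaults to 50

function buildSearchParams(query: string, perPage?: number): URLSearchParams {
  const params = new URLSearchParams({ q: query });
  // Fall back to the small project default unless overridden,
  // e.g. by a prompt that requests more data.
  params.set("per_page", String(perPage ?? DEFAULT_PER_PAGE));
  return params;
}
```

A prompt asking for more results would translate into an explicit perPage override, at the cost of a larger response for the client to process.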
If you just want to get started immediately using this MCP Server with the Claude desktop app and don't care about development or running the server yourself, then make sure you have Node.js installed and your Discogs personal access token ready and skip straight to the Claude configuration section. Use the NPX method from that section.
This MCP server is built using FastMCP, a TypeScript framework for building MCP servers. For more information about MCP and how to use MCP servers, please refer to the FastMCP documentation and the official MCP documentation.
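To convey the general idea, here is a minimal self-contained TypeScript sketch of the tool pattern that MCP frameworks like FastMCP provide: named tools with descriptions and execute functions, invoked by name. The types and dispatcher below are purely illustrative and are not the FastMCP API:

```typescript
// Illustrative tool registry, NOT the FastMCP API: each tool has a
// name, a description, and an execute function the client can invoke.
type Tool = {
  name: string;
  description: string;
  execute: (args: Record<string, unknown>) => Promise<string>;
};

const tools = new Map<string, Tool>();

function registerTool(tool: Tool): void {
  tools.set(tool.name, tool);
}

async function callTool(name: string, args: Record<string, unknown>): Promise<string> {
  const tool = tools.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.execute(args);
}

// Toy "search" tool standing in for a real Discogs search tool.
registerTool({
  name: "search",
  description: "Search a (fake) music catalog",
  execute: async (args) => `Results for "${String(args.query)}"`,
});
```

In the real server, FastMCP handles the registration, schema validation, and MCP transport; the sketch only shows the shape of the abstraction.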
Check out the list of available tools: TOOLS.md
The Discogs API per_page default is 50, which can be too much data for some clients to process effectively, so within this project a discogs.config.defaultPerPage value has been set to 5. You can request more data in your prompts, but be aware that some clients may struggle with larger responses.

Requirements: Node.js (20.x.x, but 18.x.x should work as well)
Check your Node.js version with node --version.

Create a .env file in the root directory based on .env.example. In .env:
DISCOGS_PERSONAL_ACCESS_TOKEN: Your Discogs personal access token

To get your Discogs personal access token, go to your Discogs Settings > Developers page and find your token or generate a new one. DO NOT SHARE YOUR TOKEN. OAuth support will be added in a future release.
The other environment variables in .env.example are optional and have sensible defaults, so you don't need to set them unless you have specific requirements.
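Assuming only the required variable is set and the optional ones are left at their defaults, a minimal .env might look like this (the token value is a placeholder):

```
# .env — based on .env.example; replace the placeholder with your real token
DISCOGS_PERSONAL_ACCESS_TOKEN=<YOUR_TOKEN>
```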
Install dependencies:
pnpm install
Available commands:
- pnpm run dev: Start the development server with hot reloading
- pnpm run dev:stream: Start the development server with hot reloading in HTTP streaming mode
- pnpm run build: Build the production version
- pnpm run start: Run the production build
- pnpm run inspect: Run the MCP Inspector (see Inspection section)
- pnpm run format: Check code formatting (prettier)
- pnpm run lint: Run the linter (eslint)
- pnpm run test: Run vitest
- pnpm run test:coverage: Run vitest with v8 coverage
- pnpm run version:check: Check that the package.json version and src/version.ts match

Build the Docker image:
docker build -t discogs-mcp-server:latest .
Run the container:
docker run --env-file .env discogs-mcp-server:latest
For HTTP Streaming transport mode:
# The port should match what is in your .env file
docker run --env-file .env -p 3001:3001 discogs-mcp-server:latest stream
Run the MCP Inspector to test your local MCP server:
pnpm run inspect
This will start the MCP Inspector at http://127.0.0.1:6274. Visit this URL in your browser to interact with your local MCP server.
For more information about the MCP Inspector, visit the official documentation.
Currently, this MCP server has only been tested with Claude Desktop. More client examples will be added in the future.
Find your claude_desktop_config.json at Claude > Settings > Developer > Edit Config and depending on which option you'd like, add JUST ONE of the following:
Running it straight from the npm registry.
{
"mcpServers": {
"discogs": {
"command": "npx",
"args": [
"-y",
"discogs-mcp-server"
],
"env": {
"DISCOGS_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
}
}
}
}
Dependencies must be installed (pnpm install) before using this method.
{
"mcpServers": {
"discogs": {
"command": "npx",
"args": [
"tsx",
"/PATH/TO/YOUR/PROJECT/FOLDER/src/index.ts"
],
"env": {
"DISCOGS_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
}
}
}
}
The Docker image must be built before using this method.
{
"mcpServers": {
"discogs": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"--env-file",
"/PATH/TO/YOUR/PROJECT/FOLDER/.env",
"discogs-mcp-server:latest"
]
}
}
}
Any changes to local code require restarting Claude to take effect. Also, Claude requires human-in-the-loop interaction to allow an MCP tool to run, so every time a new tool is accessed Claude will ask for permission. You usually only have to do this once per tool per chat. If using the free version, long chats may result in more frequent errors when trying to run tools, as Claude limits the amount of context within a single chat.
This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.
Discover more MCP servers with similar functionality and use cases
by danny-avila
Provides a customizable ChatGPT‑like web UI that integrates dozens of AI models, agents, code execution, image generation, web search, speech capabilities, and secure multi‑user authentication, all open‑source and ready for self‑hosting.
by ahujasid
BlenderMCP integrates Blender with Claude AI via the Model Context Protocol (MCP), enabling AI-driven 3D scene creation, modeling, and manipulation. This project allows users to control Blender directly through natural language prompts, streamlining the 3D design workflow.
by pydantic
Enables building production‑grade generative AI applications using Pydantic validation, offering a FastAPI‑like developer experience.
by GLips
Figma-Context-MCP is a Model Context Protocol (MCP) server that provides Figma layout information to AI coding agents. It bridges design and development by enabling AI tools to directly access and interpret Figma design data for more accurate and efficient code generation.
by mcp-use
Easily create and interact with MCP servers using custom agents, supporting any LLM with tool calling and offering multi‑server, sandboxed, and streaming capabilities.
by sonnylazuardi
This project implements a Model Context Protocol (MCP) integration between Cursor AI and Figma, allowing Cursor to communicate with Figma for reading designs and modifying them programmatically.
by lharries
WhatsApp MCP Server is a Model Context Protocol (MCP) server for WhatsApp that allows users to search, read, and send WhatsApp messages (including media) through AI models like Claude. It connects directly to your personal WhatsApp account via the WhatsApp web multi-device API and stores messages locally in a SQLite database.
by idosal
GitMCP is a free, open-source remote Model Context Protocol (MCP) server that transforms any GitHub project into a documentation hub, enabling AI tools to access up-to-date documentation and code directly from the source to eliminate "code hallucinations."
by Klavis-AI
Klavis AI provides open-source Model Context Protocol (MCP) integrations and a hosted API for AI applications. It simplifies connecting AI to various third-party services by managing secure MCP servers and authentication.