by pyroprompts
any-chat-completions-mcp is a Model Context Protocol (MCP) server that enables integration with various OpenAI SDK Compatible Chat Completions APIs, such as OpenAI, Perplexity, Groq, and xAI. It acts as a bridge, allowing applications like Claude Desktop and LibreChat to utilize different large language models (LLMs) through a unified interface.
To use any-chat-completions-mcp, you can either install it via npx or clone the repository and build it locally. The server is configured by modifying the claude_desktop_config.json file (for Claude Desktop) or a similar configuration file for other applications such as LibreChat. You define mcpServers entries, specifying the command that runs the server (either npx or node with the build path) and the environment variables AI_CHAT_KEY, AI_CHAT_NAME, AI_CHAT_MODEL, and AI_CHAT_BASE_URL to connect to different LLM providers. Multiple providers can be configured by adding separate entries.
It exposes a chat tool to relay questions to configured AI chat providers.
Q: How do I debug the MCP server?
A: Since MCP servers communicate over stdio, debugging can be challenging. It is recommended to use the MCP Inspector, which can be accessed by running npm run inspector.
Q: Can I use multiple LLM providers with this server?
A: Yes, you can configure multiple LLM providers by adding separate entries in your mcpServers configuration, each with different environment variables for the desired provider.
Integrate Claude with Any OpenAI SDK Compatible Chat Completion API - OpenAI, Perplexity, Groq, xAI, PyroPrompts and more.
This implements a Model Context Protocol (MCP) server. Learn more: https://modelcontextprotocol.io
This is a TypeScript-based MCP server that integrates with any OpenAI SDK Compatible Chat Completions API.
It has one tool, chat, which relays a question to a configured AI Chat Provider.
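Conceptually, that relay is an OpenAI SDK call pointed at whatever base URL you configure. The following is a minimal TypeScript sketch of the idea (not the actual server source) showing how the AI_CHAT_* environment variables would map onto the openai client:

// relay-sketch.ts — illustrative only; assumes the `openai` npm package and an ESM module.
import OpenAI from "openai";

const provider = new OpenAI({
  apiKey: process.env.AI_CHAT_KEY,        // key for the configured provider
  baseURL: process.env.AI_CHAT_BASE_URL,  // any OpenAI SDK compatible endpoint
});

// The question received by the chat tool is forwarded as a user message.
const completion = await provider.chat.completions.create({
  model: process.env.AI_CHAT_MODEL ?? "gpt-4o",
  messages: [{ role: "user", content: "Question relayed from the MCP client" }],
});

console.log(completion.choices[0].message.content);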
Install dependencies:
npm install
Build the server:
npm run build
For development with auto-rebuild:
npm run watch
To add OpenAI to Claude Desktop, add the server config:
On MacOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
You can use it via npx in your Claude Desktop configuration like this:
{
  "mcpServers": {
    "chat-openai": {
      "command": "npx",
      "args": [
        "@pyroprompts/any-chat-completions-mcp"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
Or, if you clone the repo, you can build it and use it in your Claude Desktop configuration like this:
{
  "mcpServers": {
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
You can add multiple providers by referencing the same MCP server multiple times, but with different env arguments:
{
  "mcpServers": {
    "chat-pyroprompts": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PYROPROMPTS_KEY",
        "AI_CHAT_NAME": "PyroPrompts",
        "AI_CHAT_MODEL": "ash",
        "AI_CHAT_BASE_URL": "https://api.pyroprompts.com/openaiv1"
      }
    },
    "chat-perplexity": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "PERPLEXITY_KEY",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    },
    "chat-openai": {
      "command": "node",
      "args": [
        "/path/to/any-chat-completions-mcp/build/index.js"
      ],
      "env": {
        "AI_CHAT_KEY": "OPENAI_KEY",
        "AI_CHAT_NAME": "OpenAI",
        "AI_CHAT_MODEL": "gpt-4o",
        "AI_CHAT_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
With these three configured, you'll see a tool for each in the Claude Desktop Home.
You can then chat with the other LLMs, and their responses appear right in the chat.
Or, configure it in LibreChat like this:
chat-perplexity:
  type: stdio
  command: npx
  args:
    - -y
    - "@pyroprompts/any-chat-completions-mcp"
  env:
    AI_CHAT_KEY: "pplx-012345679"
    AI_CHAT_NAME: Perplexity
    AI_CHAT_MODEL: sonar
    AI_CHAT_BASE_URL: "https://api.perplexity.ai"
    PATH: '/usr/local/bin:/usr/bin:/bin'
And it shows up in LibreChat.
To install Any OpenAI Compatible API Integrations for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install any-chat-completions-mcp-server --client claude
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
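Alternatively, you can exercise the server from a small MCP client script. The sketch below assumes the @modelcontextprotocol/sdk TypeScript client; the tool name and the content argument shape are assumptions, so check the listTools() output against your build before relying on them:

// debug-client.ts — a sketch for poking at the server over stdio; names are assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/any-chat-completions-mcp/build/index.js"],
    env: {
      AI_CHAT_KEY: process.env.OPENAI_KEY ?? "",
      AI_CHAT_NAME: "OpenAI",
      AI_CHAT_MODEL: "gpt-4o",
      AI_CHAT_BASE_URL: "https://api.openai.com/v1",
    },
  });

  const client = new Client({ name: "debug-client", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // See what the server actually registered before calling anything.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Assumed input shape: a single text argument named `content`.
  const result = await client.callTool({
    name: tools[0].name,
    arguments: { content: "What is the Model Context Protocol?" },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});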
Use the CLAUDEANYCHAT coupon code for 20 free automation credits on PyroPrompts.