by Pearl-com
Provides a standardized interface for accessing Pearl's AI assistants and human experts through the Model Context Protocol, supporting both automatic and expert‑assisted interactions.
Pearl MCP Server enables MCP‑compatible clients (e.g., Claude Desktop, Cursor) to communicate with Pearl's AI models and vetted human experts. It routes queries to AI‑only, AI‑assisted expert, or direct expert modes, maintains session state, and returns conversation history.
Q: Do I need to host the server locally?
A: No. You can either run the Python server yourself or use Pearl's hosted endpoint at https://mcp.pearl.com/mcp.
Q: Which Python version is required?
A: Python 3.12 or newer.
Q: How is authentication handled?
A: Supply your Pearl API key via the --api-key flag or the PEARL_API_KEY environment variable.
Q: Can I switch transport without reinstalling?
A: Yes. Use the --transport flag (stdio or sse) and optionally specify a custom --port for SSE.
Q: How does the server choose an expert category?
A: Pearl's backend analyses the query context and automatically selects the most relevant expert domain.
Q: What tools are exposed to the client?
A: ask_pearl_ai, ask_pearl_expert, ask_expert, get_conversation_status, and get_conversation_history.
A Model Context Protocol (MCP) server implementation that exposes Pearl's AI and Expert services through a standardized interface. This server allows MCP clients like Claude Desktop, Cursor, and other MCP-compatible applications to interact with Pearl's advanced AI assistants and human experts.
git clone https://github.com/Pearl-com/pearl_mcp_server.git
cd pearl_mcp_server
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
pip install -e .
Create a .env file in the src directory with your Pearl API key:
PEARL_API_KEY=your-api-key-here
Start the server using either stdio (default) or SSE transport:
# Using stdio transport (default)
pearl-mcp-server --api-key your-api-key
# Using SSE transport on custom port
pearl-mcp-server --api-key your-api-key --transport sse --port 8000
Pearl provides a hosted MCP server at:
https://mcp.pearl.com/mcp
This can be used directly with any MCP client without installing the Python application locally.
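As a quick illustration, the hosted endpoint can also be reached from Python without installing anything locally. The following is a minimal sketch using the MCP Python SDK's SSE client against the https://mcp.pearl.com/sse URL referenced in the client-integration section below; adjust the URL or transport helper if Pearl's hosted setup or your SDK version differs.

import asyncio
from mcp.client.session import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to Pearl's hosted server over SSE (no local install needed).
    async with sse_client("https://mcp.pearl.com/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the hosted server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())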
The server provides the following tools:

ask_pearl_ai: query Pearl's AI assistant. Parameters:
  question: the user's query
  chat_history (optional): previous conversation context
  session_id (optional): for continuing conversations

ask_pearl_expert: ask a question in AI-assisted expert mode (AI answer reviewed or extended by a human expert)

ask_expert: route a question directly to a human expert

get_conversation_status: check the state of a conversation. Parameter: session_id

get_conversation_history: retrieve the messages in a conversation. Parameter: session_id
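The session tools are easiest to see together: reuse one session_id across calls, then query that conversation's status and history. The sketch below assumes a locally installed server over stdio and a caller-chosen session_id; whether Pearl issues its own session ids, and the exact shape of each tool's result payload, aren't specified here.

import asyncio
import uuid
from mcp.client.session import ClientSession
from mcp.client.stdio import StdioServerParameters, stdio_client

async def main():
    # Reuse one session_id so all three calls refer to the same conversation.
    session_id = str(uuid.uuid4())
    params = StdioServerParameters(
        command="pearl-mcp-server", args=["--api-key", "your-api-key"]
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            answer = await session.call_tool(
                "ask_pearl_ai",
                {"question": "What is MCP?", "session_id": session_id},
            )
            status = await session.call_tool(
                "get_conversation_status", {"session_id": session_id}
            )
            history = await session.call_tool(
                "get_conversation_history", {"session_id": session_id}
            )
            print(answer, status, history, sep="\n")

asyncio.run(main())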
Pearl's MCP server provides access to a wide range of expert categories. The appropriate expert category is automatically determined by Pearl's API based on the context of your query, ensuring you're connected with the most relevant expert for your needs.
Here are the main categories of expertise available:
Medical & Healthcare
Legal & Financial
Technical & Professional
Education & Career
Lifestyle & Personal
Each expert category can be accessed through the ask_expert or ask_pearl_expert tools. You don't need to specify the category - simply describe your question or problem, and Pearl's AI will automatically route your request to the most appropriate expert type based on the context.
For connecting to a local MCP server using stdio transport, add the following configuration to your MCP client:
{
  "mcpServers": {
    "pearl-mcp-server": {
      "type": "stdio",
      "command": "pearl-mcp-server",
      "args": ["--api-key", "your-api-key"],
      "env": {
        "PEARL_API_KEY": "your-api-key"
      }
    }
  }
}
Some MCP clients don't support direct connection to remote MCP servers. For these clients, you can use the mcp-remote package as a bridge:
Prerequisites: Node.js installed (the bridge is launched with npx).
Configuration for remote server:
{
  "mcpServers": {
    "pearl-remote": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.pearl.com/sse"
      ]
    }
  }
}
Configuration file locations:
Claude Desktop (Windows): %APPDATA%\Claude\claude_desktop_config.json
Claude Desktop (macOS): ~/Library/Application Support/Claude/claude_desktop_config.json
Cursor: ~/.cursor/mcp.json
Windsurf: ~/.codeium/windsurf/mcp_config.json
Additional Options:
Add @latest to the npx command to always pull the newest version of the bridge: "args": ["mcp-remote@latest", "https://mcp.pearl.com/sse"]
Troubleshooting:
Clear cached mcp-remote authentication data: rm -rf ~/.mcp-auth
Follow the Claude Desktop MCP log on Windows: Get-Content "$env:APPDATA\Claude\Logs\mcp.log" -Wait -Tail 20
Follow the Claude Desktop MCP logs on macOS: tail -n 20 -F ~/Library/Logs/Claude/mcp*.log
Test the remote connection outside your MCP client: npx mcp-remote-client https://mcp.pearl.com/sse
Example of calling the server from Python using the MCP client SDK:

import asyncio
from mcp.client.session import ClientSession
from mcp.client.stdio import StdioServerParameters, stdio_client

async def main():
    # For stdio transport
    async with stdio_client(
        StdioServerParameters(command="pearl-mcp-server", args=["--api-key", "your-api-key"])
    ) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List available tools
            tools = await session.list_tools()
            print(tools)

            # Call Pearl AI
            result = await session.call_tool(
                "ask_pearl_ai",
                {
                    "question": "What is MCP?",
                    "session_id": "optional-session-id"
                }
            )
            print(result)

asyncio.run(main())
A Pearl API key is required to use this server. Keep your API key secure and never commit it to version control.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.