by horizondatawave
hdw-mcp-server is a Model Context Protocol (MCP) server that provides comprehensive access to LinkedIn data and functionalities using the HorizonDataWave API. It enables both data retrieval and robust management of user accounts on LinkedIn.
To use hdw-mcp-server, you can install it via Smithery for Claude Desktop, or clone the repository and install dependencies manually. You will need to obtain API credentials (HDW_ACCESS_TOKEN and HDW_ACCOUNT_ID) from app.horizondatawave.ai and configure them in a .env
file. The server can then be integrated with various MCP clients like Claude Desktop, Cursor, and Windsurf by updating their respective configuration files with the server's command and environment variables.
Q: How do I get API credentials for HorizonDataWave? A: You can register at app.horizondatawave.ai to obtain your HDW_ACCESS_TOKEN and HDW_ACCOUNT_ID.
Q: Which MCP clients are supported? A: The server provides configuration examples for Claude Desktop, Cursor, and Windsurf, and can be integrated with other MCP clients.
Q: Can I manage LinkedIn accounts using this server? A: Yes, the server offers robust account management features including chat functionality, connection management, and post commenting.
Q: What kind of LinkedIn data can I retrieve? A: You can retrieve user profiles, posts, reactions, comments, company details, and employee information.
hdw-mcp-server can be used for various purposes, including:
LinkedIn Users Search: Filter and search for LinkedIn users by keywords, name, title, company, location, industry, and education.
Profile Lookup: Retrieve detailed profile information for a LinkedIn user.
Email Lookup: Find LinkedIn user details by email address.
Posts & Reactions: Retrieve a user's posts and associated reactions.
Post Reposts & Comments: Retrieve reposts and comments for a specific LinkedIn post.
Account Management: Manage a LinkedIn account through chat messaging, connection requests, and post comments.
Company Search & Details: Retrieve detailed information about LinkedIn companies and their employees.
Google Search: Find LinkedIn company pages via Google search, where the first result is typically the best match.
HDW MCP Server exposes the following tools through the Model Context Protocol. Each tool is defined with its name, description, and input parameters:
Search LinkedIn Users
Name: search_linkedin_users
Description: Search for LinkedIn users with various filters.
Parameters:
- keywords (optional): Any keyword for search.
- first_name, last_name, title, company_keywords, school_keywords (optional).
- current_company, past_company, location, industry, education (optional).
- count (optional, default: 10): Maximum number of results (max 1000).
- timeout (optional, default: 300): Timeout in seconds (20–1500).
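For illustration, here is a minimal sketch of calling this tool from a custom client built with the official MCP TypeScript SDK (@modelcontextprotocol/sdk). The tool name and parameter names come from the definition above; the client name, credential values, and search arguments are placeholders.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the HDW MCP server over stdio, passing the API credentials as environment variables.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@horizondatawave/mcp"],
    env: {
      HDW_ACCESS_TOKEN: "your-access-token",
      HDW_ACCOUNT_ID: "your-account-id",
    },
  });

  const client = new Client({ name: "hdw-example-client", version: "1.0.0" });
  await client.connect(transport);

  // Search for up to five users matching a keyword and location filter.
  const result = await client.callTool({
    name: "search_linkedin_users",
    arguments: {
      keywords: "data engineer",
      location: "Berlin",
      count: 5,
    },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);

The same pattern applies to the other tools below; only the tool name and arguments change.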
Get LinkedIn Profile
Name: get_linkedin_profile
Description: Retrieve detailed profile information about a LinkedIn user.
Parameters:
- user (required): User alias, URL, or URN.
- with_experience, with_education, with_skills (optional, default: true).
Get LinkedIn Email User
Name: get_linkedin_email_user
Description: Look up LinkedIn user details by email.
Parameters:
- email (required): Email address.
- count (optional, default: 5).
- timeout (optional, default: 300).
Get LinkedIn User Posts
Name: get_linkedin_user_posts
Description: Retrieve posts for a LinkedIn user by URN.
Parameters:
- urn (required): User URN (must include the prefix, e.g. fsd_profile:...).
- count (optional, default: 10).
- timeout (optional, default: 300).
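As a sketch, reusing a client connected as in the search_linkedin_users example above, a call to this tool might look like the following. The URN value is a placeholder; only the required fsd_profile: prefix is taken from the definition above.

// `client` is an MCP Client connected as in the earlier search_linkedin_users sketch.
// The URN below is a placeholder; note the required "fsd_profile:" prefix.
const posts = await client.callTool({
  name: "get_linkedin_user_posts",
  arguments: {
    urn: "fsd_profile:ACoAAA0XyZ_placeholder",
    count: 5,
  },
});
console.log(posts.content);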
Get LinkedIn User Reactions
Name: get_linkedin_user_reactions
Description: Retrieve reactions for a LinkedIn user by URN.
Parameters:
- urn (required).
- count (optional, default: 10).
- timeout (optional, default: 300).
Get LinkedIn Chat Messages
Name: get_linkedin_chat_messages
Description: Retrieve top chat messages from the LinkedIn management API.
Parameters:
- user (required): User URN (with prefix).
- count (optional, default: 20).
- timeout (optional, default: 300).
Send LinkedIn Chat Message
Name: send_linkedin_chat_message
Description: Send a chat message using the LinkedIn management API.
Parameters:
- user (required): Recipient user URN (with prefix).
- text (required): Message text.
- timeout (optional, default: 300).
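A minimal sketch of sending a message, again assuming a client connected as in the first example; the recipient URN and message text are placeholders.

// `client` is a connected MCP Client (see the search_linkedin_users sketch).
// The recipient URN is a placeholder and must include its prefix.
const sent = await client.callTool({
  name: "send_linkedin_chat_message",
  arguments: {
    user: "fsd_profile:ACoAAA0XyZ_placeholder",
    text: "Hi! Thanks for connecting.",
  },
});
console.log(sent.content);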
Send LinkedIn Connection Request
Name: send_linkedin_connection
Description: Send a connection invitation to a LinkedIn user.
Parameters:
- user (required).
- timeout (optional, default: 300).
Send LinkedIn Post Comment
Name: send_linkedin_post_comment
Description: Create a comment on a LinkedIn post or reply.
Parameters:
- text (required): Comment text.
- urn (required): Activity or comment URN.
- timeout (optional, default: 300).
Get LinkedIn User Connections
Name: get_linkedin_user_connections
Description: Retrieve a list of LinkedIn user connections.
Parameters:
- connected_after (optional): Timestamp filter.
- count (optional, default: 20).
- timeout (optional, default: 300).
Get LinkedIn Post Reposts
Name: get_linkedin_post_reposts
Description: Retrieve reposts for a LinkedIn post.
Parameters:
- urn (required): Post URN (must start with activity:).
- count (optional, default: 10).
- timeout (optional, default: 300).
Get LinkedIn Post Comments
Name: get_linkedin_post_comments
Description: Retrieve comments for a LinkedIn post.
Parameters:
- urn (required).
- sort (optional, default: "relevance"; allowed values: "relevance", "recent").
- count (optional, default: 10).
- timeout (optional, default: 300).
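For example, fetching the most recent comments on a post could look like this sketch (client connected as before; the activity URN is a placeholder):

// `client` is a connected MCP Client (see the search_linkedin_users sketch).
// The post URN is a placeholder; "sort" accepts "relevance" or "recent".
const comments = await client.callTool({
  name: "get_linkedin_post_comments",
  arguments: {
    urn: "activity:7123456789012345678",
    sort: "recent",
    count: 10,
  },
});
console.log(comments.content);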
Get LinkedIn Google Company
Name: get_linkedin_google_company
Description: Search for LinkedIn companies via Google – the first result is typically the best match.
Parameters:
- keywords (required): Array of company keywords.
- with_urn (optional, default: false).
- count_per_keyword (optional, default: 1; range 1–10).
- timeout (optional, default: 300).
Get LinkedIn Company
Name: get_linkedin_company
Description: Retrieve detailed information about a LinkedIn company.
Parameters:
- company (required): Company alias, URL, or URN.
- timeout (optional, default: 300).
Get LinkedIn Company Employees
Name: get_linkedin_company_employees
Description: Retrieve employees of a LinkedIn company.
Parameters:
- companies (required): Array of company URNs.
- keywords, first_name, last_name (optional).
- count (optional, default: 10).
- timeout (optional, default: 300).
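The company tools combine naturally: resolve a company first, then list its employees. The sketch below again assumes a client connected as in the first example; the exact shape of the returned content is not specified here, so the company URN passed to the second call is a placeholder standing in for a value extracted from the first result.

// `client` is a connected MCP Client (see the search_linkedin_users sketch).
// Step 1: find the company (with_urn: true asks the server to include URNs).
const companies = await client.callTool({
  name: "get_linkedin_google_company",
  arguments: { keywords: ["HorizonDataWave"], with_urn: true },
});
console.log(companies.content);

// Step 2: list employees for the resolved company URN (placeholder value below).
const employees = await client.callTool({
  name: "get_linkedin_company_employees",
  arguments: {
    companies: ["company:12345678"],
    keywords: "engineer",
    count: 10,
  },
});
console.log(employees.content);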
To install HDW MCP Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @horizondatawave/hdw-mcp-server --client claude
Open your terminal and run the following commands:
# Clone the repository
git clone https://github.com/horizondatawave/hdw-mcp-server.git
# Change directory to the project folder
cd hdw-mcp-server
# Install dependencies
npm install
Register at app.horizondatawave.ai to get your API key and 100 free credits. You will receive your HDW_ACCESS_TOKEN and HDW_ACCOUNT_ID.
Create a .env
file in the root of your project with the following content:
HDW_ACCESS_TOKEN=YOUR_HDW_ACCESS_TOKEN
HDW_ACCOUNT_ID=YOUR_HDW_ACCOUNT_ID
Update your Claude configuration file (claude_desktop_config.json
) with the following content:
{
  "mcpServers": {
    "hdw": {
      "command": "npx",
      "args": ["-y", "@horizondatawave/mcp"],
      "env": {
        "HDW_ACCESS_TOKEN": "YOUR_HDW_ACCESS_TOKEN",
        "HDW_ACCOUNT_ID": "YOUR_HDW_ACCOUNT_ID"
      }
    }
  }
}
Configuration file location:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Easy way:
Open Cursor Settings and add a new MCP server with the command:
env HDW_ACCESS_TOKEN=your-access-token HDW_ACCOUNT_ID=your-account-id node /path/to/your/build/index.js
Safe way:
Copy the provided template run.template.sh
to a new file (e.g. run.sh
), update it with your credentials, and configure Cursor to run:
sh /path/to/your/run.sh
Update your Windsurf configuration file (mcp_config.json
) with the following content:
{
  "mcpServers": {
    "hdw": {
      "command": "node",
      "args": ["/path/to/your/build/index.js"],
      "env": {
        "HDW_ACCESS_TOKEN": "YOUR_HDW_ACCESS_TOKEN",
        "HDW_ACCOUNT_ID": "YOUR_HDW_ACCOUNT_ID"
      }
    }
  }
}
Note: After configuration, you can disable official web tools to conserve your API credits.
Below is an example configuration for an MCP client (e.g., a custom integration):
{
  "mcpServers": {
    "hdw": {
      "command": "npx",
      "args": ["-y", "@horizondatawave/mcp"],
      "env": {
        "HDW_ACCESS_TOKEN": "YOUR_HDW_ACCESS_TOKEN",
        "HDW_ACCOUNT_ID": "YOUR_HDW_ACCOUNT_ID"
      }
    }
  }
}
Replace the paths and credentials with your own values.
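If you are writing your own integration rather than using one of the client apps above, the same command and environment variables can drive a programmatic client. Below is a minimal sketch using the official MCP TypeScript SDK (@modelcontextprotocol/sdk); the client name and credential values are placeholders.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server exactly as in the JSON configuration above.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@horizondatawave/mcp"],
    env: {
      HDW_ACCESS_TOKEN: "your-access-token",
      HDW_ACCOUNT_ID: "your-account-id",
    },
  });

  const client = new Client({ name: "my-hdw-integration", version: "1.0.0" });
  await client.connect(transport);

  // List the tools the server exposes to confirm the connection works.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);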
This project is licensed under the MIT License.
Discover more MCP servers with similar functionality and use cases
by firecrawl
Adds powerful web scraping, crawling, and search capabilities to LLM clients through a Model Context Protocol (MCP) server.
by mendableai
Firecrawl MCP Server is an official Model Context Protocol (MCP) server implementation that integrates with Firecrawl to provide powerful web scraping capabilities to Large Language Models (LLMs). It acts as a bridge between LLMs and the web, allowing them to access and process web content for various tasks.
by tavily-ai
Provides real-time web search, intelligent data extraction, site mapping, and crawling capabilities via MCP tools.
by iFurySt
RedNote-MCP is an MCP server designed to access content from RedNote (XiaoHongShu, xhs), a popular Chinese social media and e-commerce platform. It enables programmatic interaction with RedNote for data retrieval and automation.
by zcaceres
fetch-mcp is a flexible HTTP fetching server designed to retrieve web content in various formats. It acts as a server that can fetch HTML, JSON, Markdown, or plaintext from specified URLs, enabling on-demand fetching and transformation of web content.
by apify
An MCP server for Apify Actors, allowing AI assistants to use any of the 3,000+ pre-built cloud tools for web scraping and automation.
by openbnb-org
mcp-server-airbnb is a Model Context Protocol (MCP) server designed to interact with Airbnb. It provides tools for searching Airbnb listings and retrieving detailed information about specific listings.
by cnych
A free SEO tool MCP (Model Context Protocol) service based on Ahrefs data, offering features like backlink analysis, keyword research, and traffic estimation.
by tinyfish-io
AgentQL MCP Server is a Model Context Protocol (MCP) server that integrates AgentQL's data extraction capabilities, enabling AI agents to get structured data from the unstructured web.