by cyberchitta
Injects relevant code, documentation, and other project files into Large Language Model chat interfaces via clipboard shortcuts or direct Model Context Protocol calls, leveraging .gitignore patterns for smart file selection and rule‑based profiles for task‑specific customization.
LLM Context enables developers to quickly share selected portions of a codebase or document collection with an LLM chat, either by copying the generated context to the clipboard or by serving it through the Model Context Protocol (MCP). It automatically respects .gitignore rules, provides rule-based profiles for different workflows (e.g., code review, documentation), and can generate high-level outlines of source files.
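The .gitignore-aware selection can be pictured with a small sketch. This is an illustrative approximation using Python's fnmatch, not the tool's actual matcher; real .gitignore semantics also include negation, anchoring, and directory-only patterns:

```python
from fnmatch import fnmatch

# Simplified .gitignore-style filter (illustrative only; real gitignore
# matching also handles negation, anchoring, and directory-only patterns).
def select_files(paths, ignore_patterns):
    def ignored(path):
        parts = path.split("/")
        # A file is ignored if any pattern matches the full path
        # or any single path component.
        return any(
            fnmatch(path, pat) or any(fnmatch(p, pat) for p in parts)
            for pat in ignore_patterns
        )
    return [p for p in paths if not ignored(p)]

files = ["src/app.py", "node_modules/x/index.js", "build/out.o", "README.md"]
print(select_files(files, ["node_modules", "build", "*.o"]))
# → ['src/app.py', 'README.md']
```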
Quick start:

1. Install with uv tool: uv tool install "llm-context>=0.3.0"
2. Initialize the project with lc-init.
3. Select files with lc-sel-files (or lc-sel-outlines for outline generation).
4. Generate and copy the context with lc-context. Add -p to include prompt instructions, -u for user notes, or -f <file> to write to a file.
5. When the LLM requests files, run lc-clip-files with the file list to fetch and paste the contents back to the model.

Key features:

- Smart file selection based on .gitignore patterns.
- Rule-based profiles (switched with lc-set-rule) for tasks such as code review, documentation generation, or custom prompts.
- A full CLI toolchain (lc-init, lc-sel-files, lc-context, lc-clip-files, etc.).
- Code outlining (lc-outlines) and implementation extraction (lc-clip-implementations).

FAQ:

Q: Do I need to version-control the .llm-context folder?
A: Yes. Configuration files prefixed with lc- may be overwritten on updates, so keeping them in VCS prevents loss.
Q: Can I use LLM Context with non-Claude models?
A: Absolutely. The CLI/clipboard workflow works with any chat interface; the MCP server can be called from any client that implements the Model Context Protocol.

Q: How does the tool handle large projects?
A: It limits the generated context to fit within an LLM's context window. Smart selection and rule-based filtering keep the payload manageable. Large-project support is under active development.
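Fitting content to a context window can be pictured as a simple budget pass. This toy sketch (characters standing in for tokens, not the tool's actual algorithm) includes files until an assumed budget would be exceeded:

```python
# Toy sketch of fitting content into a context budget (character counts
# stand in for tokens). Purely illustrative, not llm-context's logic.
def fit_to_budget(files, budget):
    """files: list of (path, content) pairs; returns paths that fit."""
    included, used = [], 0
    for path, content in files:
        if used + len(content) > budget:
            continue  # skip files that would overflow the budget
        included.append(path)
        used += len(content)
    return included

docs = [("a.py", "x" * 400), ("b.py", "y" * 700), ("c.md", "z" * 300)]
print(fit_to_budget(docs, 800))
# → ['a.py', 'c.md']
```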
Q: Is there a way to automatically include only changed files?
A: Yes. Use lc-changed to list files modified since the last context generation and feed them to lc-context.
Q: What languages are supported for outline generation?
A: The tool relies on tree-sitter queries; most popular languages are covered, though C/C++ implementation extraction is not currently supported.
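The idea behind outlining can be shown with a crude stand-in. The tool itself uses tree-sitter; this naive regex pass over Python source is only an illustration of extracting top-level definitions:

```python
import re

# Naive outline extractor: an illustrative stand-in for the tool's
# tree-sitter-based outlining. Regexes miss many real-world cases.
def outline(source):
    # Match only definitions at the start of a line (top level).
    pattern = re.compile(r"^(class|def)\s+(\w+)", re.MULTILINE)
    return [f"{kind} {name}" for kind, name in pattern.findall(source)]

code = "class Store:\n    def get(self):\n        pass\n\ndef main():\n    pass\n"
print(outline(code))
# → ['class Store', 'def main']
```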
Q: How do I configure the MCP server for Claude Desktop?
A: Add the following JSON snippet to claude_desktop_config.json (see Server Configuration below).
Server Configuration (MCP server entry for Claude Desktop):
{
"mcpServers": {
"CyberChitta": {
"command": "uvx",
"args": ["--from", "llm-context", "lc-mcp"]
}
}
}
LLM Context is a tool that helps developers quickly inject relevant content from code/text projects into Large Language Model chat interfaces. It leverages .gitignore patterns for smart file selection and provides both a streamlined clipboard workflow using the command line and direct LLM integration through the Model Context Protocol (MCP).
Note: This project was developed in collaboration with several Claude Sonnets - 3.5, 3.6 and 3.7 (and more recently Grok-3 as well), using LLM Context itself to share code during development. All code in the repository is human-curated (by me 😇, @restlessronin).
For an in-depth exploration of the reasoning behind LLM Context and its approach to AI-assisted development, check out our article: LLM Context: Harnessing Vanilla AI Chats for Development
To see LLM Context in action with real-world examples and workflows, read: Full Context Magic - When AI Finally Understands Your Entire Project
Install LLM Context using uv:
uv tool install "llm-context>=0.3.0"
To upgrade to the latest version:
uv tool upgrade llm-context
Warning: LLM Context is under active development. Updates may overwrite configuration files prefixed with lc-. We recommend all configuration files be version controlled for this reason.
Add to 'claude_desktop_config.json':
{
"mcpServers": {
"CyberChitta": {
"command": "uvx",
"args": ["--from", "llm-context", "lc-mcp"]
}
}
}
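If you prefer to script the edit, the entry can be merged into an existing config with a few lines of standard-library Python. The file path below is illustrative; the real location of claude_desktop_config.json depends on your OS:

```python
import json
from pathlib import Path

# Merge the llm-context MCP server entry into a Claude Desktop config
# without clobbering other configured servers. The path is illustrative;
# the actual config location varies by operating system.
def add_llm_context_server(config_path):
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["CyberChitta"] = {
        "command": "uvx",
        "args": ["--from", "llm-context", "lc-mcp"],
    }
    path.write_text(json.dumps(config, indent=2))

add_llm_context_server("claude_desktop_config.json")
```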
Once configured, you can start working with your project in two simple ways:
Say: "I would like to work with my project" Claude will ask you for the project root path.
Or directly specify: "I would like to work with my project /path/to/your/project" Claude will automatically load the project context.
For optimal results, combine initial context through Claude's Project Knowledge UI with dynamic code access via MCP. This provides both comprehensive understanding and access to latest changes. See Full Context Magic for details and examples.
Workflow:

1. Initialize the project: lc-init (only needed once).
2. Select files: lc-sel-files. The selection is stored in .llm-context/curr_ctx.yaml.
3. Generate context: lc-context (with optional flags: -p for prompt, -u for user notes). In a regular chat interface, use lc-context -p to include instructions.
4. When the LLM requests additional files, run lc-clip-files and paste the contents back.

Core commands:

- lc-init: Initialize project configuration
- lc-set-rule <n>: Switch rules (system rules are prefixed with "lc-")
- lc-sel-files: Select files for inclusion
- lc-sel-outlines: Select files for outline generation
- lc-context [-p] [-u] [-f FILE]: Generate and copy context
  - -p: Include prompt instructions
  - -u: Include user notes
  - -f FILE: Write to output file
- lc-prompt: Generate project instructions for LLMs
- lc-clip-files: Process LLM file requests
- lc-changed: List files modified since last context generation
- lc-outlines: Generate outlines for code files
- lc-clip-implementations: Extract code implementations requested by LLMs (doesn't support C/C++)

LLM Context provides advanced features for customizing how project content is captured and presented:

- Smart file selection using .gitignore patterns
- Implementation extraction via the lc-clip-implementations command

See our User Guide for detailed documentation of these features.
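The behavior of lc-changed, listing files modified since the last context generation, can be approximated with a simple modification-time scan. This is an illustrative sketch, not the tool's implementation:

```python
import os
import time

# Illustrative approximation of lc-changed: walk a directory tree and
# report files whose modification time is newer than a reference timestamp.
def changed_since(root, timestamp):
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > timestamp:
                changed.append(path)
    return sorted(changed)

# Example: files touched in roughly the last hour.
print(changed_since(".", time.time() - 3600))
```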
Check out our comprehensive list of alternatives - the sheer number of tools tackling this problem demonstrates its importance to the developer community.
LLM Context evolves from a lineage of AI-assisted development tools:
I am grateful for the open-source community's innovations and the AI assistance that have shaped this project's evolution.
I am grateful for the help of Claude-3.5-Sonnet in the development of this project.
This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.