by ggozad
Interact with Ollama models through an intuitive terminal UI, supporting persistent chats, system prompts, model parameters, and MCP tool integration.
Oterm provides a lightweight, cross‑platform terminal interface for Ollama, enabling users to chat with local LLMs, customise system prompts, adjust model parameters, and leverage MCP tools without running additional servers or front‑ends.
1. Install – run uvx oterm (alternatively, install via Homebrew, pip, or from source).
2. Run – simply type oterm in any supported terminal emulator (Linux, macOS, Windows).
3. Navigate – use the on‑screen menus to select a model, create or switch chat sessions, adjust settings, and invoke tools.
4. Persist – chats are stored in a SQLite database, preserving context and customisations across launches.
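Step 4's persistence model can be sketched with Python's built-in sqlite3 module. This is an illustrative example only: the table and column names below are assumptions made for the sketch, not oterm's actual schema.

```python
import sqlite3

# Hypothetical schema illustrating how a terminal client can persist
# chats across launches; oterm's real schema may differ.
conn = sqlite3.connect(":memory:")  # oterm uses a file on disk instead
conn.execute("""
    CREATE TABLE chat (
        id INTEGER PRIMARY KEY,
        name TEXT,
        model TEXT,
        system_prompt TEXT
    )
""")
conn.execute("""
    CREATE TABLE message (
        id INTEGER PRIMARY KEY,
        chat_id INTEGER REFERENCES chat(id),
        role TEXT,          -- 'user' or 'assistant'
        content TEXT
    )
""")

# Saving a session: the chat row keeps the per-chat customisations...
chat_id = conn.execute(
    "INSERT INTO chat (name, model, system_prompt) VALUES (?, ?, ?)",
    ("demo", "llama3.2", "You are a helpful assistant."),
).lastrowid
conn.execute(
    "INSERT INTO message (chat_id, role, content) VALUES (?, ?, ?)",
    (chat_id, "user", "Hello!"),
)

# ...so on the next launch the full conversation context can be restored.
rows = conn.execute(
    "SELECT role, content FROM message WHERE chat_id = ?", (chat_id,)
).fetchall()
print(rows)
```

Because everything lives in one local database file, context and settings survive restarts without any server-side state.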
Q: Which operating systems are supported? A: Linux, macOS, and Windows with most terminal emulators.
Q: Do I need an internet connection? A: Only if the selected Ollama model requires remote resources; the client itself works offline.
Q: How are chats persisted? A: Chats, system prompts, and parameter settings are saved in a local SQLite database.
Q: Can I customise the appearance? A: Yes, Oterm supports multiple themes and UI styling options.
Q: How do I add custom tools? A: Use the MCP tools interface; Oterm can communicate with any MCP‑compatible server, including those you write yourself.
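MCP servers speak JSON-RPC 2.0, and tools/list and tools/call are the methods a client such as oterm invokes. The stdlib-only dispatcher below is a deliberately minimal sketch of that request/response shape; the echo tool and the handle function are invented for illustration, and a real server would be built with the official MCP SDK rather than hand-rolled JSON-RPC.

```python
import json

# Invented example tool; a real MCP server registers tools via the SDK.
TOOLS = {
    "echo": {
        "description": "Echo back the given text.",
        "handler": lambda args: args["text"],
    }
}

def handle(request_json: str) -> str:
    """Dispatch a single JSON-RPC 2.0 request for tools/list or tools/call."""
    req = json.loads(request_json)
    if req["method"] == "tools/list":
        result = {"tools": [{"name": name, "description": tool["description"]}
                            for name, tool in TOOLS.items()]}
    elif req["method"] == "tools/call":
        tool = TOOLS[req["params"]["name"]]
        text = tool["handler"](req["params"]["arguments"])
        result = {"content": [{"type": "text", "text": text}]}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601,
                                     "message": "method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# A client-side tools/call request, as oterm would issue on the user's behalf:
reply = handle(json.dumps({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "echo", "arguments": {"text": "hi"}},
}))
print(reply)
```

Any server that answers these methods correctly is usable from oterm, which is why custom tools only require pointing the client at your server.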
oterm – the terminal client for Ollama.
Run oterm in your terminal: uvx oterm (see Installation for more details).
oterm is now part of Homebrew!
The splash screen animation that greets users when they start oterm.
A view of the chat interface, showcasing the conversation between the user and the model.
The model selection screen, allowing users to choose and customize available models.
oterm using the git MCP server to access its own repo.
The image selection interface, demonstrating how users can include images in their conversations.
oterm supports multiple themes, allowing users to customize the appearance of the interface.
This project is licensed under the MIT License.
Discover more MCP servers with similar functionality and use cases
by sooperset
MCP Atlassian is a Model Context Protocol (MCP) server that integrates AI assistants with Atlassian products like Confluence and Jira. It enables AI to automate tasks, search for information, and manage content within Atlassian ecosystems.
by nbonamy
A desktop AI assistant that bridges dozens of LLM, image, video, speech, and search providers, offering chat, generative media, RAG, shortcuts, and extensible plugins directly from the OS.
by GongRzhe
Provides tools for creating, editing, and enhancing PowerPoint presentations through a comprehensive set of MCP operations powered by python-pptx.
by GongRzhe
Creates, reads, and manipulates Microsoft Word documents through a standardized interface for AI assistants, enabling rich editing, formatting, and analysis capabilities.
by GongRzhe
Gmail-MCP-Server is a Model Context Protocol (MCP) server that integrates Gmail functionalities into AI assistants like Claude Desktop. It enables natural language interaction for email management, supporting features like sending, reading, and organizing emails.
by nspady
google-calendar-mcp is a Model Context Protocol (MCP) server that integrates Google Calendar with AI assistants. It enables AI assistants to manage Google Calendar events, including creating, updating, deleting, and searching for events.
by runebookai
Provides a desktop interface to chat with local or remote LLMs, schedule tasks, and integrate Model Context Protocol servers without coding.
by vivekVells
mcp-pandoc is a Model Context Protocol (MCP) server designed for seamless document format conversion using Pandoc, supporting a wide range of formats like Markdown, HTML, PDF, DOCX, and more.
by abhiz123
An MCP (Model Context Protocol) server implementation that integrates Claude with Todoist, enabling natural language task management.