by elie222
An AI‑powered email assistant that automates inbox management, enabling users to reach inbox zero fast by handling replies, labeling, archiving, unsubscribing, and providing analytics through a plain‑text prompt configuration.
Inbox Zero provides an AI‑driven personal assistant combined with an open‑source email client. It connects to Gmail (and Outlook via Microsoft OAuth) and uses large language models to perform typical email tasks—drafting replies, categorizing messages, bulk unsubscribing, blocking cold emails, and generating activity analytics.
Quick start:

1. Copy `apps/web/.env.example` to `apps/web/.env` and fill in the required secrets (`AUTH_SECRET`, Google/Microsoft OAuth credentials, LLM provider keys, Redis/Postgres URLs, etc.).
2. Run `pnpm install`.
3. Run the database migrations: `pnpm prisma migrate dev`.
4. Start the app with `pnpm run dev` for development (or `turbo dev` from the monorepo root), or `pnpm run build && pnpm start` for production.
5. For real-time email watching, configure Google Pub/Sub and set `GOOGLE_PUBSUB_TOPIC_NAME` and `GOOGLE_PUBSUB_VERIFICATION_TOKEN`.
Q: Which LLM providers are supported?
A: OpenAI, Anthropic (including Bedrock), Google Gemini, Groq, and local Ollama models. The provider is selectable in Settings.
Q: Do I need to run PostgreSQL and Redis myself?
A: Yes. You can use Upstash managed services or run local containers via `docker-compose up -d`.
Q: Can I run the app entirely locally without cloud services?
A: You can host the web app locally, but Google Pub/Sub (or an equivalent push mechanism) is required for real-time email watching. Without it, you must manually trigger `/api/watch/all` via cron.
Q: Is there a free tier?
A: The open-source code is free. Premium features (e.g., unlimited AI calls) require an admin account flagged in `.env` (`ADMINS=you@example.com`).
Q: How do I add custom actions?
A: Define them in the prompt file used by the AI assistant or create webhooks and reference them in the prompt.
There are two parts to Inbox Zero: the AI personal assistant and the open-source email client.
If you're looking to contribute to the project, the email client is the best place to do this.
Learn more in our docs.
Screenshots: AI Assistant · Reply Zero · Gmail client · Bulk Unsubscriber
To request a feature open a GitHub issue, or join our Discord.
We offer a hosted version of Inbox Zero at https://getinboxzero.com. To self-host follow the steps below.
Here's a video on how to set up the project. It covers the same steps as this document but goes into greater detail on setting up the external services.
Make sure you have the above installed before starting.
The following external services are required (detailed setup instructions below):
Create your own .env
file from the example supplied:
cp apps/web/.env.example apps/web/.env
cd apps/web
pnpm install
Set the environment variables in the newly created `.env`. You can see the full list of required variables in `apps/web/env.ts`.
The required environment variables:
- `AUTH_SECRET` -- can be any random string (try `openssl rand -hex 32` for a quick secure random string)
- `EMAIL_ENCRYPT_SECRET` -- secret key for encrypting OAuth tokens (try `openssl rand -hex 32` for a secure key)
- `EMAIL_ENCRYPT_SALT` -- salt for encrypting OAuth tokens (try `openssl rand -hex 16` for a secure salt)
- `NEXT_PUBLIC_BASE_URL` -- the URL where your app is hosted (e.g., `http://localhost:3000` for local development or `https://yourdomain.com` for production)
- `INTERNAL_API_KEY` -- a secret key for internal API calls (try `openssl rand -hex 32` for a secure key)
- `UPSTASH_REDIS_URL` -- Redis URL from Upstash (can be empty if you are using Docker Compose)
- `UPSTASH_REDIS_TOKEN` -- Redis token from Upstash (or specify your own random string if you are using Docker Compose)
When using Vercel with Fluid Compute turned off, you should set MAX_DURATION=300
or lower. See Vercel limits for different plans here.
- `GOOGLE_CLIENT_ID` -- Google OAuth client ID. More info here
- `GOOGLE_CLIENT_SECRET` -- Google OAuth client secret. More info here

Go to Google Cloud. Create a new project if necessary.
1. If the banner shows up, configure the consent screen (if not, you can do this later):
   - Click `Get Started`.
   - Choose `External`.
   - Click `Create`.
2. Create new credentials:
   - Click the `+Create Credentials` button. Choose `OAuth Client ID`.
   - For `Application Type`, choose `Web application`.
   - For the authorized JavaScript origin, enter `http://localhost:3000`.
   - Under `Authorized redirect URIs` enter:
     - `http://localhost:3000/api/auth/callback/google`
     - `http://localhost:3000/api/google/linking/callback`
   - Click `Create`.
3. Update the `.env` file with:
   - `GOOGLE_CLIENT_ID`
   - `GOOGLE_CLIENT_SECRET`
4. Update scopes:
   - Go to `Data Access` in the left sidebar (or click the link above).
   - Click `Add or remove scopes`.
   - Paste the following into the `Manually add scopes` box:
     - `https://www.googleapis.com/auth/userinfo.profile`
     - `https://www.googleapis.com/auth/userinfo.email`
     - `https://www.googleapis.com/auth/gmail.modify`
     - `https://www.googleapis.com/auth/gmail.settings.basic`
     - `https://www.googleapis.com/auth/contacts`
   - Click `Update`.
   - Click `Save` on the Data Access page.
5. Add yourself as a test user:
   - In the `Test users` section, click `+Add users`, enter your email, and click `Save`.
- `MICROSOFT_CLIENT_ID` -- Microsoft OAuth client ID
- `MICROSOFT_CLIENT_SECRET` -- Microsoft OAuth client secret

Go to the Microsoft Azure Portal. Create a new Azure Active Directory app registration:
- Navigate to Azure Active Directory
- Go to "App registrations" in the left sidebar, or search for it in the search bar
- Click "New registration"
- Add the redirect URIs:
  - `http://localhost:3000/api/auth/callback/microsoft`
  - `http://localhost:3000/api/outlook/linking/callback`
Get your credentials:

- `MICROSOFT_CLIENT_ID`
- `MICROSOFT_CLIENT_SECRET`
Configure API permissions:
In the "Manage" menu click "API permissions" in the left sidebar
Click "Add a permission"
Select "Microsoft Graph"
Select "Delegated permissions"
Add the following permissions:
Click "Add permissions"
Click "Grant admin consent" if you're an admin
Update your .env file with the credentials:
MICROSOFT_CLIENT_ID=your_client_id_here
MICROSOFT_CLIENT_SECRET=your_client_secret_here
You need to set an LLM, but you can use a local one too:
For the LLM, you can use Anthropic, OpenAI, or Anthropic on AWS Bedrock. You can also use Ollama by setting the following environment variables:
OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3
Note: If you need to access Ollama hosted locally and the application is running in Docker, you can use `http://host.docker.internal:11434/api` as the base URL. You might also need to set `OLLAMA_HOST` to `0.0.0.0` in the Ollama configuration file.
You can select the model you wish to use on the `/settings` page of the app.
If you are using local Ollama, you can set it as the default:
DEFAULT_LLM_PROVIDER=ollama
If this is the case you must also set the ECONOMY_LLM_PROVIDER
environment variable.
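Putting the Ollama settings together, a minimal `.env` fragment might look like the following (a sketch: `phi3` is just the example model from above, and pointing `ECONOMY_LLM_PROVIDER` at Ollama is an assumption — set it to whichever provider you want for cheaper calls):

```shell
OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3
DEFAULT_LLM_PROVIDER=ollama
ECONOMY_LLM_PROVIDER=ollama
```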
We use Postgres for the database. For Redis, you can use Upstash Redis or set up your own Redis instance.
You can run Postgres & Redis locally using `docker-compose`:
docker-compose up -d # -d will run the services in the background
To run the migrations:
pnpm prisma migrate dev
To run the app locally for development (slower):
pnpm run dev
Or from the project root:
turbo dev
To build and run the app locally in production mode (faster):
pnpm run build
pnpm start
Open http://localhost:3000 to view the app in your browser.
Many features are available only to premium users. To upgrade yourself, make yourself an admin in the `.env`: `ADMINS=hello@gmail.com`
Then upgrade yourself at: http://localhost:3000/admin.
Follow instructions here.
Set env var GOOGLE_PUBSUB_TOPIC_NAME
.
When creating the subscription select Push and the url should look something like: https://www.getinboxzero.com/api/google/webhook?token=TOKEN
or https://abc.ngrok-free.app/api/google/webhook?token=TOKEN
where the domain is your domain. Set GOOGLE_PUBSUB_VERIFICATION_TOKEN
in your .env
file to be the value of TOKEN
.
To run in development ngrok can be helpful:
ngrok http 3000
# or with an ngrok domain to keep your endpoint stable (set `XYZ`):
ngrok http --domain=XYZ.ngrok-free.app 3000
And then update the webhook endpoint in the Google PubSub subscriptions dashboard.
To start watching emails visit: /api/watch/all
Set a cron job to run these endpoints. The Google watch is necessary; the others are optional.
"crons": [
{
"path": "/api/watch/all",
"schedule": "0 1 * * *"
},
{
"path": "/api/resend/summary/all",
"schedule": "0 16 * * 1"
},
{
"path": "/api/reply-tracker/disable-unused-auto-draft",
"schedule": "0 3 * * *"
}
]
Here are some easy ways to run cron jobs. Upstash is a free, easy option. I could never get the Vercel `vercel.json` crons to work. Open to PRs if you find a fix for that.
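As an alternative to Vercel crons, the same schedule can be expressed as a plain crontab (a sketch — `https://yourdomain.com` is a placeholder for your deployment, and any authentication your deployment requires on these endpoints is assumed to be handled separately):

```shell
# Re-register the Google watch daily at 01:00 (required)
0 1 * * * curl -s https://yourdomain.com/api/watch/all
# Weekly summary emails, Mondays at 16:00 (optional)
0 16 * * 1 curl -s https://yourdomain.com/api/resend/summary/all
# Disable unused auto-drafts daily at 03:00 (optional)
0 3 * * * curl -s https://yourdomain.com/api/reply-tracker/disable-unused-auto-draft
```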
When building the Docker image, you must specify your NEXT_PUBLIC_BASE_URL
as a build argument. This is because Next.js embeds NEXT_PUBLIC_*
variables at build time, not runtime.
# For production with your custom domain
docker build \
--build-arg NEXT_PUBLIC_BASE_URL="https://your-domain.com" \
-t inbox-zero \
-f docker/Dockerfile.prod .
# For local development (default)
docker build -t inbox-zero -f docker/Dockerfile.prod .
After building, run the container with your runtime secrets:
docker run -p 3000:3000 \
-e DATABASE_URL="your-database-url" \
-e AUTH_SECRET="your-auth-secret" \
-e GOOGLE_CLIENT_ID="your-google-client-id" \
-e GOOGLE_CLIENT_SECRET="your-google-client-secret" \
# ... other runtime environment variables
inbox-zero
Important: If you need to change NEXT_PUBLIC_BASE_URL
, you must rebuild the Docker image. It cannot be changed at runtime.
For more detailed Docker build instructions and security considerations, see docker/DOCKER_BUILD_GUIDE.md.
You can view open tasks in our GitHub Issues. Join our Discord to discuss tasks and check what's being worked on.
ARCHITECTURE.md explains the architecture of the project (LLM generated).