by AdbC99
Provides repeatable, reliable lookup of biblical verses for large language models via an MCP server and an OpenAI‑compatible completions API, enabling consistent and reproducible results for research and educational applications.
ai-Bible offers a Model Context Protocol (MCP) server that delivers precise Bible verse retrieval for LLMs, along with a Dockerized service that mimics the OpenAI completions endpoint. It aims to make biblical data accessible to AI systems in a reliable, repeatable manner for scholarly and teaching purposes.
Getting started:
- Build the project: npm run build (if a build script exists).
- Build the Docker image: docker build -f completions/Dockerfile -t mcp-server .
- Run the container: docker run -p 8002:8000 mcp-server
- Open http://localhost:8002/docs to test the get-verse endpoint.
- Configure Claude Desktop with the mcp-server.stdio.js file located in the build folder.
- Visit http://ai-bible.com for pocket-Bible access.
Q: Do I need an OpenAI API key?
A: No, the Docker container provides a self-hosted completions endpoint that works offline.
Q: Which languages are supported?
A: The API currently accepts english as the language parameter; additional languages can be added via data files.
Q: Can I use other LLMs besides Claude?
A: Yes, any model that can call an MCP server or the OpenAI-compatible endpoint can be configured.
Q: How is the data licensed?
A: The source code is GPL v3; the biblical text files have their own licences, detailed in LICENCE.md.
Q: Where can I see the API documentation?
A: Visit http://localhost:8002/docs after starting the Docker container.
ai-Bible is a project that explores the use of AI in interpreting and understanding biblical text. This repository contains MCP servers and a container compatible with the OpenAI completions API, which let an AI or large language model reliably and repeatably look up data so that it can be presented in different forms for research or educational purposes, with some confidence that results will be reproducible and reasonable.
For a web-accessible front end in the form of a pocket Bible, see http://ai-bible.com
The mcp-server folder contains the current implementation of a server for repeatably and reliably retrieving Bible verses when using LLMs. Claude Desktop can be configured to use the mcp-server.stdio.js file, built into the build folder of this project, as an MCP server.
See the README.md in that subfolder for detailed information.
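For example, Claude Desktop can be pointed at the built file through its claude_desktop_config.json. This is a minimal sketch: the server name ai-bible is arbitrary, and the absolute path is a placeholder for wherever you cloned the project.

{
  "mcpServers": {
    "ai-bible": {
      "command": "node",
      "args": ["/path/to/ai-bible/build/mcp-server.stdio.js"]
    }
  }
}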
The Docker container wraps the MCP server using mcpo in order to turn it into a server supporting the OpenAI completions API. Run these commands from the project root after building the mcp-server:
docker build -f completions/Dockerfile -t mcp-server .
docker run -p 8002:8000 mcp-server
You can check that it is running by visiting the Swagger API page:
http://localhost:8002/docs
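You can also fetch the generated OpenAPI schema from the command line; this assumes mcpo serves a FastAPI-style schema at /openapi.json alongside the /docs page above:

curl http://localhost:8002/openapi.json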
Try the get-verse API with these parameters:
{
"reference": ["Gen.1.1", "Gen.2.1"],
"language": "english"
}
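For instance, you can call the endpoint with curl. This sketch assumes mcpo's usual mapping of each MCP tool to a POST endpoint named after the tool:

curl -X POST http://localhost:8002/get-verse \
  -H "Content-Type: application/json" \
  -d '{"reference": ["Gen.1.1", "Gen.2.1"], "language": "english"}'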
One way to access the completions API is via Open WebUI; then you can do everything locally with an LLM served by Ollama, using a model such as Llama 3.1 8B. See:
https://docs.openwebui.com/getting-started/quick-start/
Contributions are welcome! Please feel free to submit a pull request or open an issue for any enhancements or bug fixes.
This project's source code is under the GNU GPL v3 licence. Within the project there are data files that come under different licences. See the file LICENCE.md for details of the GPL licence.
Discover more MCP servers with similar functionality and use cases
by topoteretes
Enables AI agents to store, retrieve, and reason over past conversations, documents, images, and audio transcriptions by loading data into graph and vector databases with minimal code.
by basicmachines-co
Basic Memory is a local-first knowledge management system that allows users to build a persistent semantic graph from conversations with AI assistants. It addresses the ephemeral nature of most LLM interactions by providing a structured, bi-directional knowledge base that both humans and LLMs can read and write to.
by smithery-ai
mcp-obsidian is a connector that allows Claude Desktop to read and search an Obsidian vault or any directory containing Markdown notes.
by qdrant
Provides a semantic memory layer on top of the Qdrant vector search engine, enabling storage and retrieval of information via the Model Context Protocol.
by GreatScottyMac
A database‑backed MCP server that stores project decisions, progress, architecture, custom data, and vector embeddings, allowing AI assistants in IDEs to retrieve precise, up‑to‑date context for generation tasks.
by StevenStavrakis
Enables AI assistants to read, create, edit, move, delete, and organize notes and tags within an Obsidian vault.
by mem0ai
Provides tools to store, retrieve, and semantically search coding preferences via an SSE endpoint for integration with MCP clients.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, retrieval, RAG, and publishing capabilities across a wide range of data sources and tools.
by chroma-core
Provides vector, full‑text, and metadata‑based retrieval powered by Chroma for LLM applications, supporting in‑memory, persistent, HTTP, and cloud clients as well as multiple embedding functions.