EQBook is a NotebookLM-style AI workspace that runs 100% on your device. Add PDFs, documents, audio, and video, then ask questions and get cited answers from locally run Gemma 4 models. No cloud. No accounts. Your data never leaves your device.
Free · All inference on-device · No account required
Most AI tools send your documents to remote servers. EQBook is different: every piece of inference — reading, embedding, summarising, answering — happens locally on your Apple Silicon chip.
Import PDFs, Word docs, plain text, Markdown files, or audio/video. EQBook copies files locally — nothing is uploaded anywhere.
Audio and video are transcribed by Whisper. All text is chunked, embedded via MiniLM, and stored in a local vector database.
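The ingest step above (split text, embed, store) can be sketched roughly as follows. This is an illustrative chunker only: the chunk size and overlap values are assumptions, not EQBook's actual settings, and in the real pipeline each chunk would be embedded by MiniLM before being written to the vector store.

```python
# Sketch of the ingest step: split extracted text into overlapping,
# word-based chunks sized for an embedding model. The sizes below are
# illustrative defaults, not EQBook's real configuration.

def chunk_text(text: str, chunk_size: int = 200, overlap: int = 40) -> list[str]:
    """Split `text` into chunks of up to `chunk_size` words,
    repeating `overlap` words between neighbours for context."""
    words = text.split()
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(words), step):
        chunk = " ".join(words[start:start + chunk_size])
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(words):
            break
    return chunks
```

The overlap means a sentence cut at a chunk boundary still appears whole in the next chunk, which keeps retrieval from missing answers that straddle two chunks.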
Type your question. EQBook retrieves the most relevant passages (RAG) and streams a Gemma 4 answer with clickable inline citations.
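The retrieval half of that RAG loop can be sketched as a cosine-similarity top-k search over the stored chunk embeddings. The vectors and chunk IDs here are placeholders (real embeddings would come from the on-device MiniLM model); the structure, scoring every chunk against the question and keeping the best few, is the standard approach.

```python
import math

# Sketch of RAG retrieval: score each stored chunk embedding against
# the question embedding by cosine similarity, keep the top-k.
# Embeddings are placeholder lists of floats, not real MiniLM output.

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec: list[float], chunks: list[tuple], k: int = 3) -> list[tuple]:
    """chunks: list of (chunk_id, text, embedding).
    Returns the k chunks most similar to the query."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[2]), reverse=True)
    return ranked[:k]
```

Because each retrieved passage keeps its chunk ID, the generated answer can attach those IDs as the clickable inline citations described above.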
Browse key topics and summaries in the Insights panel. Semantic search lets you scan every source at once by meaning, not just keyword.
EQBook automatically detects your device memory and selects the appropriate Gemma 4 variant at first launch. You can always upgrade manually.
| Device | RAM | Model | Size | Context |
|---|---|---|---|---|
| MacBook Air / Mac mini | 8–16 GB | gemma-4-e4b-it-4bit | ~5.2 GB | 128K tokens |
| Mac with ≥ 16 GB (recommended) | ≥ 16 GB | gemma-4-26b-a4b-it-4bit | ~15.6 GB | 256K tokens |
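The first-launch selection logic implied by the table reduces to a simple RAM threshold. This is a sketch of that rule, not EQBook's actual code; the model names and context windows are taken from the table above.

```python
# Sketch of first-launch model selection by installed RAM, mirroring
# the table above. Illustrative only, not EQBook's implementation.

def pick_model(ram_gb: float) -> tuple[str, str]:
    """Return (model_name, context_window) for the given device RAM in GB."""
    if ram_gb >= 16:
        return ("gemma-4-26b-a4b-it-4bit", "256K tokens")
    return ("gemma-4-e4b-it-4bit", "128K tokens")
```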
Models are downloaded once at first launch and stored locally on your device. Additional on-device models are currently under evaluation and will be made available in future releases.
Start for free. Upgrade when your team needs enterprise features.
Forever free. No credit card.
Free download. No account. Runs entirely on your Apple Silicon Mac.