@neuledge/context
Local-first docs for AI agents
Context indexes library documentation into portable SQLite files and serves them via MCP — no cloud, no rate limits, sub-10 ms queries. Your AI coding assistant gets accurate, version-specific docs every time.
<10ms
Query latency
0
Network calls at runtime
∞
Queries per hour
Free
Open source (Apache 2.0)
How It Works
Three steps to give your AI agent accurate, version-specific documentation.
1. Add a Library
Point Context at any Git repo. It shallow-clones the docs, lets you pick a version tag, and parses every Markdown file into semantically chunked sections.
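The "semantically chunked sections" idea can be pictured with a minimal sketch. The heading-based splitting heuristic below is illustrative only, not Context's actual parser:

```python
import re

def chunk_markdown(text: str) -> list[dict]:
    """Split a Markdown document into one chunk per heading section.

    Illustrative heuristic: a real semantic chunker also handles code
    fences, front matter, nesting, and chunk-size limits.
    """
    chunks, heading, lines = [], "(intro)", []
    for line in text.splitlines():
        m = re.match(r"^(#{1,6})\s+(.*)", line)
        if m:
            if lines:  # flush the previous section
                chunks.append({"heading": heading, "body": "\n".join(lines).strip()})
            heading, lines = m.group(2), []
        else:
            lines.append(line)
    if lines:
        chunks.append({"heading": heading, "body": "\n".join(lines).strip()})
    return chunks

doc = "# Install\nRun npm install.\n## Usage\nImport the package."
for c in chunk_markdown(doc):
    print(c["heading"], "->", c["body"])
```

Each chunk keeps its heading alongside its body, so a later search can return a self-contained, citable section rather than an arbitrary slice of text.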
2. Index into SQLite
All chunks are indexed with FTS5 full-text search and BM25 ranking into a single portable .db file. Build once, share with your entire team.
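FTS5 is SQLite's built-in full-text index, and its MATCH queries are ranked by BM25 out of the box. A standalone sketch of the same idea (the table and column names here are made up, not Context's actual schema):

```python
import sqlite3

# In-memory DB for the demo; Context writes a portable .db file instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE chunks USING fts5(heading, body)")
conn.executemany(
    "INSERT INTO chunks VALUES (?, ?)",
    [
        ("Routing", "File-based routing maps files to URL paths."),
        ("Data Fetching", "Fetch data on the server with async components."),
    ],
)

# FTS5's hidden `rank` column is BM25 relevance; lower = better match.
rows = conn.execute(
    "SELECT heading FROM chunks WHERE chunks MATCH ? ORDER BY rank",
    ("routing",),
).fetchall()
print(rows)  # [('Routing',)]
```

Because the whole index lives in one ordinary SQLite file, copying the .db is all the "deployment" a teammate needs.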
3. Query via MCP
Start the MCP server and your AI coding assistant — Claude, Cursor, VS Code Copilot — gets sub-10 ms access to accurate, version-specific documentation.
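Under the hood, MCP clients speak JSON-RPC 2.0 and invoke server tools with a `tools/call` request. A sketch of what such a request could look like; only the JSON-RPC envelope and the `tools/call` method come from the MCP spec, while the tool name `search_docs` and its arguments are hypothetical:

```python
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",  # standard MCP method for invoking a tool
    "params": {
        "name": "search_docs",  # hypothetical tool name
        "arguments": {"library": "next.js", "query": "app router caching"},
    },
}
print(json.dumps(request, indent=2))
```

The assistant never needs to know about SQLite: it sees a tool, calls it, and gets ranked documentation chunks back in the response.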
Why Local-First
Cloud doc services charge per query and enforce rate limits. When you hit them mid-session, your AI falls back to stale training data. Context eliminates that failure mode.
No Rate Limits
Everything runs locally. Your AI agent can query docs as often as it needs — hundreds of lookups per session — without ever being throttled or cut off.
Works Offline
Once you've built a .db file, no internet is required. Airplane mode, air-gapped networks, or just a flaky connection — Context keeps working.
Version-Specific Docs
Pick the exact version tag you're working with. No more getting Next.js 14 patterns when your project uses Next.js 16.
100% Private
No cloud service sees your queries, your project structure, or your internal documentation. Everything stays on your machine.
Portable & Shareable
Each library compiles into a single .db file. Check it into your repo or share it on a drive — everyone gets the same indexed docs with zero setup.
Internal Docs Too
Index proprietary wikis, runbooks, or design system docs. Your AI assistant gets grounded access to company knowledge — for free, with no token fees.
Context vs. Cloud Services
A side-by-side comparison with typical cloud documentation services.
| | Context | Cloud Services |
|---|---|---|
| Price | Free | $10+/month |
| Rate limits | None | 60 req/hour typical |
| Latency | <10ms | 100–500ms |
| Works offline | Yes | No |
| Privacy | 100% local | Cloud-processed |
| Private repos | Free | $15/1M tokens |
| Version pinning | Exact tags | Latest only |
Use Cases
Any workflow where an AI agent needs accurate, up-to-date library documentation.
AI Coding Assistants
Give Claude, Cursor, or Copilot instant access to the exact framework docs for your stack — no hallucinated APIs, no outdated patterns.
Team Onboarding
Ship pre-built .db files with your project so every developer's AI assistant has the same accurate docs from day one.
CI/CD Pipelines
Use Context in automated workflows — code review bots, doc validation, or AI-powered migration scripts that need version-accurate references.
Internal Documentation
Index your own internal docs, wikis, or runbooks into a .db file and give your AI assistant grounded access to company knowledge.
Get Started
No sign-up, no API keys, no cloud dependency.
Install
Add documentation
Start the MCP server
Or use without installing
Want a hands-on walkthrough? Read our step-by-step tutorial →