@neuledge/context

Local-first docs for AI agents

Context indexes library documentation into portable SQLite files and serves them via MCP — no cloud, no rate limits, sub-10 ms queries. Your AI coding assistant gets accurate, version-specific docs every time.

$ npx @neuledge/context install npm/next 15.0
$ npx @neuledge/context install pip/django

116+

Pre-built packages

<10ms

Query latency

0

Network calls at runtime

Unlimited

Queries per hour

Free

Open source (Apache 2.0)

How It Works

Three steps to give your AI agent accurate, version-specific documentation.

1. Install a Package

Install from the community registry — 116+ pre-built packages across npm, pip, and maven. Context parses Markdown, reStructuredText, and AsciiDoc into semantically chunked sections. Or build from any Git repo with context add.

$ context install npm/react
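The "semantically chunked sections" step can be sketched as heading-based splitting. This is a simplified, hypothetical illustration, not Context's actual parser, which also handles reStructuredText and AsciiDoc:

```python
import re

def chunk_markdown(text: str) -> list[dict]:
    """Split a Markdown document into one chunk per heading.

    Illustrative sketch only: each ATX heading starts a new section,
    and the lines beneath it become that section's body.
    """
    chunks = []
    current = {"heading": "", "body": []}
    for line in text.splitlines():
        m = re.match(r"^(#{1,6})\s+(.*)", line)
        if m:
            # A new heading closes the previous chunk.
            if current["heading"] or current["body"]:
                chunks.append(current)
            current = {"heading": m.group(2), "body": []}
        else:
            current["body"].append(line)
    if current["heading"] or current["body"]:
        chunks.append(current)
    return [
        {"heading": c["heading"], "body": "\n".join(c["body"]).strip()}
        for c in chunks
    ]

doc = "# useState\nReturns a stateful value.\n\n## Caveats\nCall it during render only."
for chunk in chunk_markdown(doc):
    print(chunk["heading"])
```

Each chunk keeps its heading alongside its body, which is what makes per-section full-text search useful in the next step.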

2. Index into SQLite

All chunks are indexed with FTS5 full-text search and BM25 ranking into a single portable .db file. Build once, share with your entire team.

$ ls .context/nextjs@16.0.0.db
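The FTS5-plus-BM25 setup described above can be reproduced with nothing but Python's standard library. The table name and columns here are hypothetical, not Context's actual schema, but the search mechanics are the same:

```python
import sqlite3

# An in-memory stand-in for a .db file; schema is illustrative only.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE chunks USING fts5(heading, body)")
db.executemany(
    "INSERT INTO chunks VALUES (?, ?)",
    [
        ("useState", "Returns a stateful value and a setter function."),
        ("useEffect", "Runs a side effect after render."),
        ("useMemo", "Memoizes an expensive computation between renders."),
    ],
)

# bm25() is FTS5's built-in ranking function; lower scores rank higher.
rows = db.execute(
    "SELECT heading FROM chunks WHERE chunks MATCH ? ORDER BY bm25(chunks)",
    ("stateful",),
).fetchall()
print(rows)  # [('useState',)]
```

Because the whole index lives in one SQLite file, queries like this are a local disk read, which is where the sub-10 ms latency comes from.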

3. Query via MCP

Start the MCP server and your AI coding assistant — Claude, Cursor, VS Code Copilot — gets sub-10 ms access to accurate, version-specific documentation. Use --http for team-wide sharing.

$ npx @neuledge/context serve --http 3000

Why Local-First

Cloud doc services charge per query and enforce rate limits. When you hit them mid-session, your AI falls back to stale training data. Context eliminates that failure mode.

No Rate Limits

Everything runs locally. Your AI agent can query docs as often as it needs — hundreds of lookups per session — without ever being throttled or cut off.

Works Offline

Once you've built a .db file, no internet is required. Airplane mode, air-gapped networks, or just a flaky connection — Context keeps working.

Version-Specific Docs

Pick the exact version tag you're working with. No more getting Next.js 14 patterns when your project uses Next.js 16.

100% Private

No cloud service sees your queries, your project structure, or your internal documentation. Everything stays on your machine.

Portable & Shareable

Each library compiles into a single .db file. Check it into your repo, share on a drive, or run context serve --http to share a single server with your entire team — every AI assistant connects to the same docs.

Any Documentation Format

Context parses Markdown, reStructuredText, and AsciiDoc — covering Python, Java, and any ecosystem with structured documentation. Not just JavaScript anymore.

Internal Docs Too

Index proprietary wikis, runbooks, or design system docs. Your AI assistant gets grounded access to company knowledge — for free, with no token fees.

Context vs. Cloud Services

A side-by-side comparison with typical cloud documentation services.

                  Context       Cloud Services
Price             Free          $10+/month
Rate limits       None          60 req/hour typical
Latency           <10ms         100–500ms
Works offline     Yes           No
Privacy           100% local    Cloud-processed
Private repos     Free          $15/1M tokens
Version pinning   Exact tags    Latest only

Use Cases

Any workflow where an AI agent needs accurate, up-to-date library documentation.

AI Coding Assistants

Give Claude, Cursor, or Copilot instant access to the exact framework docs for your stack — React, Django, Spring Boot, or any of the 116+ packages. No hallucinated APIs, no outdated patterns.

Team Onboarding

Ship pre-built .db files with your project or run context serve --http so every developer's AI assistant connects to the same accurate docs from day one.

CI/CD Pipelines

Use Context in automated workflows — code review bots, doc validation, or AI-powered migration scripts that need version-accurate references.

Internal Documentation

Index your own internal docs, wikis, or runbooks into a .db file and give your AI assistant grounded access to company knowledge.
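Indexing a directory of internal Markdown docs into a searchable file can be sketched in a few lines. The schema and function here are hypothetical, for illustration; Context's own indexer chunks by section and uses its own layout:

```python
import sqlite3
import tempfile
from pathlib import Path

def index_docs(doc_dir: Path, db_path: str) -> int:
    """Index every Markdown file under doc_dir into an FTS5 table.

    Illustrative only: one row per file, searchable by path and body.
    Returns the number of files indexed.
    """
    db = sqlite3.connect(db_path)
    db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS docs USING fts5(path, body)")
    count = 0
    for md in doc_dir.rglob("*.md"):
        db.execute("INSERT INTO docs VALUES (?, ?)", (str(md), md.read_text()))
        count += 1
    db.commit()
    db.close()
    return count

# Build a tiny runbook corpus and index it into a single .db file.
with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    (root / "deploy.md").write_text("# Deploy\nRun the blue-green rollout.")
    (root / "oncall.md").write_text("# On-call\nPage the platform team first.")
    n = index_docs(root, str(root / "internal.db"))
    print(n)  # 2
```

The resulting file is self-contained, so it can be checked into the repo or shared like any other build artifact.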

Get Started

No sign-up, no API keys, no cloud dependency.

Install

$ npm install -g @neuledge/context

Install a package from the registry

$ context install npm/react

Or build from any Git repo

$ context add https://github.com/vercel/next.js

Start the MCP server

$ context mcp

For team sharing: context serve --http 3000

Or use without installing

$ npx @neuledge/context install npm/react