Getting Started with @neuledge/context

A step-by-step tutorial for setting up @neuledge/context — the local-first MCP documentation server that gives your AI coding assistant accurate, version-specific docs.

Your AI coding assistant just suggested getServerSideProps for your Next.js 16 project. That API was deprecated two major versions ago. Yesterday it generated Tailwind classes that don’t exist. Last week it used the old AI SDK callback pattern instead of the new agent loop API.

This isn’t a model problem — it’s a data problem. Your assistant is working from training data that’s months or years out of date. When it doesn’t have the right docs, it fills the gap with confident-sounding fiction.

@neuledge/context fixes this. It indexes library documentation into local SQLite files and serves them to your AI assistant via MCP (Model Context Protocol). No cloud service, no rate limits, sub-10ms queries. This tutorial walks you through setting it up from scratch with a real project.

Prerequisites

Before you start, you’ll need:

  • Node.js 18+ — check with node --version
  • An AI coding assistant that supports MCP — Claude Code, Cursor, VS Code with Copilot, Windsurf, or any MCP-compatible client
  • A project you’re working on — we’ll use a Next.js + Tailwind CSS stack as our example, but Context works with any library that has Markdown docs

Installing Context

Install Context globally so it’s available across all your projects:

npm install -g @neuledge/context

This installs the context CLI tool. There’s no background daemon, no system service — just a command-line tool that runs when you call it.

If you prefer not to install globally, you can use npx instead. Every command in this tutorial works with the npx @neuledge/context prefix:

npx @neuledge/context --version

Adding your first library

Let’s index the Next.js documentation. Point Context at the GitHub repo:

context add https://github.com/vercel/next.js

Context will:

  1. Shallow-clone the repo — only the docs, not the full git history
  2. Show available version tags — pick the version that matches your project (e.g., v16.0.0)
  3. Detect the docs directory — it scans for docs/, documentation/, or doc/ automatically
  4. Parse every Markdown file — extracting frontmatter, splitting content into semantically meaningful chunks by H2 headings (~800 tokens per chunk)
  5. Index into SQLite — full-text search with FTS5 and BM25 ranking, stored in a single .db file
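The H2-based chunking in step 4 can be sketched in a few lines of Python. This is an illustrative sketch, not Context's actual implementation: the `split_by_h2` helper is hypothetical, and the real chunker also extracts frontmatter and enforces the ~800-token budget per chunk.

```python
import re

def split_by_h2(markdown: str) -> list[str]:
    """Split a Markdown document into chunks at H2 ('## ') headings.

    Illustrative only: a real chunker would also parse frontmatter
    and cap each chunk at a token budget (~800 tokens in Context).
    """
    # (?m)^(?=## ) splits immediately before each line that opens an H2
    # section, so every heading stays attached to the content below it.
    parts = re.split(r"(?m)^(?=## )", markdown)
    return [part.strip() for part in parts if part.strip()]

doc = """Intro paragraph.

## Routing

How routing works.

## Middleware

How middleware works.
"""

chunks = split_by_h2(doc)
# chunks[0] is the preamble; chunks[1] starts with "## Routing",
# chunks[2] with "## Middleware".
```

Splitting at H2 boundaries is what keeps each chunk semantically self-contained: a heading plus everything written under it, rather than an arbitrary fixed-size window.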

The result is a portable database file at ~/.context/packages/nextjs@16.0.0.db. That file contains every piece of Next.js 16 documentation, pre-indexed and ready for instant queries.

Want to pin a specific version without the interactive prompt? Use the --tag flag:

context add https://github.com/vercel/next.js --tag v16.0.0

Adding multiple libraries

A real project uses more than one library. Let’s add Tailwind CSS:

context add https://github.com/tailwindlabs/tailwindcss

Pick the version tag that matches your project, and Context creates another .db file. Each library gets its own database — clean, isolated, and independently updatable.

To see everything you’ve indexed:

context list

You’ll see something like:

nextjs@16.0.0          ~/.context/packages/nextjs@16.0.0.db
tailwindcss@4.0.0      ~/.context/packages/tailwindcss@4.0.0.db

Add as many libraries as your project uses. The Vercel AI SDK, React, your component library — if it has Markdown docs in a Git repo, Context can index it.

Connecting to your editor

Context serves docs via MCP — the Model Context Protocol, an open standard backed by Anthropic, OpenAI, Google, and Microsoft. Here’s how to connect it to your editor.

Claude Code

One command:

claude mcp add context -- npx @neuledge/context mcp

Cursor

Create .cursor/mcp.json in your project root:

{
  "mcpServers": {
    "context": {
      "command": "npx",
      "args": ["@neuledge/context", "mcp"]
    }
  }
}

VS Code / Copilot

Add to .vscode/settings.json:

{
  "mcp": {
    "servers": {
      "context": {
        "command": "npx",
        "args": ["@neuledge/context", "mcp"]
      }
    }
  }
}

Requires VS Code 1.99+ with the GitHub Copilot extension.

Windsurf

Add to ~/.codeium/windsurf/mcp_config.json:

{
  "mcpServers": {
    "context": {
      "command": "npx",
      "args": ["@neuledge/context", "mcp"]
    }
  }
}

For other MCP clients, the server command is npx @neuledge/context mcp using stdio transport. See our integrations page for the full list.

Using Context in practice

Once connected, your AI assistant automatically has access to the resolve tool — it can search your indexed docs whenever it needs accurate information.

Here’s the difference in action. Say you ask your assistant: “How do I create a middleware in Next.js 16 that redirects unauthenticated users?”

Without Context: The assistant relies on training data. It might generate a middleware.ts file using the old NextResponse.redirect() pattern with the wrong import path, or reference a configuration option that was renamed two versions ago.

With Context: The assistant queries your indexed Next.js 16 docs, finds the current middleware documentation, and generates code that matches the exact API of the version you’re using. The correct imports, the current configuration format, the right patterns.

The same applies to Tailwind. Ask about a utility class and the assistant pulls from your indexed v4 docs instead of guessing based on v3 training data.

This happens transparently — your assistant calls the resolve tool when it needs docs, gets results in under 10ms, and uses them to ground its response. No extra prompting needed.
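On the wire, that tool call is an ordinary MCP JSON-RPC request sent over stdio. The envelope below follows the MCP specification's tools/call shape; the argument name inside arguments is hypothetical, since the resolve tool's exact input schema is advertised by the server at runtime:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "resolve",
    "arguments": {
      "query": "middleware redirect unauthenticated users"
    }
  }
}
```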

Tips for power users

Pin exact versions

Always match the indexed version to what’s in your package.json. If you’re on Next.js 16.0.0, index that exact tag:

context add https://github.com/vercel/next.js --tag v16.0.0

When you upgrade, add the new version. Old .db files stay around so you can switch back if needed.

Index your own docs

Context works with any Git repo that has Markdown files — including yours:

context add ./docs --name my-project --pkg-version 1.0

Index your internal API docs, runbooks, or design system documentation. Your AI assistant gets grounded access to company knowledge, completely private, no cloud service involved.

Share .db files with your team

Each documentation package is a single, self-contained .db file. You can share them:

# Build and export to a specific location
context add https://github.com/your-org/design-system \
  --name design-system --pkg-version 3.1 --save ./packages/

# Teammates install the pre-built package instantly
context add ./packages/design-system@3.1.db

Commit .db files to your repo, upload them to S3, or put them on a shared drive. No build step on the receiving end — the pre-indexed database installs instantly.

Update when a new version releases

When a library you depend on releases a new version:

context add https://github.com/vercel/next.js --tag v16.1.0

The old version’s .db file stays intact. You can keep multiple versions indexed simultaneously.

What’s happening under the hood

If you’re curious about the internals: Context uses SQLite with FTS5 (full-text search) and BM25 ranking. When your AI assistant queries for “middleware authentication,” the search engine:

  1. Tokenizes the query using Porter stemming — so “authenticating” matches “authentication”
  2. Runs FTS5 search across all indexed chunks
  3. Ranks results with BM25 — section titles are weighted 10x and doc titles 5x relative to body content
  4. Filters low-relevance results — anything below 50% of the top score gets dropped
  5. Merges adjacent chunks — so your assistant sees coherent documentation sections, not fragments
  6. Caps at a token budget — keeping the response focused without flooding the context window
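Steps 1-3 map directly onto built-in SQLite features, which you can reproduce with Python's standard sqlite3 module. This is a minimal sketch, assuming your SQLite build includes FTS5; the table name, columns, and sample rows are invented for the example and are not Context's actual schema, though the 10/5/1 weights mirror the ranking described above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The porter tokenizer stems at index and query time, so
# "authenticating" and "authentication" reduce to the same term.
conn.execute(
    "CREATE VIRTUAL TABLE chunks USING fts5("
    "section_title, doc_title, body, tokenize='porter')"
)
conn.executemany(
    "INSERT INTO chunks VALUES (?, ?, ?)",
    [
        ("Middleware", "Routing", "Run code before a request completes."),
        ("Authentication", "Middleware", "Redirect unauthenticated users."),
    ],
)

# bm25() returns lower (more negative) scores for better matches,
# so ascending order puts the best result first. The arguments after
# the table name weight section_title, doc_title, and body.
rows = conn.execute(
    "SELECT section_title, bm25(chunks, 10.0, 5.0, 1.0) AS score "
    "FROM chunks WHERE chunks MATCH ? ORDER BY score",
    ("middleware authenticating",),
).fetchall()
# Only the Authentication chunk matches both stemmed query terms.
```

FTS5 treats whitespace-separated query terms as an implicit AND, which is why only the chunk containing both "middleware" and an "authentic"-stemmed term survives the match.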

Total latency: under 10ms. Compare that to 100-500ms for a cloud round-trip.

If you also need live data access beyond static documentation — product catalogs, pricing, inventory — check out @neuledge/graph, which provides a semantic data layer for AI agents with pre-cached, sub-100ms responses.

Get started now

Install Context, index the docs for your current project, and connect to your editor. The whole setup is three commands:

npm install -g @neuledge/context
context add https://github.com/vercel/next.js
claude mcp add context -- npx @neuledge/context mcp

Your AI coding assistant just went from hallucinating outdated APIs to having instant, offline access to the exact documentation it needs.