
Prelude

Machine-readable codebase context for AI. Stop repeating yourself to LLMs—export your project's stack, patterns, and decisions in one command.

$ npm install -g prelude-context

# Initialize in your project
$ prelude init

# Generate AI-optimized context
$ prelude export
# Copied to clipboard ✓

Paste into Claude, ChatGPT, or any LLM.

The Problem

Every AI conversation starts the same way: explaining your stack, architecture, patterns, and constraints. Again. And again.

Prelude captures this context once, updates it intelligently, and exports it in a format optimized for LLM comprehension.

What Prelude Captures

01. Technology Stack

Language, runtime, frameworks, database, and ORM, auto-detected from your package files and configuration.

02. Architecture

Monorepo structure, component patterns, key directories, and how your codebase is organized.

03. Constraints

Must-use technologies, preferences, and rationale: the rules your team follows.

04. Decisions

Architecture Decision Records: why you chose Drizzle over Prisma, or Server Components over client state.

05. Changelog

Project timeline and recent changes, giving AI assistants temporal context about your work.

06. Work Sessions

Track development sessions with prelude watch for continuity across AI conversations.

Core Commands

prelude init

Analyzes your codebase and creates a .context/ directory with project metadata, stack info, architecture patterns, constraints, decisions, and changelog.
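The exact file layout is not specified here, but a generated .context/ directory plausibly looks something like the following (file names are illustrative, not a guarantee of the tool's actual output):

```
.context/
├── project.json       # name, description, metadata
├── stack.json         # detected language, runtime, frameworks, database, ORM
├── architecture.json  # directory layout and component patterns
├── constraints.json   # must-use technologies and team rules
├── decisions/         # Architecture Decision Records
└── changelog.json     # project timeline and recent changes
```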

prelude export

Generates a markdown document optimized for LLMs—automatically copied to clipboard. Paste into any AI conversation.
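The precise export format is not documented here, but the generated markdown presumably reads along these lines (project name and all values are illustrative):

```markdown
# Project Context: my-app

## Stack
- Language: TypeScript (Node.js)
- Database: PostgreSQL via Drizzle

## Constraints
- Prefer Server Components over client state

## Decisions
- Chose Drizzle over Prisma (see ADR for rationale)
```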

prelude update

Re-analyzes and intelligently updates context. Preserves your manual edits while updating inferred data. Use --dry-run to preview.


prelude decision

Log architecture decisions with rationale. Opens your editor for documenting why you made specific technical choices.

prelude watch

Monitors file changes and logs your work session. Press Ctrl+C when done to save—great for session continuity.

Why Prelude

The comparison covers six criteria: automatic inference, a standards-based format, human-readable files, machine-optimized output, version-controlled context, and preservation of manual edits. Prelude covers all six; manual documentation and other tools each cover only a subset.

Use Cases

Starting Conversations

Paste your export to instantly give any LLM full context about your project. No more re-explaining your stack.

Debugging

Share context + error + file path. AI understands your patterns and constraints to give relevant solutions.

Architecture Decisions

Ask "Should I use X or Y?" and get answers that consider your existing patterns, not generic advice.

Team Onboarding

New team members can understand stack, patterns, and past decisions instantly via the .context/ directory.

FAQ

Should I commit .context/ to git?

Yes. The context is part of your project documentation. Exception: .context/*.session.json should be gitignored.
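That exception can be applied with a single .gitignore entry, for example:

```shell
# Commit the context, but keep per-session logs out of version control
echo '.context/*.session.json' >> .gitignore
```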

What happens to my manual edits?

They're preserved. Prelude tracks which fields are inferred vs. manually edited. Updates only change auto-detected data.
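The on-disk schema is not documented here, but field-level provenance tracking of this kind is commonly done with a per-field source marker; a purely hypothetical sketch:

```json
{
  "database": { "value": "PostgreSQL", "source": "inferred" },
  "orm":      { "value": "Drizzle",    "source": "manual" }
}
```

Under such a scheme, fields marked "manual" would be left untouched by prelude update, while "inferred" entries are free to be rewritten on re-analysis.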

Does this work with non-JavaScript projects?

Yes. While the CLI is built with Node.js, the Prelude format works with any language. Inference currently focuses on JavaScript/TypeScript.

Can I use this with any LLM?

Yes. The export format is optimized for Claude, ChatGPT, Gemini, and any text-based AI assistant.

Design Principles

Human-readable first

All files are readable JSON/Markdown

AI-optimized

Structured for maximum LLM effectiveness

Standards-based

Open spec, not proprietary format

Zero lock-in

Edit files manually, use any tool

Incremental adoption

Works with partial information

Preserve intent

Never lose manual customizations

Ready to stop repeating yourself?

Install Prelude and export your first context in under a minute.

MIT License — Free and open source