A VS Code extension that analyzes any open codebase and generates a structured `PROJECT_OVERVIEW.md` at the project root. It detects languages, frameworks, entry points, and directory structure — entirely offline. Optionally, it uses GitHub Copilot to add a plain-English narrative summary.
Every developer has opened an unfamiliar repository and immediately felt lost. The questions are always the same: what language is this, which frameworks does it use, and where does execution start? The answers live scattered across `package.json`, `requirements.txt`, `Cargo.toml`, and folder structures. Reading all of that on a fresh checkout takes time that should go toward actual work. This extension eliminates that friction and produces an artifact you can commit alongside the code, so the next developer has it ready immediately.
A multi-stage analysis pipeline runs entirely on your local machine:

1. **Workspace scan:** recursively scans the workspace from the root, respecting configurable depth and exclude-directory settings.
2. **Manifest parsing:** reads `package.json`, `Cargo.toml`, `go.mod`, `pyproject.toml`, and more to extract dependencies and metadata.
3. **Framework detection:** matches dependency names and config files against known patterns: React, Vue, Django, Laravel, Express, and many more.
4. **Entry-point detection:** identifies likely entry files using common conventions: `src/index.ts`, `main.go`, `app.py`, and others.
5. **Source tree:** finds the primary source directory and renders it as a depth-limited ASCII tree, directories sorted before files.
6. **AI summary (optional):** sends the generated markdown to GitHub Copilot or OpenAI for a plain-English summary.

Steps 1–5 run fully offline; the AI summary is best-effort and skipped on timeout.
All commands are available via the Command Palette (`Cmd+Shift+P`) under the **Project Analysis** category.
The primary command. Runs the full analysis pipeline and writes `PROJECT_OVERVIEW.md` to the workspace root, prompting to overwrite if the file already exists. The progress notification is cancellable: clicking **Cancel** skips the AI summary and writes the static overview immediately.
Identical to the primary command, but silently overwrites any existing `PROJECT_OVERVIEW.md` without prompting. Useful for re-running the analysis after the project has changed.
Opens an interactive Q&A session powered by an LLM, loading the existing `PROJECT_OVERVIEW.md` as grounding context for the conversation. The agent runs in a loop until you type `exit`.
Run the command in any open folder and you'll get a `PROJECT_OVERVIEW.md` like this:
```markdown
# Project Overview

## 🤖 AI Summary

A server-rendered e-commerce application built on Next.js with a Stripe
integration. Uses TypeScript throughout and Prisma for database access.

---

## Basic Information

**Name:** my-app
**Version:** 1.2.0
**Type:** Application (Build-enabled)
**Primary Language:** TypeScript

## Frameworks & Libraries

- React 18
- Express 4
- Prisma

## Entry Points

- `src/index.ts`
- `src/server.ts`

## 📁 Source Structure

src/
├── index.ts
├── server.ts
├── components/
│   ├── Header.tsx
│   └── Footer.tsx
└── routes/
    └── api.ts

---

*Generated by Explain This Project — 2026-03-26T12:00:00.000Z*
```
The AI Summary section is omitted when Copilot is unavailable, the request times out, or you click Cancel.
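A depth-limited ASCII tree like the one in the example can be rendered with logic along these lines. This is a sketch; the `TreeNode` shape and the sorting rule are assumptions based on the behavior described above, not the extension's actual code:

```typescript
// Sketch of an ASCII tree renderer: directories sorted before files,
// then alphabetical; directories get a trailing slash.
type TreeNode = { name: string; children?: TreeNode[] };

function renderTree(node: TreeNode, prefix = ""): string[] {
  const kids = [...(node.children ?? [])].sort(
    (a, b) =>
      Number(!!b.children) - Number(!!a.children) || a.name.localeCompare(b.name)
  );
  const lines: string[] = [];
  kids.forEach((kid, i) => {
    const last = i === kids.length - 1;
    lines.push(prefix + (last ? "└── " : "├── ") + kid.name + (kid.children ? "/" : ""));
    if (kid.children) {
      // indent children under the branch character we just drew
      lines.push(...renderTree(kid, prefix + (last ? "    " : "│   ")));
    }
  });
  return lines;
}
```

The prefix accumulates `│   ` for branches that continue and four spaces for closed ones, which is what produces the connector lines in the example.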
All settings are under the `explainThisProject` namespace. Configure them in VS Code Settings or `.vscode/settings.json`.
```json
{
  "explainThisProject.llmProvider": "copilot",
  "explainThisProject.openaiApiKey": "",
  "explainThisProject.aiSummaryTimeoutSeconds": 30,
  "explainThisProject.includeDevDependencies": true,
  "explainThisProject.maxDirectoryDepth": 3,
  "explainThisProject.excludeDirectories": [
    "node_modules", ".git", "dist", "build", "coverage"
  ]
}
```
| Setting | Type | Default | Description |
|---|---|---|---|
| `llmProvider` | `"copilot"` \| `"openai"` | `"copilot"` | Which LLM backend to use for the AI summary and Ask Questions |
| `openaiApiKey` | `string` | `""` | OpenAI API key. Required only when `llmProvider` is `"openai"`. Set in user settings, never workspace settings. |
| `aiSummaryTimeoutSeconds` | `number` (5–120) | `30` | How long to wait for the AI summary before skipping it and writing the static file anyway |
| `includeDevDependencies` | `boolean` | `true` | Whether to include `devDependencies` in the analysis output |
| `maxDirectoryDepth` | `number` (1–10) | `3` | Maximum depth for the file system walk |
| `excludeDirectories` | `string[]` | `["node_modules", ...]` | Directory names to skip entirely during the walk |
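If you generate these settings from your own tooling, one plausible way to respect the documented numeric ranges (timeout 5–120 seconds, depth 1–10) is a simple clamp. This is a hypothetical helper, not part of the extension's API:

```typescript
// Hypothetical helper: clamp a numeric setting into a documented range.
const clamp = (value: number, min: number, max: number): number =>
  Math.min(max, Math.max(min, value));

const timeoutSeconds = clamp(300, 5, 120); // 300 is out of range, clamps to 120
const maxDepth = clamp(0, 1, 10);          // 0 is below the minimum, clamps to 1
```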
### Fully Analyzed (manifest-based)

| Language | Manifest |
|---|---|
| JavaScript / TypeScript | `package.json` |
| Python | `requirements.txt`, `pyproject.toml` |
| Rust | `Cargo.toml` |
| Go | `go.mod` |
| PHP | `composer.json` |
### Basic Detection (file extensions)

Detected by source file extensions. No manifest parsing is performed for these languages.
### JS/TS Framework Detection

React, Vue, Svelte, Next.js, Nuxt, Angular, Express, Fastify, NestJS, Vite, Webpack, Jest, Vitest, and more.
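Detection of this kind boils down to matching manifest dependency names against a lookup table. A minimal sketch, where the pattern list is a small hypothetical subset of what the extension recognizes:

```typescript
// Sketch of JS/TS framework detection from package.json dependencies;
// the pattern table is illustrative, not the extension's full list.
const KNOWN_FRAMEWORKS: Record<string, string> = {
  react: "React",
  vue: "Vue",
  svelte: "Svelte",
  next: "Next.js",
  express: "Express",
  vite: "Vite",
  jest: "Jest",
};

interface Manifest {
  dependencies?: Record<string, string>;
  devDependencies?: Record<string, string>;
}

function detectFrameworks(manifest: Manifest): string[] {
  // devDependencies are merged in to mirror includeDevDependencies: true
  const deps = { ...manifest.dependencies, ...manifest.devDependencies };
  return Object.keys(deps)
    .filter((name) => name in KNOWN_FRAMEWORKS)
    .map((name) => KNOWN_FRAMEWORKS[name]);
}
```

For example, a manifest with `react` and `express` in `dependencies` and `jest` in `devDependencies` would yield all three frameworks.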
An OpenAI API key is required only when `llmProvider` is `"openai"`. Static analysis (without the AI summary) works entirely offline — no account or API key required.
The core analysis runs entirely on your machine. No source code is ever transmitted to any external service.
Set `openaiApiKey` in user settings, not workspace settings, to avoid committing it.

Install from the VS Code Marketplace — free, open source, no account required for the core analysis.