April 6, 2026 · 5 min read

Scanning Your Dev Environment in 200ms

Every new machine, every onboarding, every debugging session starts the same way: which tools are actually installed, and what versions? We built a CLI that answers in under a second.


Greyforge Thesis

If you cannot audit your dev environment in one command, you do not know what you are shipping from.

Every CI failure, every works-on-my-machine moment, every onboarding delay traces back to unknown tool state. Make it visible.

>_The problem is not missing tools. It is not knowing what you have.

Every developer has been there. A build fails because the wrong Python is first on PATH. A teammate cannot reproduce a bug because they have Node 18 while you have Node 22. A CI runner silently uses an ancient version of Docker. The information exists on every machine, but no single command surfaces it.

The usual response is a shell script. One-off, fragile, hardcoded to whatever the author needed that week. It checks five tools, misses twenty, and parses version output with brittle string splits that break when the format changes. Greyforge ran exactly such a script internally. It worked until it did not.

devcap replaces that script with a proper tool. It scans over 100 development tools across 14 categories, extracts version numbers with a battle-tested regex, and outputs structured results in text, JSON, or Markdown. Zero dependencies. Python 3.11+. One command.

>_What 200ms buys you.

The scan runs every tool check in parallel using a thread pool. On a typical workstation with 40+ tools installed, the full audit completes in under 200 milliseconds. That speed matters because it means the scan can run at the start of CI pipelines, onboarding scripts, or debugging sessions without anyone noticing the cost.

The engine uses shutil.which() for detection, which respects PATH ordering and avoids false positives from vendored binaries buried in node_modules or .venv directories. Tools with platform-specific names like fd/fdfind or bat/batcat are handled through an alias system rather than platform detection hacks.

Languages: python3, node, go, rustc, java
Package Managers: pip, npm, cargo, brew
Version Control: git, gh, git-lfs
Containers: docker, podman, kubectl, helm
Editors & IDEs: code, nvim, cursor
Linters & Formatters: ruff, eslint, prettier, black

>_Profiles turn a scanner into a gate.

A full scan is useful for auditing. But the real operational value comes from profiles. A profile is a TOML file that declares which tools matter for a given context and which are required. Run devcap check --profile python-dev and the tool exits with code 1 if Python, pip, or git are missing. Use it as a CI gate, an onboarding validator, or a pre-deploy sanity check.

Six profiles ship out of the box: full, python-dev, node-dev, rust-dev, devops, and sysadmin. Custom profiles use the same TOML format, so teams can define their own stack requirements and check them in alongside their code. The profile inherits tool detection config from a built-in registry, so a three-line TOML entry is enough to add a custom tool.
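A custom entry might look like the following; the field names here are guesses based on the article's description, not devcap's documented schema:

```toml
# Hypothetical custom-tool entry; field names are assumptions.
[tools.terragrunt]
command = "terragrunt"
version_args = ["--version"]
```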

>_Design under constraint.

devcap follows the same release discipline as memory-quality-gate: zero runtime dependencies, deterministic behavior, and a clean boundary between the useful public tool and the private systems that motivated it. The internal scan script that preceded this release knew about proprietary services, internal network paths, and agent infrastructure. None of that belongs in a public utility.

What remains is deliberately narrow. Detect tools. Report versions. Gate on requirements. No version comparison engine, no auto-install, no dependency resolution. Those are different problems with different failure modes. The version format zoo alone makes comparison fragile enough to justify deferring it. The right first release is the one that does one thing reliably.

>_What comes next.

The immediate roadmap is distribution: PyPI publishing, so pip install devcap works without routing through GitHub. After that, the useful expansions are diff mode (compare two scan snapshots), HTML report output, and community-contributed tool definitions. The profile system already supports custom tools, so the registry can grow without touching the engine.

The governing instinct is the same one that runs through every OpenForge release. If a tool solves a real problem, requires no proprietary infrastructure, and does not need to be closed, open it. The scan script that lived in a private repo for months is now a versioned, tested, documented package that anyone can run. That is the point.


This chronicle is part of the OpenForge release series. Each release comes with a tool, a rationale, and a clean boundary.
