Validate OSIRIS JSON documents
OSIRIS JSON validation is no longer just an implied part of the specification. It now has a documented model, a canonical engine boundary, and a CLI contract designed for real tooling.
The current validation guidance is spread across three draft documents published in February 2026: the Validation Levels Guidelines, the Toolbox Core internals guide, and the Toolbox CLI guide. Together, they define how OSIRIS documents should be validated consistently across local development, editor integrations, and CI pipelines.
Validation is more than schema
One of the clearest decisions in the OSIRIS validation model is that JSON Schema alone is not enough.
The project defines validation as a three-level pipeline:
- Level 1: Structural validation checks whether a document conforms to the OSIRIS schema
- Level 2: Semantic validation checks graph integrity such as dangling references, duplicate IDs, and invalid group hierarchies
- Level 3: Domain validation adds optional best-practice guidance that improves interoperability without redefining what counts as valid OSIRIS
That separation matters. A document can be structurally correct and still be unsafe to consume if its topology cannot be trusted. OSIRIS treats those as different problems and gives tooling a cleaner way to report them.
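The three levels can be pictured as a short pipeline. The sketch below is illustrative only: the document shape (nodes with IDs, links referencing them) and the diagnostic codes are invented stand-ins, not the actual OSIRIS schema or V-* registry.

```typescript
// Hypothetical sketch of the level-1/level-2 split. The document shape
// and the codes (V-STRUCT, V-DUP, V-REF) are invented for illustration.

type Diagnostic = { level: 1 | 2 | 3; code: string; message: string };

interface Doc {
  nodes: { id: string }[];
  links: { from: string; to: string }[];
}

function validate(doc: Doc): Diagnostic[] {
  const out: Diagnostic[] = [];

  // Level 1: structural — the required collections must exist.
  if (!Array.isArray(doc.nodes) || !Array.isArray(doc.links)) {
    out.push({ level: 1, code: "V-STRUCT", message: "nodes and links must be arrays" });
    return out; // semantic checks presuppose a structurally sound document
  }

  // Level 2: semantic — duplicate IDs and dangling references.
  const ids = new Set<string>();
  for (const n of doc.nodes) {
    if (ids.has(n.id)) out.push({ level: 2, code: "V-DUP", message: `duplicate id ${n.id}` });
    ids.add(n.id);
  }
  for (const l of doc.links) {
    for (const end of [l.from, l.to]) {
      if (!ids.has(end)) out.push({ level: 2, code: "V-REF", message: `dangling reference ${end}` });
    }
  }
  return out;
}
```

Note how level 1 short-circuits: a semantic walk over a structurally broken document would itself be untrustworthy, which is exactly the "different problems" distinction the model draws.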
A shared engine, not competing validators
The validation architecture is built around a single rule: reference tooling should not invent its own version of truth.
@osirisjson/core is defined as the canonical validation engine for the ecosystem. It is the component that owns the three-stage pipeline, the diagnostic model, schema routing, and the profile logic. The CLI and editor integrations are expected to delegate validation to that shared engine rather than re-implementing rules independently.
That is a strong architectural choice, and it is the right one. It means the same document should produce the same outcome whether it is checked in CI, inspected in an editor, or processed by a higher-level tool built on the OSIRIS toolbox.
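The delegation pattern can be sketched in a few lines. Nothing here is the real @osirisjson/core API; the function names and the parse-only check are assumptions used to show the shape of the design, where both front ends consume one engine rather than re-implementing rules.

```typescript
// Illustrative sketch of front ends delegating to one engine.
// validateDocument stands in for the canonical engine; its name and
// the V-PARSE code are invented, not the published API.

type Finding = { code: string; severity: "error" | "warning"; message: string };

// The single source of truth — placeholder for @osirisjson/core.
function validateDocument(text: string): Finding[] {
  try {
    JSON.parse(text);
    return [];
  } catch (e) {
    return [{ code: "V-PARSE", severity: "error", message: String(e) }];
  }
}

// CLI front end: maps the shared findings to an exit code.
function cliExitCode(text: string): number {
  return validateDocument(text).some((f) => f.severity === "error") ? 1 : 0;
}

// Editor front end: maps the same findings to hover text.
function editorHovers(text: string): string[] {
  return validateDocument(text).map((f) => `${f.code}: ${f.message}`);
}
```

Because both wrappers call the same function, a document that fails in CI fails identically in the editor, which is the whole point of a canonical engine.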
Diagnostics are the contract
Another important design choice is that OSIRIS treats diagnostics as a structured interface, not as loose text output.
Validation findings are built around stable V-* codes, severities, messages, and JSON Pointer paths. In practice, that means:
- The code is the stable fact
- The severity is policy, shaped by the active profile
- The message can evolve for clarity without breaking tooling
This is the difference between validation that is merely visible and validation that can actually be automated. CLI pipelines, editor hovers, machine-readable reports, and future quick fixes all depend on that separation.
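A minimal sketch of that contract, assuming an invented code and invented profile tables: the code is the lookup key, the profile supplies the severity, and the message rides along as presentation.

```typescript
// Sketch of the diagnostic contract. The code "V-1203", the profile
// tables, and the "info" fallback are assumptions for illustration.

type Severity = "error" | "warning" | "info";

interface Diagnostic {
  code: string;       // the stable fact, e.g. "V-1203"
  severity: Severity; // policy, resolved from the active profile
  message: string;    // free to evolve without breaking tooling
  path: string;       // JSON Pointer into the document, e.g. "/nodes/3/id"
}

// A profile is, in essence, a policy table keyed by code.
const strictProfile: Record<string, Severity> = { "V-1203": "error" };
const defaultProfile: Record<string, Severity> = { "V-1203": "warning" };

function emit(
  code: string,
  message: string,
  path: string,
  profile: Record<string, Severity>,
): Diagnostic {
  return { code, message, path, severity: profile[code] ?? "info" };
}
```

The same finding becomes an error under one profile and a warning under another, yet automation keyed on the code never notices the difference.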
Built for local work and CI
The CLI guidance is especially pragmatic. Validation is designed to be deterministic, offline-first, and shell-friendly.
The documented contract expects:
- `stdin` support for pipeline composition
- machine-readable JSON output for CI and automation
- stable exit codes that separate validation failures from operational failures
- offline schema resolution using bundled or cached schemas rather than live network fetches
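The exit-code split in particular is worth making concrete. The specific numbers below are an assumption (the draft CLI guide defines the real values); the point is only that a document that fails validation and a tool that could not run at all must never share a code.

```typescript
// Sketch of the exit-code contract. The numeric values are assumed
// for illustration, not taken from the CLI guide.

const EXIT_OK = 0;          // document validated cleanly
const EXIT_INVALID = 1;     // validation findings at error severity
const EXIT_OPERATIONAL = 2; // could not run: I/O failure, bad flags, etc.

type RunResult = { ok: boolean } | { failure: string };

function exitCodeFor(result: RunResult): number {
  if ("failure" in result) return EXIT_OPERATIONAL;
  return result.ok ? EXIT_OK : EXIT_INVALID;
}
```

A CI gate can then treat exit 1 as "fix the document" and exit 2 as "fix the pipeline", which is impossible when both collapse into a single non-zero code.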
The profiles are equally practical:
- `basic` focuses on structural checks only
- `default` runs structural and semantic validation for day-to-day work
- `strict` adds selected domain checks for producer releases and CI gates
That gives OSIRIS a validation model that can scale from editor feedback to batch validation without changing the meaning of the rules underneath.
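Read against the three-level pipeline, the profiles reduce to a routing table. The mapping below paraphrases the profile descriptions above; it is a sketch, not the engine's actual profile logic.

```typescript
// Sketch of profile-to-level routing, paraphrased from the profile
// descriptions. The real selection logic lives in the shared engine.

type Profile = "basic" | "default" | "strict";

function levelsFor(profile: Profile): number[] {
  switch (profile) {
    case "basic":
      return [1];       // structural checks only
    case "default":
      return [1, 2];    // structural + semantic, for day-to-day work
    case "strict":
      return [1, 2, 3]; // adds selected domain checks for releases and CI
  }
}
```

Because profiles only select levels and severities, switching from `default` to `strict` changes which findings are reported and how hard they fail, never what the underlying rules mean.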
Why this matters now
Without a shared validation model, an open standard starts drifting almost immediately. Each tool ends up with different assumptions, different severities, different parser behavior, and eventually different definitions of what “valid” means.
These validation and toolbox guidelines are important because they try to stop that drift before it starts. They make validation reproducible, teachable, and portable across the ecosystem. Just as important, they keep privacy and predictability in scope by requiring local validation behavior and avoiding network dependency during checks.
Read the validation guidance
The current validation and toolbox documents are published as draft guidance: