Definition-of-Done per Artifact

Intent

Define Done for each artifact type to prevent “nearly done” outputs.

Structure

Definition-of-Done (DoD) per artifact is a set of versioned checklists maintained in-repo and referenced by PR templates. Structurally, it standardizes what “complete” means across profiles, terminology artifacts, examples, pages, and tests.

  • Checklist catalog: one checklist per artifact type with objective pass/fail criteria
  • PR integration: PR templates require authors to affirm relevant DoD items
  • Ownership: explicit reviewer roles for each checklist section (clinical/term/tech/editorial)
  • Automation mapping: which DoD items are enforced by CI vs human review
  • Release gate: a final roll-up check that confirms DoD compliance for the release set
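The structure above can be sketched as data: each checklist item carries a stable id (referenced by PR templates) and an enforcement channel (the automation mapping). This is a minimal illustration, not a standard format; the item ids and fields are assumptions.

```python
# Sketch of a versioned DoD checklist with its automation mapping.
# Item ids, texts, and the "ci"/"human" split are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class DodItem:
    id: str           # stable identifier referenced by PR templates
    text: str         # objective pass/fail criterion
    enforced_by: str  # "ci" (automated gate) or "human" (reviewer sign-off)

VALUESET_DOD = [
    DodItem("vs-intent",  "Purpose/intent is documented",              "human"),
    DodItem("vs-compose", "Compose definition is explicit",            "ci"),
    DodItem("vs-vectors", "Terminology test vectors exist and pass",   "ci"),
    DodItem("vs-example", "At least one example instance binds to it", "ci"),
]

def split_by_enforcement(items):
    """Return (ci_items, human_items) so PR templates only ask humans
    to affirm the items CI cannot verify."""
    ci = [i for i in items if i.enforced_by == "ci"]
    human = [i for i in items if i.enforced_by == "human"]
    return ci, human

ci, human = split_by_enforcement(VALUESET_DOD)
print([i.id for i in human])  # -> ['vs-intent']
```

Keeping the enforcement channel on each item makes the "CI vs human review" mapping auditable instead of tribal knowledge.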

Key Components

DoD checklists

  • Create per-artifact checklists (Profile, ValueSet, ConceptMap, Example, Page)
  • Include validation + rendering requirements
  • Include mandatory metadata (title, purpose, narrative)
  • Include required tests or vectors
  • Keep checklists versioned in the repo
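Because checklists are versioned in the repo, the PR-template section can be generated from them rather than hand-copied. A minimal sketch, assuming a simple dict-shaped checklist (the artifact name, version scheme, and item texts are illustrative):

```python
# Sketch: render a versioned DoD checklist as PR-template markdown
# checkboxes, so templates never drift from the in-repo checklist.
checklist = {
    "artifact": "Profile",
    "version": "1.3.0",  # checklist versioned alongside the IG source
    "items": [
        "Validates cleanly under the pinned IG Publisher version",
        "Title, purpose, and narrative metadata are present",
        "Rendering reviewed in the built IG pages",
        "Required test instances exist and validate against the profile",
    ],
}

def to_pr_markdown(cl):
    lines = [f"### DoD: {cl['artifact']} (v{cl['version']})"]
    lines += [f"- [ ] {item}" for item in cl["items"]]
    return "\n".join(lines)

print(to_pr_markdown(checklist))
```

Regenerating the template on checklist changes keeps authors affirming the current criteria, not last quarter's.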

Review roles

  • Define who reviews what: clinical, terminology, technical, editorial
  • Ensure reviewers know what ‘good’ looks like
  • Avoid requiring everyone to review everything
  • Require a minimum set of sign-offs for risky changes
  • Track reviewer assignment in PR templates
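Reviewer assignment can be derived mechanically from the files a change touches, which avoids asking everyone to review everything. A sketch under assumed repository paths (the globs and role names are illustrative):

```python
# Sketch: map changed paths to required reviewer roles, so risky
# changes automatically require a minimum set of sign-offs.
import fnmatch

ROLE_RULES = [
    ("input/profiles/*",    {"clinical", "technical"}),
    ("input/vocabulary/*",  {"terminology"}),
    ("input/pagecontent/*", {"editorial"}),
]

def required_roles(changed_paths):
    """Union of roles across all path rules a change matches."""
    roles = set()
    for path in changed_paths:
        for pattern, rule_roles in ROLE_RULES:
            if fnmatch.fnmatch(path, pattern):
                roles |= rule_roles
    return roles

print(sorted(required_roles([
    "input/profiles/patient-profile.json",
    "input/pagecontent/index.md",
])))  # -> ['clinical', 'editorial', 'technical']
```

The same mapping can drive a CODEOWNERS-style file so the platform enforces the sign-offs rather than the PR author remembering them.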

Automated checks

  • Run IG Publisher validation as a gate
  • Run terminology tests against a pinned server
  • Enforce formatting/linting rules to reduce review noise
  • Fail fast on broken links or missing metadata
  • Keep checks stable and document suppressions
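The fail-fast metadata check above can be a small script in the CI pipeline. A minimal sketch that inspects an artifact's JSON for required elements; the required keys are assumptions standing in for whatever the DoD checklist names:

```python
# Sketch: fail-fast CI check for missing metadata on a JSON artifact.
# REQUIRED mirrors DoD items; keys here are illustrative.
import json

REQUIRED = ("title", "description", "status")

def check_metadata(raw):
    """Return a list of human-readable failures (empty means pass)."""
    resource = json.loads(raw)
    return [f"missing required element: {k}"
            for k in REQUIRED if not resource.get(k)]

failures = check_metadata('{"title": "Example VS", "status": "draft"}')
print(failures)  # -> ['missing required element: description']
# In CI, exit non-zero when failures is non-empty so the gate trips early.
```

Keeping checks this narrow and deterministic is what makes them stable; any suppression should be a documented entry, not an ad-hoc skip.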

Release readiness gate

  • Define release criteria (all critical issues closed, package published)
  • Require changelog + migration note updates
  • Confirm versioning is correct and dependencies resolve
  • Run a minimal cross-product smoke suite
  • Time-box final review to avoid stalled releases
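The roll-up gate can be expressed as one function over per-artifact results. A sketch with assumed result fields (how `dod_passed` and `critical_issues` are collected is up to the pipeline):

```python
# Sketch: release roll-up confirming DoD compliance for the release set.
# Result shape is illustrative: {artifact_id: {"dod_passed": bool,
#                                              "critical_issues": int}}.
def release_ready(results):
    """Gate: every artifact's DoD is green and no critical issues remain."""
    blockers = [artifact for artifact, r in results.items()
                if not r["dod_passed"] or r["critical_issues"] > 0]
    return len(blockers) == 0, blockers

ok, blockers = release_ready({
    "StructureDefinition/patient": {"dod_passed": True,  "critical_issues": 0},
    "ValueSet/conditions":         {"dod_passed": False, "critical_issues": 1},
})
print(ok, blockers)  # -> False ['ValueSet/conditions']
```

Surfacing the blocker list, not just a pass/fail bit, keeps the time-boxed final review focused on the specific artifacts that need attention.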

Behavior

DoD enforcement and evolution

DoD only works when it’s routinely applied and refined based on real failures. The behavior is to treat DoD as a living system constraint.

Apply

  • Use the checklist to define what ‘done’ means before work starts.
  • Reject ‘almost done’ changes that don’t meet objective criteria.
  • Automate repetitive checks so reviews focus on meaning and usability.

Refine

  • When releases are blocked, identify which missing criteria caused the block.
  • Adjust the checklist to prevent repeats (add an automated check if possible).
  • Version the DoD changes and communicate them to contributors.
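The refine loop benefits from a simple tally of which criteria actually block releases, so checklist changes target real failures. A sketch over a hypothetical block log (the record shape and criterion ids are assumptions):

```python
# Sketch: tally which DoD criteria caused release blocks, to decide
# which checklist items to tighten or automate next.
from collections import Counter

block_log = [  # illustrative incident records
    {"release": "1.2.0", "missing": ["vs-vectors"]},
    {"release": "1.3.0", "missing": ["vs-vectors", "changelog"]},
    {"release": "1.4.0", "missing": ["vs-vectors"]},
]

counts = Counter(criterion
                 for entry in block_log
                 for criterion in entry["missing"])
print(counts.most_common(1))  # -> [('vs-vectors', 3)]
```

Here the recurring blocker is the obvious candidate for a new automated check, and the DoD version bump documents why it was added.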

Benefits

  • Predictable throughput: authors and reviewers share objective completion criteria, so fewer changes bounce back
  • Fewer regressions: automated gates catch recurring failure modes before merge

Trade-offs

  • An over-strict DoD can stall delivery; checklists need periodic pruning so criteria stay tied to real failures

Example

ValueSet DoD: documented intent, an explicit compose definition, terminology test vectors, and at least one example instance bound to the ValueSet.
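This example DoD can be evaluated mechanically. A sketch, assuming a minimal dict-shaped ValueSet resource and a simple notion of test vectors (both assumptions):

```python
# Sketch: apply the ValueSet DoD to a resource. The resource shape and
# the form of the test vectors are illustrative assumptions.
def valueset_done(vs, examples_bound, test_vectors):
    """Return (done, per-check results) for the four DoD criteria."""
    checks = {
        "intent":  bool(vs.get("purpose")),
        "compose": bool(vs.get("compose", {}).get("include")),
        "vectors": len(test_vectors) > 0,
        "example": examples_bound >= 1,
    }
    return all(checks.values()), checks

vs = {
    "purpose": "Codes for reportable conditions",
    "compose": {"include": [{"system": "http://snomed.info/sct"}]},
}
done, checks = valueset_done(vs, examples_bound=1,
                             test_vectors=["code 12345 is a member"])
print(done)  # -> True
```

Returning the per-check breakdown alongside the overall verdict tells a rejected author exactly which criterion failed.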