
Delivery Checklist

Use these checklists to ensure consistent approvals across projects. Copy them into your own workflow document, or use them as a mental framework during review.

Fast delivery pass

A quick sanity check when you need rapid confidence. This should catch obvious problems but won't surface subtle issues.

☐ Programme info matches: Does the programme name and total duration match the delivery notes? If not, you may have the wrong file or version.
☐ Beds map correctly: In Multichannel waveform view, do bed channels show activity on the expected speakers? A 7.1 bed should light up L, R, C, LFE, Ls, Rs, Lrs, Rrs.
☐ Objects appear and move: In the 3D view, do objects show up at their expected positions? Do they move when they should? Any stuck or missing objects?
☐ No clipping or unexpected peaks: Watch the meters during the loudest sections. Are levels healthy? Any red peaks or obvious clipping? (An offline spot-check sketch follows this checklist.)
☐ Stereo fold-down is balanced: Switch to Stereo mode. Are essential elements still audible? Does the balance feel intentional, or are key parts masked?

If all boxes check out, you have basic confidence. If something fails, dig deeper with the full review checklist below.
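If the meters leave you unsure about the clipping check above, you can confirm it offline against the delivered file itself. The following is a minimal sketch rather than an Orbit feature: it assumes the delivery is a multichannel WAV/BW64 that the Python soundfile library can read, "delivery.wav" is a placeholder name, and it measures sample peaks rather than true peaks.

```python
# Offline sample-peak scan backing the "No clipping or unexpected peaks" check.
# Assumptions: the delivery is readable by soundfile, "delivery.wav" is a
# placeholder path, and only sample peaks (not true peaks) are measured.
import numpy as np
import soundfile as sf

PATH = "delivery.wav"        # hypothetical delivery file
PEAK_LIMIT_DBFS = -0.1       # flag anything this close to full scale

limit = 10 ** (PEAK_LIMIT_DBFS / 20)
worst = None                 # (linear peak, channel index, sample index)
offset = 0

for block in sf.blocks(PATH, blocksize=262144, dtype="float64", always_2d=True):
    peaks = np.abs(block).max(axis=0)          # per-channel peak in this block
    ch = int(peaks.argmax())
    if worst is None or peaks[ch] > worst[0]:
        idx = offset + int(np.abs(block[:, ch]).argmax())
        worst = (float(peaks[ch]), ch, idx)
    offset += block.shape[0]

if worst is None:
    raise SystemExit("No audio frames read")

peak, ch, idx = worst
rate = sf.info(PATH).samplerate
peak_db = 20 * np.log10(peak) if peak > 0 else float("-inf")
print(f"Highest sample peak: {peak_db:.2f} dBFS on channel {ch} at {idx / rate:.3f} s")
if peak >= limit:
    print("Flag for review: peak is at or near full scale.")
```

Running a scan like this before a listening pass also tells you which timecode to scrub to first.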

Spatial translation check

Validates that the mix translates well across monitoring environments. Run this after the fast pass.

☐ 7.1.4 positions are correct: In speaker mode, do objects localize where you expect them? Front elements in front, rear in rear, heights elevated?
☐ Binaural preserves front/back: In Binaural mode, can you clearly distinguish front from back? Do rear elements actually sound behind you?
☐ Height cues are audible: Do overhead elements feel elevated in binaural, or do they collapse to ear level?
☐ Center image is stable: Is the center (dialogue, lead vocals) focused and stable, or does it wander or feel diffuse?
☐ Panning feels smooth: Do moving objects transition smoothly, or are there jumps and discontinuities?

Bed and object validation

A more detailed check of the ADM content structure.

☐ Bed format matches spec: Does the bed format badge in the sidebar match what the delivery notes specify (e.g., 7.1.2, 5.1)?
☐ Expected number of objects: Does the object count match expectations? Missing objects could indicate an export issue. (A scripted ADM metadata cross-check follows this checklist.)
☐ LFE content is isolated: In Multichannel view, does LFE content appear only on the LFE channel? Positional content should not bleed into LFE.
☐ Height channels are active: If the mix uses height, do Ltf/Rtf/Ltr/Rtr show activity during overhead moments?
☐ Object naming is clear: Are objects named in a way that makes sense for the content? (This is informational; naming doesn't affect playback.)
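For the object-count and bed-format items, you can also cross-check the ADM metadata itself. This is a minimal sketch under two assumptions: the ADM XML (the file's axml chunk) has already been exported to a standalone file, and "delivery_adm.xml" is a placeholder name.

```python
# Rough cross-check of object count and pack formats from exported ADM XML.
# Assumption: the axml chunk has been saved to "delivery_adm.xml" (placeholder).
import xml.etree.ElementTree as ET

root = ET.parse("delivery_adm.xml").getroot()

# "{*}" matches any XML namespace (supported since Python 3.8).
objects = root.findall(".//{*}audioObject")
packs = root.findall(".//{*}audioPackFormat")

print(f"audioObject elements: {len(objects)}")
for obj in objects:
    print("  -", obj.get("audioObjectName", "<unnamed>"))

print(f"audioPackFormat elements: {len(packs)}")
for pack in packs:
    name = pack.get("audioPackFormatName", "<unnamed>")
    kind = pack.get("typeDefinition") or pack.get("typeLabel") or "?"
    print(f"  - {name} ({kind})")
```

Treat the counts as a sanity check against the delivery notes rather than an authoritative total; common-definition packs and nested references can inflate them.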

Loudness and levels

Validates compliance with loudness specifications.

☐ Integrated loudness matches target: Expand the Loudness panel and check Integrated LUFS. Does it match the delivery spec (e.g., -24 LUFS for broadcast, -14 LUFS for streaming)? (An offline measurement sketch follows this checklist.)
☐ True Peak is within limits: Check True Peak in the Loudness panel. Most specs require -1 dBTP or -2 dBTP maximum.
☐ No unexpected dynamic shifts: Play through the programme. Are there sudden loudness jumps that feel unintentional?
☐ ADM metadata matches measurements: Compare Programme loudness (from ADM metadata) with Session loudness (live measurement). They should be close.
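If you want an independent number to compare against the Loudness panel, the measurement can be reproduced offline. This is a minimal sketch, assuming a stereo or 5.1 render of the programme (measuring an object-based master properly means rendering it first), the pyloudnorm and scipy packages, and a placeholder file name; the true-peak figure here is an oversampled approximation, not a full BS.1770 true-peak meter.

```python
# Offline loudness spot-check for the items above (not a replacement for the
# Loudness panel). Assumptions: "render.wav" is a placeholder stereo/5.1 render;
# the targets below are examples and should come from your delivery spec.
import numpy as np
import soundfile as sf
import pyloudnorm as pyln                # BS.1770-style integrated loudness
from scipy.signal import resample_poly   # used for an approximate true peak

TARGET_LUFS = -24.0     # e.g. broadcast target
TRUE_PEAK_MAX = -1.0    # dBTP limit from the spec

data, rate = sf.read("render.wav", always_2d=True)

meter = pyln.Meter(rate)
integrated = meter.integrated_loudness(data)

# Approximate true peak via 4x oversampling; a real dBTP meter is more involved.
oversampled = resample_poly(data, up=4, down=1, axis=0)
true_peak_db = 20 * np.log10(np.abs(oversampled).max())

print(f"Integrated loudness: {integrated:.1f} LUFS (target {TARGET_LUFS} LUFS)")
print(f"Approx. true peak:   {true_peak_db:.1f} dBTP (limit {TRUE_PEAK_MAX} dBTP)")
```

Small differences from the panel's reading are expected; large ones are worth investigating before sign-off.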

Final sign-off

The last checks before approving a deliverable.

☐ Start is clean: Play the first few seconds. Does audio begin where expected? Any clicks, pops, or premature starts?
☐ End is clean: Play the final minute. Does the programme end cleanly? Any cut-off tails or unexpected silence?
☐ No muted groups: Clear all solos and mutes. Confirm nothing was accidentally left isolated during review.
☐ Notes captured: Have you documented any issues with timecodes and object names?
☐ Confidence level: Based on your review, are you confident this deliverable is ready?

Using these checklists

Be consistent. Use the same checklist across projects to avoid subjective drift. What passed last week should pass this week.

Document everything. When you find issues, capture:

  • The timecode where the issue occurs
  • Which object or bed is affected
  • A brief description of what's wrong
  • Which monitoring mode revealed the issue
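One lightweight way to keep those four fields consistent across reviews is a small structured log. A sketch only; the CSV format, field names, and the example entry are invented for illustration, not an Orbit export format.

```python
# Minimal issue log covering the four fields above. The file name, field names,
# and sample entry are illustrative placeholders.
import csv

issues = [
    {
        "timecode": "00:12:41:07",
        "element": "Object 14 (FX pass)",
        "description": "Drops out for about two seconds",
        "monitoring_mode": "Binaural",
    },
]

with open("review_notes.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(issues[0].keys()))
    writer.writeheader()
    writer.writerows(issues)
```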

Know when to escalate. Some issues are clear failures (clipping, missing objects). Others are judgment calls (is that height move subtle or missing?). When in doubt, document and discuss with the mix team.

Build your own checklist

These checklists cover common scenarios, but your projects may have specific requirements. Add checks for dialogue clarity, music stem balance, or format-specific requirements as needed.
