# Delivery Checklist
Use these checklists to ensure consistent approvals across projects. Copy them into your own workflow document, or use them as a mental framework during review.
## Fast delivery pass
A quick sanity check when you need rapid confidence. This should catch obvious problems but won't surface subtle issues.
| Check | What to look for |
|---|---|
| ☐ Programme info matches | Do the programme name and total duration match the delivery notes? If not, you may have the wrong file or version. |
| ☐ Beds map correctly | In Multichannel waveform view, do bed channels show activity on the expected speakers? A 7.1 bed should light up L, R, C, LFE, Ls, Rs, Lrs, Rrs. |
| ☐ Objects appear and move | In the 3D view, do objects show up at their expected positions? Do they move when they should? Any stuck or missing objects? |
| ☐ No clipping or unexpected peaks | Watch the meters during the loudest sections. Are levels healthy? Any red peaks or obvious clipping? |
| ☐ Stereo fold-down is balanced | Switch to Stereo mode. Are essential elements still audible? Does the balance feel intentional, or are key parts masked? |
If all boxes check out, you have basic confidence. If something fails, dig deeper with the more detailed checklists below.
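The fold-down balance check above can be reasoned about numerically. As a rough sketch (not the renderer's actual algorithm), a 5.1-to-stereo fold-down typically applies the -3 dB coefficients from ITU-R BS.775 to the centre and surround channels; the function name and channel ordering here are illustrative assumptions:

```python
import numpy as np

# -3 dB ~= 0.7071: the common ITU-R BS.775 downmix coefficient.
C_MIX = 10 ** (-3 / 20)

def fold_down_51(bed: np.ndarray) -> np.ndarray:
    """Fold a 5.1 bed (channel order L, R, C, LFE, Ls, Rs) to stereo.

    LFE is dropped, which matches most broadcast downmix conventions.
    Input shape (6, n_samples); returns shape (2, n_samples).
    """
    L, R, C, _lfe, Ls, Rs = bed
    lo = L + C_MIX * C + C_MIX * Ls
    ro = R + C_MIX * C + C_MIX * Rs
    return np.stack([lo, ro])
```

Because the centre feeds both outputs at -3 dB, centre-only content (typically dialogue) survives a fold-down at a predictable level; if dialogue feels buried in Stereo mode, something in the mix balance is fighting that math.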
## Spatial translation check
Validates that the mix translates well across monitoring environments. Run this after the fast pass.
| Check | What to look for |
|---|---|
| ☐ 7.1.4 positions are correct | In speaker mode, do objects localize where you expect them? Front elements in front, rear in rear, heights elevated? |
| ☐ Binaural preserves front/back | In Binaural mode, can you clearly distinguish front from back? Do rear elements actually sound behind you? |
| ☐ Height cues are audible | Do overhead elements feel elevated in binaural, or do they collapse to ear level? |
| ☐ Center image is stable | Is the center (dialogue, lead vocals) focused and stable, or does it wander or feel diffuse? |
| ☐ Panning feels smooth | Do moving objects transition smoothly, or are there jumps and discontinuities? |
## Bed and object validation
A more detailed check of the ADM content structure.
| Check | What to look for |
|---|---|
| ☐ Bed format matches spec | Does the bed format badge in the sidebar match what the delivery notes specify (e.g., 7.1.2, 5.1)? |
| ☐ Expected number of objects | Does the object count match expectations? Missing objects could indicate an export issue. |
| ☐ LFE content is isolated | In Multichannel view, does LFE content appear only on the LFE channel? Positional content should not bleed into LFE. |
| ☐ Height channels are active | If the mix uses height, do Ltf/Rtf/Ltr/Rtr show activity during overhead moments? |
| ☐ Object naming is clear | Are objects named in a way that makes sense for the content? (This is informational — naming doesn't affect playback.) |
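One half of the LFE check, confirming that the LFE channel itself is band-limited, is easy to automate. This is a minimal sketch under assumed names (it is not a feature of any particular tool): it compares spectral energy above a cutoff to total energy, since a healthy LFE channel should carry almost nothing above roughly 120 Hz.

```python
import numpy as np

def lfe_is_band_limited(lfe: np.ndarray, sr: int, cutoff_hz: float = 120.0,
                        max_ratio: float = 0.01) -> bool:
    """Rough check that an LFE channel carries only low-frequency energy.

    Returns True when the fraction of spectral energy above `cutoff_hz`
    is at most `max_ratio`. The 120 Hz cutoff and 1% threshold are
    illustrative defaults, not a standard.
    """
    spectrum = np.abs(np.fft.rfft(lfe)) ** 2
    freqs = np.fft.rfftfreq(len(lfe), d=1.0 / sr)
    total = spectrum.sum()
    if total == 0:
        return True  # a silent channel trivially passes
    high = spectrum[freqs > cutoff_hz].sum()
    return (high / total) <= max_ratio
```

The reverse direction, confirming that positional content has not bled into LFE, still needs ears: broadband material folded into LFE can look like legitimate low-frequency energy.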
## Loudness and levels
Validates compliance with loudness specifications.
| Check | What to look for |
|---|---|
| ☐ Integrated loudness matches target | Expand the Loudness panel and check Integrated LUFS. Does it match the delivery spec (e.g., -24 LUFS for broadcast, -14 LUFS for streaming)? |
| ☐ True Peak is within limits | Check True Peak in the Loudness panel. Most specs require -1 dBTP or -2 dBTP maximum. |
| ☐ No unexpected dynamic shifts | Play through the programme. Are there sudden loudness jumps that feel unintentional? |
| ☐ ADM metadata matches measurements | Compare Programme loudness (from ADM metadata) with Session loudness (live measurement). They should be close. |
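The pass/fail logic for the first two checks is mechanical once you have the measured values. A minimal sketch, with hypothetical names and a typical (but spec-dependent) tolerance of ±0.5 LU and a -1 dBTP ceiling:

```python
def check_loudness(integrated_lufs: float, true_peak_dbtp: float,
                   target_lufs: float, tolerance_lu: float = 0.5,
                   max_true_peak_dbtp: float = -1.0) -> list[str]:
    """Compare measured loudness values against a delivery spec.

    Returns a list of human-readable failures (empty list = pass).
    The defaults are common values; always substitute the tolerance
    and peak ceiling from your actual delivery spec.
    """
    failures = []
    if abs(integrated_lufs - target_lufs) > tolerance_lu:
        failures.append(
            f"Integrated {integrated_lufs:.1f} LUFS outside "
            f"{target_lufs:.1f} ±{tolerance_lu} LU")
    if true_peak_dbtp > max_true_peak_dbtp:
        failures.append(
            f"True Peak {true_peak_dbtp:.1f} dBTP exceeds "
            f"{max_true_peak_dbtp:.1f} dBTP ceiling")
    return failures
```

For example, a broadcast deliverable measuring -24.2 LUFS integrated with a -1.5 dBTP true peak passes against a -24 LUFS target, while -22.0 LUFS at -0.5 dBTP fails on both counts.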
## Final sign-off
The last checks before approving a deliverable.
| Check | What to look for |
|---|---|
| ☐ Start is clean | Play the first few seconds. Does audio begin where expected? Any clicks, pops, or premature starts? |
| ☐ End is clean | Play the final minute. Does the programme end cleanly? Any cut-off tails or unexpected silence? |
| ☐ No muted groups | Clear all solos and mutes. Confirm nothing was accidentally left isolated during review. |
| ☐ Notes captured | Have you documented any issues with timecodes and object names? |
| ☐ Confidence level | Based on your review, are you confident this deliverable is ready? |
## Using these checklists
**Be consistent.** Use the same checklist across projects to avoid subjective drift. What passed last week should pass this week.

**Document everything.** When you find issues, capture:
- The timecode where the issue occurs
- Which object or bed is affected
- A brief description of what's wrong
- Which monitoring mode revealed the issue
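If you log issues programmatically, the four fields above map naturally onto a small record type. This is a hypothetical sketch, not part of any tool's API:

```python
from dataclasses import dataclass

@dataclass
class ReviewIssue:
    """One issue found during review, matching the four fields above."""
    timecode: str          # where the issue occurs, e.g. "00:12:34:05"
    element: str           # which object or bed channel is affected
    description: str       # brief description of what's wrong
    monitoring_mode: str   # mode that revealed it, e.g. "Binaural"

    def as_note(self) -> str:
        """Render the issue as a one-line note for a delivery report."""
        return (f"[{self.timecode}] {self.element} "
                f"({self.monitoring_mode}): {self.description}")
```

Keeping the monitoring mode in the record matters: an issue audible only in binaural needs a different conversation with the mix team than one audible on speakers.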
**Know when to escalate.** Some issues are clear failures (clipping, missing objects). Others are judgment calls (is that height move subtle or missing?). When in doubt, document and discuss with the mix team.
## Build your own checklist
These checklists cover common scenarios, but your projects may have specific requirements. Add checks for dialogue clarity, music stem balance, or format-specific requirements as needed.
