Volume I · Sarbanes–Oxley · Edition 2026.2

The Compliance Atlas

Authoritative refs
PCAOB AS 2201 · AS 2315 · AS 1215
COSO 2013 · COBIT 2019 · §302/§404
Verified May 12, 2026

A four-layer reference for SOX as practiced — not as recited from a textbook. Each layer answers a different question the auditor must answer in the room. Edition 2026.2 expands every layer with the detail second-year SOX leads keep asking for.

Reading the Atlas

Legend: Internal Audit lane · External Audit lane · Bridge / hand-off

Internal audit owns the program; external audit forms an opinion on it. Same lifecycle, different objectives, different independence rules, different consequences.

I.
Layer 01 — Lifecycle, twelve months walked

The audit, walked side by side

Top-down approach
PCAOB AS 2201 ¶21–¶34

The 12-month integrated audit — internal & external in parallel

MONTHS FROM YEAR-END → −9 mo · −7 mo · −5 mo (Q3 interim) · −3 mo · −1 mo (roll-fwd) · YEAR-END · +1 mo (close) · +2 mo (10-K)

INTERNAL AUDIT — OWNS THE PROGRAM
Annual SOX risk assessment: CFO · IA · CAE
Scoping memo: entities · accounts · systems, in coordination w/ EA
Control owner kickoff: RACI · timelines · PBC
Narrative refresh: walkthrough scripts · flowcharts · risk-control matrices
Walkthroughs: design effectiveness, "one-and-done" per control
Interim TOE: samples · exceptions, months 9–12 of the period
Q3 sub-certification: CEO/CFO §302, supports the 10-Q filing
Deficiency log: aggregation begins, CD/SD/MW classification
Remediation: owners · target dates, retest evidence required
Roll-forward TOE: Q4 testing to year-end coverage
Management §404(a) ICFR assertion: supports CEO/CFO §302
AC pre-read: findings · KPIs, trended year-over-year
Audit Committee report: YE results · open items, PCAOB AS 1301-aligned
Lessons learned: retro · process maturity

EXTERNAL AUDIT — FORMS THE OPINION
Independence + engagement letter: PCAOB Rule 3500T
Risk assessment: fraud · F/S assertions, AS 2110
Materiality set: planning · performance · tolerable misstatement
Top-down scoping: ELC → accounts → assertions, AS 2201 ¶21–34
Test of design: walkthrough of each key control, AS 2201 ¶42–44
Interim TOE: independent reperformance + reliance on IA work
Year-end TOE + roll-forward: close + assertion testing
Reliance on IA: competence · objectivity, AS 2605 / AS 2201 ¶16–19
Service organization reports: SOC 1 review · CUECs, AS 2601
Quarterly reviews: 10-Q sign-off support, AS 4105
Deficiency evaluation: severity · aggregation, AS 2201 ¶62–70
Subsequent events: Type I & Type II review, AS 2801
Management rep letter: CEO/CFO acknowledged, AS 2805
Opinion + AC communications: §404(b) ICFR + F/S, AS 1301

WORKPAPER ASSEMBLY · RETENTION 7 YRS
PCAOB AS 1215: sign-off and archival within 45 days of report release.

BRIDGES BETWEEN LANES
Scoping memo shared: EA reviews and agrees
Walkthrough narratives + PBC list shared with EA
IA workpapers → EA reliance assessment
Deficiency log discussed weekly in Q4
Management §404(a) assertion + rep letter → EA opinion

Deficiency severity — the AS 2201 ¶62–70 decision tree

Identified deficiency: a design failure or operating exception.
Q1 · Could a misstatement occur? Likelihood: reasonably possible vs. remote (AS 2201 ¶64).
Q2 · What's the magnitude? Compare against materiality and performance materiality (AS 2201 ¶65).

Control Deficiency (CD): less severe than an SD. Track, remediate, retest. No escalation required.
Significant Deficiency (SD): less severe than an MW, but important enough to warrant attention by the AC. Communicate to the AC in writing.
Material Weakness (MW): a reasonable possibility that a material misstatement will not be prevented or detected on a timely basis. Adverse ICFR opinion; 8-K Item 4.02 if a restatement follows.

Aggregation matters: multiple SDs in the same area can rise to an MW (¶68). Compensating controls are assessed at ¶67.
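The tree reduces to two gates and a qualitative check. A minimal sketch in Python, with illustrative names and thresholds: severity is an auditor's judgment, and this encodes only the shape of the decision, not the decision itself.

```python
from dataclasses import dataclass

# Illustrative triage of the AS 2201 deficiency-severity logic.
# Real classification is auditor judgment; this encodes the two
# questions (likelihood, magnitude) and the three outcomes.

@dataclass
class Deficiency:
    control_id: str
    misstatement_reasonably_possible: bool   # Q1 (paragraph 64): likelihood
    potential_magnitude: float               # Q2 (paragraph 65): amount at risk
    merits_audit_committee_attention: bool   # qualitative SD criterion

def classify(d: Deficiency, materiality: float) -> str:
    if not d.misstatement_reasonably_possible:
        return "CD"   # remote likelihood: control deficiency
    if d.potential_magnitude >= materiality:
        return "MW"   # reasonably possible AND material: material weakness
    if d.merits_audit_committee_attention:
        return "SD"   # important enough to warrant AC attention
    return "CD"

print(classify(Deficiency("ITGC-ACC-04", True, 2_500_000, True), 2_000_000))  # MW
```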

Where lifecycle work goes off the rails

Scoping is the leverage point. Most second-year SOX programs over-scope (every system in IT) or under-scope (skip a sub-ledger because "it's small"). The discipline is the top-down approach in AS 2201 ¶21–34: start at the financial statements, walk down to significant accounts and disclosures, then to the relevant assertions, then to the controls that address them. If you can't trace a control back to an assertion, it doesn't belong in scope.
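That rule can be enforced mechanically before walkthroughs ever start. A minimal sketch, assuming a risk-control matrix kept as rows of control / account / assertion (field names are hypothetical):

```python
# A scope-hygiene check over a risk-control matrix (RCM).
# Field names are hypothetical; the rule is the AS 2201 one:
# a control that cannot be traced to an assertion is out of scope.

rcm = [
    {"control": "R2R-07",   "account": "Revenue", "assertion": "Occurrence"},
    {"control": "O2C-12",   "account": "AR",      "assertion": "Existence"},
    {"control": "ADHOC-09", "account": "Misc",    "assertion": None},  # untraced
]

untraced = [row["control"] for row in rcm if not row["assertion"]]
if untraced:
    print("Challenge scope: no assertion traced for", untraced)
```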

Bridges between lanes are where audits succeed or fail. When IA and EA share the same scoping memo, walkthrough narratives, and deficiency log, the external audit gets cheaper and the management assertion gets stronger. When they don't, EA reperforms everything from scratch and management ends up with surprises in October.

Reliance on internal audit (AS 2605 / AS 2201 ¶16–¶19) is not automatic. The external auditor evaluates IA's competence (qualifications, training, supervision) and objectivity (organizational status, reporting line to AC, scope of activities). Even when both pass, the EA cannot rely on IA for high-risk areas — fraud, period-end close, ELCs at the principle level — and must perform sufficient direct testing.

A deficiency caught in March by IA is a remediation conversation; the same deficiency caught in November by EA is a §302 disclosure conversation.

The aggregation question is where careers turn. Three SDs in revenue recognition, individually rated significant, may aggregate to a material weakness. AS 2201 ¶68 says the auditor must consider whether deficiencies, "when considered in combination with other deficiencies affecting the same significant account or disclosure, relevant assertion, or component of internal control," constitute a material weakness. Many MWs in PCAOB inspection findings turn out to be aggregation calls that the firm got wrong — not single catastrophic failures.
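The screen for that call can at least be automated, even if the call itself cannot. An illustrative sketch that groups deficiencies by the account they touch and flags clusters for combined evaluation under ¶68:

```python
from collections import defaultdict

# Illustrative aggregation screen per AS 2201 paragraph 68: deficiencies
# affecting the same significant account or assertion must be considered
# in combination. This flags candidates; the MW call itself is judgment.

deficiencies = [
    ("SD", "Revenue recognition"),
    ("SD", "Revenue recognition"),
    ("SD", "Revenue recognition"),
    ("CD", "Inventory"),
]

by_area = defaultdict(list)
for severity, area in deficiencies:
    by_area[area].append(severity)

for area, sevs in by_area.items():
    if sevs.count("SD") >= 2:
        print(f"{area}: {sevs.count('SD')} SDs, evaluate in combination for a possible MW")
```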

II.
Layer 02 — Control universe

The four families, drilled down to the test approach

COSO 2013 · 17 principles
COBIT 2019 · ITGC mapping

The four families and their relationships

FAMILY · ELC — Entity-Level Controls
Tone, governance, fraud risk, whistleblower, period-end close oversight, MRCs at the entity layer.
Framework: COSO 2013 — all 5 components, principles 1–17
Examples: AC charter · Code of Ethics · MRC
Test: AC minutes · whistleblower log · attestations
Frequency: continuous + annual review
Failure → adverse ICFR opinion, almost always

FAMILY · ITGC — IT General Controls
Four pillars: Access · Change · Operations · SDLC. Pervasive — they affect the reliability of every ITAC.
Maps to: COBIT 2019 DSS / BAI / APO domains
Scoped per system: GL · subledger · reporting
Test: configuration · ticket sample · UAR
Frequency: pillar-specific (see the drill-down below)
Failure → cascades to ITAC reliance and a BPC scale-up

FAMILY · BPC — Business Process Controls
Manual reviews, reconciliations, journal-entry approvals — humans verifying outputs.
By cycle: O2C · P2P · R2R · HR · Tax · Treasury
Often "MRC" — Management Review Controls
Test: review evidence · tickmarks · challenge evidence
Frequency: daily / weekly / monthly
Failure → the most common source of restatements

FAMILY · ITAC — IT Application Controls
Automated configurations inside the application — three-way match, tolerance limits, edit checks.
Reliance on ITGCs is required: input · processing · output validation
Test: configuration screen · benchmark · one sample
Benefit: highest leverage — one test gives full-population coverage
Caveat: an ITGC failure invalidates the ITAC reliance

ITGC pillars at a glance: ACCESS (JML · UAR · SoD · PAM) · CHANGE (CAB · UAT · production segregation) · OPERATIONS (backup · jobs · DR) · SDLC (design · deploy)

ITGC drill-down — the four pillars, sub-controls, and anti-patterns

ACCESS
Joiner / Mover / Leaver: request → approval → provision → audit log
User Access Reviews (UAR): quarterly · entitlement-level · with exception evidence
Privileged Access Management: PAM tool · session recording · break-glass
Segregation of Duties: role-conflict matrix · ruleset · monitoring
Generic / shared IDs: prohibited, or a compensating control is required
Termination & transfer: SLA of 24h for sensitive access, 72h general
Anti-patterns: "reviewed" with no exception evidence · UAR run on a stale entitlement extract · generic IDs labelled "service account" · privileged role granted but never reviewed · termination ticket raised but access still active

CHANGE
Change request & CAB: CAB approval · risk classification · rollback plan
UAT / QA evidence: test cases · sign-off · defect log
Production deployment: deployer ≠ developer · audit trail
Emergency changes: retrospective approval · post-implementation review
Database changes: DBA changes logged · DDL controls
Direct production access: restricted · monitored · break-glass only
Anti-patterns: tickets approved retroactively · developer with production deploy rights · emergency changes never reviewed · no UAT for "minor" config changes · DBA bypass not monitored

OPERATIONS
Job scheduling & monitoring: scheduler config · failure alerts · resolution
Backup & restore: backup success · restore test (not just a plan)
Incident management: ticketing · severity · root cause · closure
Disaster recovery: RTO/RPO defined · annual DR test · evidence retained
Capacity / monitoring: thresholds · alerts · response procedure
Patch management: SLA by severity · evidence of deployment
Anti-patterns: backups run but never test-restored · job failures auto-cleared with no review · incidents closed without root cause · DR plan exists but is never executed · patch SLAs missed for critical vulnerabilities

SDLC
Design / requirements: business sign-off · risk assessment
Code review: PR review · enforced via branch protection
Security testing: SAST · DAST · dependency scan · pen test
Data conversion: migration certification · reconciliation evidence
Cutover & go-live: go / no-go · post-implementation review · stabilization
Retirement / decommission: data retention · access removal
Anti-patterns: code reviews "self-approved" · pen test scope skips SOX systems · conversion without reconciliation · go-live before UAT is signed · retired-system access not removed

CASCADE LOGIC: a failure in any pillar disqualifies reliance on automated controls within that scope. Manual BPCs must compensate — usually at population scale.
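One of those anti-patterns, the termination ticket raised while access stays active, is also the easiest to test continuously. A sketch, assuming hypothetical HR and IAM extracts (which are themselves IPE and need their own completeness and accuracy testing):

```python
import csv

# Sketch of the classic Access check: HR says terminated, IAM says active.
# File names and columns are hypothetical extracts; in practice both
# reports would be tested as IPE before relying on this comparison.

def load_ids(path: str, column: str) -> set[str]:
    with open(path, newline="") as f:
        return {row[column] for row in csv.DictReader(f)}

terminated = load_ids("hr_terminations_q4.csv", "employee_id")
active     = load_ids("iam_active_accounts.csv", "employee_id")

for emp in sorted(terminated & active):
    print(f"EXCEPTION: {emp} terminated in HR but account still active")
```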

How the four families talk to each other

The control universe is hierarchical, not flat. Entity-level controls set the tone: if the audit committee doesn't function, no process control compensates. ITGCs are pervasive — they earn the right to rely on automated controls. If your access management or change management fails, every ITAC downstream is presumed unreliable, and you fall back on business process controls (manual reviews) to plug the gap, with much larger sample sizes.

This is why ITGC failures are rarely "one finding" — they cascade. A deficiency in privileged access review can disqualify the three-way-match ITAC, which forces the auditor to test the manual P.O. approval at population level instead of one item per quarter. The hours quintuple, the partner is unhappy, and the client is asked for evidence they don't have ready.
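The cascade is mechanical enough to model. A sketch, with hypothetical system and control names, of how one failed pillar strips reliance from every ITAC on that system:

```python
# Sketch of the cascade: an ITGC pillar failure in a system's scope
# removes reliance on every ITAC running on that system, and the
# fallback manual control inherits a population-scale sample.

itgc_status = {("ERP", "access"): "fail", ("ERP", "change"): "pass"}

itacs = [
    {"id": "ITAC-3WM", "system": "ERP", "fallback_bpc": "manual PO approval"},
]

def itac_reliable(itac: dict) -> bool:
    # Reliance requires every tested pillar on the host system to pass.
    return all(status == "pass"
               for (system, _), status in itgc_status.items()
               if system == itac["system"])

for itac in itacs:
    if not itac_reliable(itac):
        print(f"{itac['id']}: reliance lost, test {itac['fallback_bpc']} "
              "as a manual control (sample scales from 1 to the daily-control size)")
```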

ITGCs are not a checklist. They are the foundation that makes everything above them auditable.

Anti-patterns are the pattern. The drill-down above lists the failure modes that show up in PCAOB inspection findings year after year. None are exotic. They are the boring controls that get checked off in green when the underlying evidence wouldn't survive a serious walkthrough. The most useful question a SOX lead can ask their team is: "If the auditor pulled this control's evidence today, what would actually be there?"

III.
Layer 03 — Evidence

What you ask for, how many, how to test it

AICPA AAG Audit Sampling
PCAOB AS 2315

Sample-size matrix — by frequency & risk tier

Control type | Frequency | Low risk | Mod risk | High risk | Primary evidence | Technique
Automated configuration (ITAC) | Continuous (runs every event) | 1 | 1 | 1, plus ITGC reliance + benchmark | Configuration screenshot, change-history report | Inspection + reperformance
Manual, daily (e.g., cash recon) | Daily (~250/yr) | 25 | 40 | 60 | Recon w/ reviewer sign-off, supporting reports | Inspection + inquiry
Manual, weekly | Weekly (~52/yr) | 15 | 25 | 40 | Payroll register review, vendor recon | Inspection
Manual, monthly | Monthly (12/yr) | 2 | 3–5 | 6 | Flux analysis, MRC, J/E reviews | Inspection + corroborative inquiry
Manual, quarterly | 4/yr | 2 | 2 | 3 | UAR for SOX systems | Inspection
Annual | 1/yr | 1 | 1 | 1 | Policy attestation, DR test | Inspection
Event-driven (terminations · changes) | Per occurrence | 10% | 15% | 25% | Tickets, approvals, termination logs | Inspection + reperformance
Event-driven sizes are percentages of the population, minimum 25 items each, scaled to population.
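Applied consistently, the matrix is just a lookup. A minimal sketch encoding the floors above (the monthly moderate tier is pinned to the top of its 3–5 range as a conservative default; event-driven controls are percentage-based and excluded):

```python
# The sample-size matrix as a lookup, so the baseline is applied
# consistently across the program. These are floors from the table
# above: scale up for risk or prior exceptions, never down.

BASELINE = {  # frequency -> (low, moderate, high)
    "continuous": (1, 1, 1),     # automated, with ITGC reliance + benchmark
    "daily":      (25, 40, 60),
    "weekly":     (15, 25, 40),
    "monthly":    (2, 5, 6),     # moderate pinned to top of the 3-5 range
    "quarterly":  (2, 2, 3),
    "annual":     (1, 1, 1),
}

def sample_size(frequency: str, risk: str) -> int:
    low, mod, high = BASELINE[frequency]
    return {"low": low, "moderate": mod, "high": high}[risk]

print(sample_size("daily", "high"))  # 60
```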

Information Produced by Entity (IPE) — the silent evidence killer

Report used as audit evidence (e.g., an aging, a GL extract, a UAR list) — four attributes, each with a procedure:
1 · Source: where did the data come from? Procedure: trace the report to the source-system query.
2 · Parameters: the filters, dates, and exclusions used. Procedure: inspect the report header; if parameters are user-modifiable, re-run the query or reperform.
3 · Completeness: all records that should be there, are. Procedure: tie totals to the GL or another independent source; test for unexpected exclusions.
4 · Accuracy: field values match the source system. Procedure: sample records and agree them to source, field by field.

Consequences of untested IPE:
→ The control test loses its evidence basis.
→ The auditor must reperform with independent data — usually 5x the time.
→ Repeated IPE failures become an ITGC deficiency in reporting / monitoring.
→ Restatements often trace back to a trusted-but-untested report.

"What were the parameters?" is the question every IPE test answers. If management's review uses a system-generated report, the report itself becomes part of the control. Test the control AND the report.
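The completeness and accuracy legs are the two that lend themselves to reperformance. A sketch with hypothetical data, tying the report total to an independent control total and agreeing a sample back to source:

```python
# Sketch of the completeness and accuracy attributes of an IPE test:
# tie the report total to an independent source, then agree a sample
# of records field by field. All data structures are hypothetical.

report_rows = [{"invoice": "INV-101", "amount": 1200.00},
               {"invoice": "INV-102", "amount":  830.50}]
gl_control_total = 2030.50                              # independent source
source_system = {"INV-101": 1200.00, "INV-102": 830.50}

# Completeness: the report total must tie to the GL control total.
report_total = sum(r["amount"] for r in report_rows)
assert abs(report_total - gl_control_total) < 0.01, "Completeness exception"

# Accuracy: agree each sampled record back to the source system.
for r in report_rows:
    assert source_system[r["invoice"]] == r["amount"], f"Accuracy exception: {r}"

print("Completeness and accuracy attributes passed on this sample")
```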

Workpaper structure — what every test should look like

Lead sheet: control ID · description · owner · frequency · conclusion · sign-off · links to everything below
Test memo: objective · risk addressed · population · sample method · test steps performed · conclusion with rationale
Sample & evidence: selection workbook · tickmarks · screenshots · IPE tests attached · PBC reference numbers
Exception schedule: items failed · root-cause analysis · management response · severity assessment
Retest (if exceptions): a fresh sample from the same period, post-remediation; required for sign-off

If any of these is missing, the workpaper is not done — it's a draft.
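That closing rule is checkable. A sketch of a completeness gate over the five components, assuming a workpaper tracked as a simple dictionary:

```python
# The five workpaper components as a gate: anything missing means draft.
# Keys mirror the structure above; the tracking format is an assumption.

REQUIRED = ["lead_sheet", "test_memo", "sample_and_evidence",
            "exception_schedule"]  # retest is required only when exceptions exist

def workpaper_status(wp: dict) -> str:
    missing = [k for k in REQUIRED if not wp.get(k)]
    if wp.get("exceptions") and not wp.get("retest"):
        missing.append("retest")
    return "complete" if not missing else f"DRAFT, missing: {missing}"

print(workpaper_status({"lead_sheet": True, "test_memo": True,
                        "sample_and_evidence": True, "exception_schedule": True,
                        "exceptions": True}))   # DRAFT, missing: ['retest']
```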

Where evidence work breaks down

The four-eyes problem. "Reviewed by" is not a control. The reviewer must demonstrate what they reviewed, against what criteria, and what they would do if it failed. A signature on a recon with no tickmarks, no exception list, no challenge evidence is a deficient MRC — and MRCs are where most modern restatements actually originate.

IPE is the silent killer. Auditors get burned more often by reports than by controls. If management's review uses a system-generated aging report, you must test the report's completeness and accuracy. If the report parameters are user-modifiable, you may need to reperform the query yourself. The IPE diagram above is the protocol — skip any step and the test loses its evidentiary basis.

Sample sizes are floors, not ceilings. The matrix above is baseline guidance. For higher-risk controls, scale up. For controls with prior-year exceptions, increase coverage by at least 50%. For sample selection, use random or systematic — never haphazard. Document the selection method in the test memo or your sample is challengeable.
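"Document the selection method" is easiest when the method is code. A sketch of seeded random and systematic selection, where recording the seed and interval in the test memo makes the sample reproducible:

```python
import random

# Sketch of defensible sample selection: seeded random or systematic.
# Recording the method and seed in the test memo makes the sample
# reproducible by the reviewer, which haphazard selection never is.

def select_sample(population: list, n: int, method: str = "random", seed: int = 2026):
    if method == "random":
        rng = random.Random(seed)                 # record the seed in the memo
        return sorted(rng.sample(population, n))
    if method == "systematic":
        interval = len(population) // n           # record interval and start
        start = random.Random(seed).randrange(interval)
        return [population[start + i * interval] for i in range(n)]
    raise ValueError("haphazard is not a selection method")

population = [f"RECON-{i:03d}" for i in range(1, 251)]   # ~250 daily recons
print(select_sample(population, 25, "systematic"))
```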

Workpapers are evidence too. AS 1215 requires that workpapers contain "sufficient information to enable an experienced auditor, having no previous connection with the engagement, to understand the nature, timing, extent, and results of the procedures performed, evidence obtained, and conclusions reached." If a stranger can't reperform the test from your workpaper, the workpaper is incomplete — even if the conclusion happens to be right.

IV.
Layer 04 — Cross-framework

The same control, tested by seven frameworks

Use this as the seed for a unified evidence-collection model.

SOX domain | SOC 2 (TSC) | ISO 27001:2022 | NIST CSF 2.0 | PCI DSS v4.0.1 | HIPAA | HITRUST v11 | Shared evidence
Logical access — JML & reviews | CC6.1 · CC6.2 · CC6.3 | A.5.15 · A.5.16 · A.5.18 · A.8.2 | PR.AA-01 · PR.AA-05 | Req 7 · Req 8 | §164.308(a)(3) · §164.308(a)(4) | 01.b · 01.c · 01.v | JML tickets, UAR exports, term tickets, IAM config
Privileged access & SoD | CC6.1 · CC6.3 | A.5.15 · A.8.2 | PR.AA-05 · PR.PS-01 | Req 7.2 · Req 8.2 | §164.308(a)(4) | 01.q · 01.v | PAM logs, SoD ruleset, role-conflict report
Change management | CC8.1 | A.8.31 · A.8.32 | PR.PS-06 · ID.RA-07 | Req 6.5 | §164.308(a)(8) | 10.h · 09.b | Change tickets, CAB minutes, PR approvals, deploy logs
SDLC & secure dev | CC8.1 | A.8.25–A.8.28 | PR.PS-06 | Req 6.2 · Req 6.3 | — | 10.a · 10.b | SAST/DAST reports, code review records, pen test report
Backup, jobs & ops | A1.2 · CC7.2 | A.8.13 · A.8.14 · A.8.16 | PR.DS-11 · DE.CM-01 | Req 10 · Req 12.10 | §164.308(a)(7) | 09.l · 09.k | Backup logs, restore-test evidence, scheduler runs
Incident management | CC7.3–CC7.5 | A.5.24–A.5.28 | RS.MA · RS.AN · RS.CO | Req 12.10 | §164.308(a)(6) | 11.a–11.c | Incident tickets, post-incident reviews, comms log
Vendor / TPRM | CC9.2 | A.5.19–A.5.23 | GV.SC (full) | Req 12.8 · Req 12.9 | §164.308(b) · BAAs | 05.k | Vendor inventory, due-diligence pkg, contracts, SOC 2 reports
Risk mgmt governance | CC3.1–CC3.4 | Cl. 6.1 · A.5.1–A.5.4 | GV.RM · ID.RA | Req 12.3 | §164.308(a)(1)(ii)(A–B) | 03.a · 03.c | Risk register, RCSA, AC minutes, treatment plans
Encryption — at rest & in transit | CC6.1 · CC6.7 | A.8.24 | PR.DS-01 · PR.DS-02 | Req 3 · Req 4 | §164.312(a)(2)(iv) · §164.312(e)(2)(ii) | 10.f · 09.s | KMS config, TLS scan, cert inventory, key rotation logs
Logging & monitoring | CC7.1 · CC7.2 | A.8.15 · A.8.16 | DE.CM-01 · DE.CM-09 | Req 10 | §164.308(a)(1)(ii)(D) · §164.312(b) | 09.aa · 09.ab | SIEM rules, log retention config, alert tuning evidence
Vulnerability mgmt | CC7.1 | A.8.8 | ID.RA-01 · PR.PS-02 | Req 6.3 · Req 11.3 | §164.308(a)(1)(ii)(B) | 10.k · 10.m | Scan reports, patch SLAs, exception register
BCDR / resilience | A1.2 · A1.3 | A.5.29 · A.5.30 | RC.RP · RC.CO | Req 12.10 | §164.308(a)(7) | 12.b · 12.c | BIA, DR plan, last-test report, RTO/RPO documentation
Data classification | CC6.1 · C1.1 | A.5.12 · A.5.13 | ID.AM-07 | Req 3.2 · Req 9.4 | §164.514 | 06.c | Data inventory, classification policy, labeling tool config
Physical security | CC6.4 | A.7.1–A.7.14 | PR.AA-06 · PR.IR-02 | Req 9 | §164.310 | 08.b · 08.j | Badge logs, CCTV retention, visitor log
Awareness & training | CC1.4 | A.6.3 | PR.AT-01 · PR.AT-02 | Req 12.6 | §164.308(a)(5) | 02.e · 02.f | Completion reports, phishing test results, attestations
Secure config baselines | CC6.6 · CC6.8 | A.8.9 | PR.PS-01 | Req 2 | §164.312(a)(1) | 09.h · 10.b | CIS benchmark reports, hardening guides, drift detection
Vendor offboarding | CC9.2 | A.5.20 · A.5.22 | GV.SC-04 | Req 12.8.4 | BAA termination clauses | 05.k | Termination ticket, data-return cert, access-removal log
HR / personnel security | CC1.4 · CC1.5 | A.6.1 · A.6.2 · A.6.4 | PR.AA-04 | Req 12.7 | §164.308(a)(3) | 02.a · 02.b · 02.c | Background-check records, NDA, sanction policy

Why this is the highest-leverage page in the Atlas

The crosswalk is not a curiosity — it is the blueprint for a unified evidence model. When access reviews are mapped to SOX ITGC, SOC 2 CC6.3, ISO A.5.18, NIST PR.AA-05, PCI Req 7, HIPAA §164.308(a)(4), and HITRUST 01.v simultaneously, you collect one evidence package and satisfy seven audits. This is the foundational move of GRC engineering.
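The shape of that move is small. A sketch of one control carrying its seven mappings and one evidence package, using the access-review row from the crosswalk (the evidence names are assumptions, not prescriptions):

```python
# One control, one evidence pull, seven mappings: a minimal shape for
# the unified evidence model. IDs come from the crosswalk row above.

ACCESS_REVIEW = {
    "control": "Quarterly user access review (SOX ITGC, Access pillar)",
    "mappings": {
        "SOX":     ["ITGC / Access"],
        "SOC 2":   ["CC6.3"],
        "ISO":     ["A.5.18"],
        "NIST":    ["PR.AA-05"],
        "PCI":     ["Req 7"],
        "HIPAA":   ["§164.308(a)(4)"],
        "HITRUST": ["01.v"],
    },
    "evidence": ["UAR export", "reviewer sign-off", "exception log",
                 "IPE test of the entitlement extract"],
}

for framework, refs in ACCESS_REVIEW["mappings"].items():
    print(f"{framework}: {refs} <- same evidence package, framework-specific test")
```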

The trap is treating the mappings as identities. They are overlaps, not equivalencies. SOX cares about financial reporting reliability; SOC 2 cares about service commitments; HIPAA cares about ePHI protection; PCI cares about cardholder data. The same access review will be tested with different sample selection criteria, different population definitions, and different deficiency thresholds. Crosswalk to collect evidence once. Test it through each framework's lens.

One control. One evidence pull. Seven audits. Seven tests. Don't confuse the engineering win for an audit win.

The shared-evidence column is where this gets actionable. If you can name the ticket, the report, the screenshot, the log file that satisfies the control — and you can produce it on demand, with IPE testing applied — you've built a control that travels across frameworks. That's the unit of work in modern GRC engineering.