A four-layer reference for SOX as practiced — not as recited from a textbook. Each layer answers a different question the auditor must answer in the room. Edition 2026.2 expands every layer with the detail second-year SOX leads keep asking for.
Internal audit owns the program; external audit forms an opinion on it. Same lifecycle, different objectives, different independence rules, different consequences.
Scoping is the leverage point. Most second-year SOX programs over-scope (every system in IT) or under-scope (skip a sub-ledger because "it's small"). The discipline is the top-down approach in AS 2201 ¶21–34: start at the financial statements, walk down to significant accounts and disclosures, then to relevant assertions, then to the controls that address them. If you can't trace a control back to an assertion, it doesn't belong in scope.
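In code terms, the scoping discipline reduces to a traceability check. A minimal Python sketch, where the `Control` dataclass, the assertion list, and the sample inventory are illustrative assumptions rather than any real GRC schema:

```python
# Minimal sketch of the top-down scoping test described above.
# All names (Control, ASSERTIONS, the sample inventory) are illustrative.
from dataclasses import dataclass

ASSERTIONS = {"existence", "completeness", "accuracy", "valuation",
              "rights_obligations", "presentation"}

@dataclass
class Control:
    control_id: str
    significant_account: str | None   # e.g. "revenue", "inventory"
    relevant_assertion: str | None    # one of ASSERTIONS, or None

def in_scope(control: Control) -> bool:
    """A control belongs in SOX scope only if it traces to a significant
    account AND a relevant assertion: the top-down test."""
    return (control.significant_account is not None
            and control.relevant_assertion in ASSERTIONS)

inventory = [
    Control("C-101", "revenue", "completeness"),   # traces: keep
    Control("C-207", "revenue", None),             # no assertion: cut
    Control("C-311", None, "accuracy"),            # no account: cut
]

for c in inventory:
    print(c.control_id, "in scope" if in_scope(c) else "descope candidate")
```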
Bridges between lanes are where audits succeed or fail. When IA and EA share the same scoping memo, walkthrough narratives, and deficiency log, the external audit gets cheaper and the management assertion gets stronger. When they don't, EA reperforms everything from scratch and management ends up with surprises in October.
Reliance on internal audit (AS 2605 / AS 2201 ¶16–19) is not automatic. The external auditor evaluates IA's competence (qualifications, training, supervision) and objectivity (organizational status, reporting line to the audit committee, scope of activities). Even when both pass, the EA cannot rely on IA for high-risk areas — fraud, period-end close, entity-level controls (ELCs) at the principle level — and must perform sufficient direct testing.
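The reliance decision reduces to two gates and a carve-out. A hedged sketch, with `HIGH_RISK_AREAS` and the boolean inputs as illustrative stand-ins for the actual competence and objectivity evaluations:

```python
# Sketch of the reliance gate described above. The area labels and
# field names are illustrative assumptions, not firm methodology.
HIGH_RISK_AREAS = {"fraud", "period_end_close", "elc_principle_level"}

def can_rely_on_ia(area: str, competent: bool, objective: bool) -> bool:
    """EA may use IA's work only if IA passes BOTH the competence and
    objectivity evaluations, and the area is not high-risk, where the
    external auditor must perform sufficient direct testing regardless."""
    if area in HIGH_RISK_AREAS:
        return False                 # direct EA testing, no reliance
    return competent and objective   # both gates must pass

print(can_rely_on_ia("inventory_counts", competent=True, objective=True))  # True
print(can_rely_on_ia("fraud", competent=True, objective=True))             # False
```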
The aggregation question is where careers turn. Three SDs in revenue recognition, individually rated significant, may aggregate to a material weakness. AS 2201 ¶68 says the auditor must consider whether deficiencies, "when considered in combination with other deficiencies affecting the same significant account or disclosure, relevant assertion, or component of internal control," constitute a material weakness. Many MWs in PCAOB inspection findings turn out to be aggregation calls that the firm got wrong — not single catastrophic failures.
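A rough sketch of the aggregation check, assuming a house rule (three or more SDs touching one significant account triggers a combined evaluation) that is an illustration, not a bright line in the standard:

```python
# Illustrative sketch of the AS 2201 ¶68 aggregation check. The trigger
# threshold is an assumed house rule; judgment is still required.
from collections import defaultdict

deficiencies = [  # (deficiency_id, significant_account, severity)
    ("D-01", "revenue_recognition", "significant"),
    ("D-02", "revenue_recognition", "significant"),
    ("D-03", "revenue_recognition", "significant"),
    ("D-04", "payroll", "deficiency"),
]

by_account = defaultdict(list)
for def_id, account, severity in deficiencies:
    if severity == "significant":
        by_account[account].append(def_id)

for account, sds in by_account.items():
    if len(sds) >= 3:   # assumed escalation trigger
        print(f"{account}: {sds} -> evaluate in combination as potential MW")
```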
The mindmap is hierarchical, not flat. Entity-level controls set the tone: if the audit committee doesn't function, no process control compensates. ITGCs are pervasive — they earn the right to rely on automated controls. If your access management or change management fails, every ITAC downstream is presumed unreliable, and you fall back on business process controls (manual reviews) to plug the gap, with much larger sample sizes.
This is why ITGC failures are rarely "one finding" — they cascade. A deficiency in privileged access review can disqualify the three-way-match ITAC, which forces the auditor to test the manual P.O. approval at population level instead of one item per quarter. The hours quintuple, the partner is unhappy, and the client is asked for evidence they don't have ready.
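The cascade is easy to model as a dependency graph. A sketch with hypothetical control names and sample sizes; the real mapping lives in your risk-and-controls matrix:

```python
# Sketch of the ITGC -> ITAC reliance cascade described above.
# Control names and sample sizes are hypothetical illustrations.
ITAC_DEPENDS_ON = {
    "three_way_match": ["privileged_access_review", "change_management"],
    "auto_posting":    ["change_management"],
}
FAILED_ITGCS = {"privileged_access_review"}   # deficiency noted this cycle

for itac, itgcs in ITAC_DEPENDS_ON.items():
    broken = [g for g in itgcs if g in FAILED_ITGCS]
    if broken:
        # Reliance on the automated control is lost; fall back to the
        # manual business-process control with a much larger sample.
        print(f"{itac}: ITGC failure {broken} -> test manual fallback, "
              f"sample ~25/quarter instead of 1/quarter")
    else:
        print(f"{itac}: ITGCs effective -> test-of-one on the automation")
```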
Anti-patterns are the pattern. The drill-down above lists the failure modes that show up in PCAOB inspection findings year after year. None are exotic. They are the boring controls that get checked off in green when the underlying evidence wouldn't survive a serious walkthrough. The most useful question a SOX lead can ask their team is: "If the auditor pulled this control's evidence today, what would actually be there?"
The four-eyes problem. "Reviewed by" is not a control. The reviewer must demonstrate what they reviewed, against what criteria, and what they would do if it failed. A signature on a recon with no tickmarks, no exception list, no challenge evidence is a deficient management review control (MRC) — and MRCs are where most modern restatements actually originate.
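As a self-test, the three evidentiary elements can be checked mechanically. A minimal sketch with illustrative field names:

```python
# Sketch of the MRC evidence test implied above. Field names are
# illustrative; the point is that a bare signature fails all three checks.
REQUIRED_MRC_EVIDENCE = {
    "what_was_reviewed",     # population / report actually examined
    "criteria_applied",      # thresholds, tickmarks, exception definitions
    "follow_up_on_failures", # exception list and resolution trail
}

def mrc_is_supportable(evidence: set[str]) -> bool:
    return REQUIRED_MRC_EVIDENCE <= evidence   # subset test: all three present

print(mrc_is_supportable({"signature"}))                           # False
print(mrc_is_supportable({"what_was_reviewed", "criteria_applied",
                          "follow_up_on_failures", "signature"}))  # True
```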
IPE (information produced by the entity) is the silent killer. Auditors get burned more often by reports than by controls. If management's review uses a system-generated aging report, you must test the report's completeness and accuracy. If the report parameters are user-modifiable, you may need to reperform the query yourself. The IPE diagram above is the protocol — skip any step and the test loses its evidentiary basis.
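A sketch of that protocol as a checklist, with step names that are assumptions standing in for the diagram:

```python
# Sketch of the IPE protocol referenced above ("the IPE diagram").
# The step names are illustrative assumptions; skip none of them.
IPE_STEPS = [
    "identify_source_system_and_report_logic",
    "test_completeness",             # row counts / control totals vs. source
    "test_accuracy",                 # trace a sample of rows to source records
    "validate_parameters",           # date range, filters, entity selection
    "reperform_if_user_modifiable",  # rerun the query when params are editable
]

def ipe_is_reliable(steps_done: list[str]) -> bool:
    """Every step must be performed; a skipped step voids the evidence."""
    return all(step in steps_done for step in IPE_STEPS)

done = IPE_STEPS[:-1]          # reperformance skipped
print(ipe_is_reliable(done))   # False -> the test loses its evidentiary basis
```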
Sample sizes are floors, not ceilings. The matrix above is baseline guidance. For higher-risk controls, scale up. For controls with prior-year exceptions, increase coverage by at least 50%. For sample selection, use random or systematic — never haphazard. Document the selection method in the test memo or your sample is challengeable.
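The scaling rules compose mechanically. A hedged sketch, where the baseline table is an assumed example of a common 25-per-year convention, not the matrix above:

```python
# Sketch of the sampling guidance above. Baseline sizes and the 1.5x
# risk scale-up are assumed examples, not this document's matrix.
import math
import random

BASELINE = {"daily": 25, "weekly": 5, "monthly": 2, "quarterly": 2, "annual": 1}

def sample_size(frequency: str, higher_risk: bool, prior_exceptions: bool) -> int:
    n = BASELINE[frequency]
    if higher_risk:
        n = math.ceil(n * 1.5)   # scale up for higher-risk controls (assumed factor)
    if prior_exceptions:
        n = math.ceil(n * 1.5)   # "increase coverage by at least 50%"
    return n

def systematic_selection(population: list, n: int) -> list:
    """Systematic: fixed interval from a random start. Documentable in the
    test memo, unlike haphazard picking."""
    interval = max(len(population) // n, 1)
    start = random.randrange(interval)
    return population[start::interval][:n]

pop = list(range(1, 251))   # 250 control occurrences in the period
print(sample_size("daily", higher_risk=True, prior_exceptions=True))  # 57
print(systematic_selection(pop, 25))
```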
Workpapers are evidence too. AS 1215 requires that workpapers contain "sufficient information to enable an experienced auditor, having no previous connection with the engagement, to understand the nature, timing, extent, and results of the procedures performed, evidence obtained, and conclusions reached." If a stranger can't reperform the test from your workpaper, the workpaper is incomplete — even if the conclusion happens to be right.
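The "experienced auditor" test can be run as a completeness check before signoff. A minimal sketch with illustrative field names:

```python
# Sketch of an AS 1215 self-check: could an experienced auditor with no
# previous connection reperform the test from the workpaper alone?
# Field names are illustrative.
REQUIRED = {"nature", "timing", "extent", "results",
            "evidence_obtained", "conclusion"}

def workpaper_gaps(wp: dict) -> list[str]:
    """Return the missing elements; an empty list means reperformable."""
    return sorted(REQUIRED - {k for k, v in wp.items() if v})

wp = {"nature": "UAR inspection", "timing": "Q3", "extent": "25 of 250",
      "results": "no exceptions", "evidence_obtained": None,  # not attached!
      "conclusion": "operating effectively"}
print(workpaper_gaps(wp))   # ['evidence_obtained'] -> incomplete, even
                            # though the conclusion happens to be right
```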
| SOX domain | SOC 2 (TSC) | ISO 27001:2022 | NIST CSF 2.0 | PCI DSS v4.0.1 | HIPAA | HITRUST v11 | Shared evidence |
|---|---|---|---|---|---|---|---|
| Logical access — JML & reviews | CC6.1 · CC6.2 · CC6.3 | A.5.15 · A.5.16 · A.5.18 · A.8.2 | PR.AA-01 · PR.AA-05 | Req 7 · Req 8 | §164.308(a)(3) · §164.308(a)(4) | 01.b · 01.c · 01.v | JML tickets, UAR exports, term tickets, IAM config |
| Privileged access & SoD | CC6.1 · CC6.3 | A.5.15 · A.8.2 | PR.AA-05 · PR.PS-01 | Req 7.2 · Req 8.2 | §164.308(a)(4) | 01.q · 01.v | PAM logs, SoD ruleset, role-conflict report |
| Change management | CC8.1 | A.8.32 · A.8.31 | PR.PS-06 · ID.RA-07 | Req 6.5 | §164.308(a)(8) | 10.h · 09.b | Change tickets, CAB minutes, PR approvals, deploy logs |
| SDLC & secure dev | CC8.1 | A.8.25–A.8.28 | PR.PS-06 | Req 6.2 · Req 6.3 | — | 10.a · 10.b | SAST/DAST reports, code review records, pen test report |
| Backup, jobs & ops | A1.2 · CC7.2 | A.8.13 · A.8.14 · A.8.16 | PR.DS-11 · DE.CM-01 | Req 10 · Req 12.10 | §164.308(a)(7) | 09.l · 09.k | Backup logs, restore-test evidence, scheduler runs |
| Incident management | CC7.3–CC7.5 | A.5.24–A.5.28 | RS.MA · RS.AN · RS.CO | Req 12.10 | §164.308(a)(6) | 11.a–11.c | Incident tickets, post-incident reviews, comms log |
| Vendor / TPRM | CC9.2 | A.5.19–A.5.23 | GV.SC (full) | Req 12.8 · Req 12.9 | §164.308(b) · BAAs | 05.k | Vendor inventory, due-diligence pkg, contracts, SOC 2 reports |
| Risk mgmt governance | CC3.1–CC3.4 | Cl. 6.1 · A.5.1–A.5.4 | GV.RM · ID.RA | Req 12.3 | §164.308(a)(1)(ii)(A–B) | 03.a · 03.c | Risk register, RCSA, AC minutes, treatment plans |
| Encryption — at rest & in transit | CC6.1 · CC6.7 | A.8.24 | PR.DS-01 · PR.DS-02 | Req 3 · Req 4 | §164.312(a)(2)(iv) · §164.312(e)(2)(ii) | 10.f · 09.s | KMS config, TLS scan, cert inventory, key rotation logs |
| Logging & monitoring | CC7.1 · CC7.2 | A.8.15 · A.8.16 | DE.CM-01 · DE.CM-09 | Req 10 | §164.308(a)(1)(ii)(D) · §164.312(b) | 09.aa · 09.ab | SIEM rules, log retention config, alert tuning evidence |
| Vulnerability mgmt | CC7.1 | A.8.8 | ID.RA-01 · PR.PS-02 | Req 6.3 · Req 11.3 | §164.308(a)(1)(ii)(B) | 10.k · 10.m | Scan reports, patch SLAs, exception register |
| BCDR / resilience | A1.2 · A1.3 | A.5.29 · A.5.30 | RC.RP · RC.CO | Req 12.10 | §164.308(a)(7) | 12.b · 12.c | BIA, DR plan, last-test report, RTO/RPO documentation |
| Data classification | CC6.1 · C1.1 | A.5.12 · A.5.13 | ID.AM-07 | Req 3.2 · Req 9.4 | §164.514 | 06.c | Data inventory, classification policy, labeling tool config |
| Physical security | CC6.4 | A.7.1–A.7.14 | PR.AA-06 · PR.IR-02 | Req 9 | §164.310 | 08.b · 08.j | Badge logs, CCTV retention, visitor log |
| Awareness & training | CC1.4 | A.6.3 | PR.AT-01 · PR.AT-02 | Req 12.6 | §164.308(a)(5) | 02.e · 02.f | Completion reports, phishing test results, attestations |
| Secure config baselines | CC6.6 · CC6.8 | A.8.9 | PR.PS-01 | Req 2 | §164.312(a)(1) | 09.h · 10.b | CIS benchmark reports, hardening guides, drift detection |
| Vendor offboarding | CC9.2 | A.5.20 · A.5.22 | GV.SC-04 | Req 12.8.4 | BAA termination clauses | 05.k | Termination ticket, data-return cert, access-removal log |
| HR / personnel security | CC1.4 · CC1.5 | A.6.1 · A.6.2 · A.6.4 | PR.AA-04 | Req 12.7 | §164.308(a)(3) | 02.a · 02.b · 02.c | Background-check records, NDA, sanction policy |
The crosswalk is not a curiosity — it is the blueprint for a unified evidence model. When access reviews are mapped to SOX ITGC, SOC 2 CC6.3, ISO A.5.18, NIST PR.AA-05, PCI Req 7, HIPAA §164.308(a)(4), and HITRUST 01.v simultaneously, you collect one evidence package and satisfy seven audits. This is the foundational move of GRC engineering.
The trap is treating the mappings as identities. They are overlaps, not equivalencies. SOX cares about financial reporting reliability; SOC 2 cares about service commitments; HIPAA cares about ePHI protection; PCI cares about cardholder data. The same access review will be tested with different sample selection criteria, different population definitions, and different deficiency thresholds. Crosswalk to collect evidence once. Test it through each framework's lens.
The shared-evidence column is where this gets actionable. If you can name the ticket, the report, the screenshot, the log file that satisfies the control — and you can produce it on demand, with IPE testing applied — you've built a control that travels across frameworks. That's the unit of work in modern GRC engineering.
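A sketch of that unit of work as a data structure: one evidence package, many mappings, one test lens per framework. The control IDs come from the crosswalk above; the package shape and lens parameters are illustrative assumptions:

```python
# Sketch of the unified evidence model: collect once, test per lens.
# IDs come from the crosswalk table; the structure itself is assumed.
ACCESS_REVIEW_PACKAGE = {
    "evidence": ["UAR export", "JML tickets", "IAM config snapshot"],
    "mappings": {
        "SOX ITGC":  "logical-access review",
        "SOC 2":     "CC6.3",
        "ISO 27001": "A.5.18",
        "NIST CSF":  "PR.AA-05",
        "PCI DSS":   "Req 7",
        "HIPAA":     "§164.308(a)(4)",
        "HITRUST":   "01.v",
    },
    # Overlap, not equivalence: each framework keeps its own test lens.
    "test_lens": {
        "SOX ITGC": {"population": "financially relevant systems", "samples": 25},
        "PCI DSS":  {"population": "CDE components only", "samples": "all"},
    },
}

for framework, control in ACCESS_REVIEW_PACKAGE["mappings"].items():
    lens = ACCESS_REVIEW_PACKAGE["test_lens"].get(framework)
    print(f"{framework} {control}: collect once, "
          f"test with {lens or 'framework defaults'}")
```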