Volume II · SOC 2 · Edition 2026.1

The Compliance Atlas

Authoritative refs
AICPA TSP-100 (2017 + 2022 PoF)
SSAE 18 · AT-C 105 · AT-C 205
Verified May 12, 2026

SOC 2 is the audit companies least understand on first contact. It is not a checklist, not a certification, and not a guarantee. It is an examination engagement in which a CPA firm forms an opinion on whether your controls met the Trust Services Criteria you elected, over a period you defined, applied to a system you described.

Reading the Atlas

Legend: Internal — readiness & ops · External — CPA firm · Bridge — hand-off

SOC 2 has no regulator. The pressure comes from your customers' procurement teams. The discipline comes from your CPA firm under SSAE 18 and the AICPA's attestation standards.

I.
Layer 01 — Lifecycle

Readiness & examination, walked side by side

SSAE 18 · AT-C 105 / 205
SOC 2 Guide (AICPA, Nov 2022)

The full SOC 2 cycle — from readiness to bridge letter

PROGRAM TIMELINE — Year 0 readiness → pre-audit Type 1 (point-in-time) → period start → during period → period end → report → bridge / next year

INTERNAL — OWNS THE CONTROL ENVIRONMENT
— Scope & criteria selection: Security + elected Availability / Processing Integrity / Confidentiality / Privacy
— Gap assessment vs. TSC + Points of Focus (readiness assessor optional)
— Control implementation: policies · tools · evidence trail (most of the work happens here)
— System Description: Section III draft · DC §200 disclosure
— Type 1 walkthrough: design effectiveness, point-in-time
— Operating during period: 3–12 months of continuous evidence accrual
— Management assertion: Section II, signed (CEO/CISO sign-off)
— CUEC inventory: customer responsibilities disclosed to user entities
— Sub-service organizations: carve-out vs. inclusive · vendor SOC 2s collected
— Evidence collection: automated where possible (Vanta · Drata · Secureframe)
— Vendor management: TPRM cadence · CC9.2 evidence stream
— Quarterly mini-reviews: control health checks to catch drift before the audit
— Management response to test exceptions: Section V, if elected
— Bridge letter: gap-period attestation · customer asks, you provide

EXTERNAL — CPA EXAMINATION ENGAGEMENT
— Firm selection: CPA independence · AT-C 105 ¶15
— Engagement letter: scope · TSCs · period · AT-C 205 ¶09
— Planning & risk: understand the system · AT-C 205 ¶17–25
— Walkthrough: test of design · Type 1 stops here
— Test of operating effectiveness: over the period · Type 2 only
— Evaluate exceptions: qualitative + quantitative · AT-C 205 ¶63
— Opinion + report: unmodified · qualified · adverse · disclaimed · AT-C 205 ¶65–69
— Engagement quality review: EQR partner · internal firm review
— PBC / RFI list: ~150–400 items via auditor portal
— Sub-service procedures: read sub-org SOC 2s · CSOC procedures
— Sample selection: judgment-based · AICPA AAG-COM
— Subsequent events: through report date · AT-C 205 ¶57
— Management representation letter: required · AT-C 205 ¶50
— Workpaper retention: 5 years (AICPA) vs. 7 years (PCAOB)

REPORT ASSEMBLY · SECTIONS I–V — opinion · assertion · system description · tests of controls · management response

Hand-offs: System Description → auditor's foundation · period evidence → auditor sampling · vendor SOC 2s → CSOC procedures · management assertion → opinion paragraph

Type 1 vs Type 2 — what your customer is really asking for

Type 1 — design effectiveness, point-in-time
Controls "suitably designed" as of the report date, a single moment.
When to use it:
— First SOC 2 ever; initial gate; you're <6 months into the program
— Customer is a low-stakes prospect who'll accept a Type 1 to unblock a deal
— You need a stake in the ground before pursuing Type 2
Limitation: only limited operational evidence — controls observed at a single date.

Type 2 — operating effectiveness, over a period
Controls operated effectively across the entire window from period start to period end.
When to use it:
— Mature program; what your customers' procurement teams will actually require
— First Type 2: usually a 3-month period. Subsequent: 6 or 12 months
— Bridge letter covers the gap from the prior period end to the current report date
Reality: this is the report that closes deals.

The reality of a SOC 2 program

SOC 2 is not certified — it is opined upon. There is no "passing" a SOC 2 the way you pass an ISO 27001 certification audit. The CPA firm forms an opinion on whether your controls met the criteria. That opinion can be unqualified (clean), qualified (one or more exceptions), adverse (controls did not meet the criteria), or disclaimed (auditor couldn't gather enough evidence). Most reports are unqualified, but many ship with documented exceptions in Section IV.

Scope is yours to define. You pick the Trust Services Criteria. Security is always required. Availability, Processing Integrity, Confidentiality, and Privacy are elected based on what you commit to your customers. Two SOC 2s for two companies in the same industry can therefore be entirely different reports. This flexibility is a feature for the practitioner and a confusion for the buyer.

The system description is the underrated artifact. Section III of the report describes the system being audited — its boundaries, its components, its data flows, its control environment, the CUECs your customers must implement, and the sub-service organizations you depend on. The auditor's opinion is about the system as you described it. Get the description wrong and the opinion is misleading even if every control passes.

The auditor opines on what you said. Most SOC 2 surprises trace back to a system description that was written aspirationally rather than accurately.

The bridge letter is real and important. Your Type 2 report covers a period that ended (say) Sept 30. It's now January and your customer asks if controls are still operating. The bridge letter — issued by management, not the auditor — attests that no material changes have occurred between Sept 30 and the date of the letter. It is not an audit product, but it is an evidence product. Build the muscle to issue them quickly.
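The gap arithmetic is trivial but worth operationalizing, so the team can answer the question the day it arrives. A minimal Python sketch; the dates and the three-month comfort threshold in the comment are illustrative assumptions, not a standard:

```python
from datetime import date

def bridge_gap_days(period_end: date, letter_date: date) -> int:
    """Days between the Type 2 period end and the bridge letter date.

    Management attests that no material control changes occurred in
    this window. In practice (an assumption, not a rule), customers
    grow skeptical once the gap stretches much beyond a quarter.
    """
    if letter_date < period_end:
        raise ValueError("bridge letter cannot predate the period end")
    return (letter_date - period_end).days

# e.g. a Sept 30 period end and a Jan 15 letter:
gap = bridge_gap_days(date(2025, 9, 30), date(2026, 1, 15))  # 107 days
```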

II.
Layer 02 — Control universe

Five categories, thirty-three Common Criteria

AICPA TSP Section 100
(2017 + 2022 Revised Points of Focus)

The five Trust Services Criteria — Security is mandatory; the rest are elected

Security — MANDATORY — CC1–CC9 · 33 criteria · required for every SOC 2

Availability — ELECTED — 3 criteria: A1.1 capacity · A1.2 monitoring · A1.3 recovery procedures · BCDR · uptime SLA
Test: DR test evidence · capacity reports · monitoring dashboards
Elect when: SaaS with an uptime SLA

Confidentiality — ELECTED — 2 criteria: C1.1 identification & protection · C1.2 disposal of confidential information · classification · disposal
Test: data classification policy · access controls · destruction logs
Elect when: customer data is "confidential"

Processing Integrity — ELECTED — 5 criteria: PI1.1 specifications · PI1.2 inputs · PI1.3 processing · PI1.4 outputs · PI1.5 stored data integrity · most rarely elected
Test: input validation · output reconciliation
Elect when: financial / transaction processing

Privacy — ELECTED — 8 criteria: P1 notice · P2 choice · P3 collection · P4 use · P5 access · P6 disclosure · P7 quality · P8 monitoring · GAPP-aligned
Test: privacy notices · consent records
Elect when: PII is collected from individuals

Each elected category brings additional criteria layered on top of the 33 Common Criteria. They share evidence, but the auditor tests each separately.

The 33 Common Criteria — Security's nine families

CC1 — Control Environment (5 criteria): CC1.1 integrity & ethical values · CC1.2 board oversight · CC1.3 reporting structure · CC1.4 competence · CC1.5 accountability
CC2 — Communication & Information (3 criteria): CC2.1 quality information · CC2.2 internal communication · CC2.3 external communication
CC3 — Risk Assessment (4 criteria): CC3.1 specify objectives · CC3.2 identify risks · CC3.3 fraud risks · CC3.4 changes assessed
CC4 — Monitoring (2 criteria): CC4.1 ongoing & separate evaluations · CC4.2 deficiencies communicated
CC5 — Control Activities (3 criteria): CC5.1 select & develop activities · CC5.2 technology controls · CC5.3 policies & procedures
CC6 — Logical & Physical Access (8 criteria — largest): CC6.1 access management · CC6.2 registration · CC6.3 modification · CC6.4 physical · CC6.5 disposal · CC6.6 boundary · CC6.7 transmission · CC6.8 malware prevention
CC7 — System Operations (5 criteria): CC7.1 detect changes · CC7.2 monitor anomalies · CC7.3 evaluate security events · CC7.4 incident response · CC7.5 recovery
CC8 — Change Management (1 criterion): CC8.1 authorize, design, develop, configure, document, test, approve, and deploy infrastructure and software changes
CC9 — Risk Mitigation (2 criteria): CC9.1 BCM — identify disruptions · CC9.2 vendor management — TPRM

PRACTITIONER NOTE · WHERE THE WORK ACTUALLY LIVES
CC6 (Logical & Physical) holds 24% of all Common Criteria — and roughly 50% of audit time. Access management, access changes, MFA, encryption, and boundary protection are where the auditor will spend their walkthrough hours. CC1 (Control Environment) is where the program either has tone or doesn't — entity-level controls cascade. CC8 has only one criterion but tests all of change management. CC9.2 (vendor management) has become the single largest area of new work as customers push TPRM requirements down onto their service providers. The 2022 Revised Points of Focus added significant detail to CC2.3 (external communications) and CC3.4 (change risk).

Why SOC 2 looks easy and is not

SOC 2 is principles-based, which means flexible — and which means traps. The TSC tells you that "the entity restricts logical access" (CC6.1). It does not tell you how. You design the controls; the auditor evaluates whether they meet the criterion. Two startups can satisfy CC6.1 entirely differently — one with strict role-based access, another with attribute-based. Both can pass. Both can also fail when their evidence doesn't match what the description claims.
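To illustrate that flexibility, here are two hypothetical access checks, one role-based and one attribute-based, either of which could underpin a CC6.1 control. Every name, role, and attribute below is invented for the example:

```python
# Two illustrative access checks that could each satisfy CC6.1
# ("the entity restricts logical access") — the criterion fixes the
# outcome, not the mechanism. All identifiers here are hypothetical.

ROLE_GRANTS = {
    "engineer": {"repo:read", "repo:write"},
    "support":  {"tickets:read"},
}

def rbac_allowed(role: str, permission: str) -> bool:
    # Role-based: access follows from a centrally assigned role.
    return permission in ROLE_GRANTS.get(role, set())

def abac_allowed(user: dict, resource: dict) -> bool:
    # Attribute-based: access follows from user and resource attributes.
    return (user["department"] == resource["owning_department"]
            and user["clearance"] >= resource["sensitivity"])
```

Either design can pass the audit; what the auditor evaluates is whether the evidence (grants, logs, reviews) matches the mechanism the description claims.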

Points of Focus are the hidden teeth. Each Common Criterion is accompanied by Points of Focus — illustrative considerations the auditor uses to evaluate whether you've actually met the criterion. The 2022 Revised Points of Focus expanded these meaningfully. They aren't requirements, but if your control doesn't address the relevant Points of Focus, the auditor will ask why. If you can't answer, that's a gap.

The Common Criteria are the floor; the elected categories raise it. Most companies start Security-only. Adding Availability requires real BCM evidence. Adding Confidentiality requires data classification you may not have. Adding Privacy requires privacy notices and consent records, which trigger an entirely different program. Add categories deliberately, not aspirationally.

The hardest SOC 2 questions aren't "do you have a control?" They are "does this evidence convince me you ran it the way the description says you ran it?"

III.
Layer 03 — Evidence

Period of coverage, sampling, and the description

AICPA AAG-COM (Audit Sampling)
AT-C 205 ¶33–46

SOC 2 sampling — judgment-based, period-aware

Control type | Frequency | 3-mo period | 6-mo period | 12-mo period | Primary evidence | Technique
Automated (configuration · system) | Continuous | 1 | 1 | 1 + change history over period | Configuration export, change-history report | Inspection + reperformance
Daily manual | Daily | 15 | 25 | 25 | Recon, reviewer sign-off | Inspection
Weekly | Weekly | 5 | 10 | 15 | Patch reports, vuln scan reviews | Inspection
Monthly | Monthly | 2 | 3 | 4–5 | Backup logs, vendor reviews | Inspection + inquiry
Quarterly | 4/yr | 1 | 2 | 2 | UAR, access reviews | Inspection
Annual | 1/yr | 0–1 | 1 | 1 (if it occurred in period) | Risk assessment, policy review, DR test | Inspection
Event-driven (JML · changes · incidents) | Per occurrence | 25 / 10% | 25 / 10% | 25 / 15% | Tickets, approvals, term/onboard logs | Inspection + reperformance

The System Description — what the auditor actually evaluates

The description's five parts:
1 · Services / system — what the system is and what it does
2 · Components — infrastructure · software · people · procedures · data
3 · Boundaries — what's in and out of scope
4 · Sub-service organizations — carve-out vs. inclusive · named here
5 · CUECs — customer responsibilities · explicit and tested

DESCRIPTION CRITERION 1 (DC §200) — Is the description fair?
— Does it describe boundaries accurately?
— Are sub-service organizations identified?
— Are CUECs explicit and complete?
— Are control activities described, not just policies?
A material misstatement here = qualified opinion.

CRITERIA 2 & 3 (DC §200) — Are controls suitably designed & operating?
— Did the controls described actually exist (Type 1)?
— Did they operate effectively over the period (Type 2)?
— Were exceptions evaluated and remediated?
— Do the CUECs assumed by the design actually get done?
Failures here = qualified or adverse opinion.

The auditor's opinion paragraph addresses all three criteria. Most practitioners obsess over criterion 3 and forget criterion 1.

CUECs & the chain of trust — the half of SOC 2 nobody draws

CUECs · COMPLEMENTARY USER ENTITY CONTROLS · DC §200 ¶DC.6

YOU · SERVICE ORGANIZATION — "We provide the capability."
— Build the system
— Operate controls within your boundary
— Disclose what customers must do
— Test that those disclosures aren't fictional
e.g., "We support MFA" → CUEC published in Section III

SECTION III · SYSTEM DESCRIPTION — "To rely on this system, user entities must..."
— enforce MFA on their accounts
— review their own user list quarterly
— restrict admin role assignments
— maintain their own incident process
Explicit, testable, complete. The customer reads the SOC 2.

YOUR CUSTOMER · USER ENTITY — "We must do our part."
— Implements each CUEC as a control of their own
— Maps your CUECs to their SOX / SOC 2 / ISO program
— Their auditor tests CUEC implementation
— Without CUEC compliance, your controls don't transfer protection to them

THE TWO-AUDITOR PROBLEM
Your auditor reads the CUECs to evaluate whether the description is fair. Their auditor reads the same CUECs to test their controls. One sentence in your report becomes a control test in someone else's audit.

A WELL-FORMED CUEC
"User entities are responsible for implementing multi-factor authentication for all administrative accounts on the customer-controlled portal within 14 days of account creation."
Specific control. Specific scope. Testable. A user entity auditor can read this and design a control test in 30 seconds.

CUEC ANTI-PATTERNS
— "User entities are responsible for protecting their accounts." Vague: protect from what? how? when?
— "Maintain appropriate security for their environment." Unscoped: covers everything, controls nothing.
— A CUEC list copied verbatim from a template — mentions controls your system doesn't actually require.
— No CUECs at all — the description claims customers contribute nothing. Implausible.

Where SOC 2 evidence work breaks down

The judgment problem. Unlike SOX's prescribed sample sizes from PCAOB AS 2315, SOC 2 sampling under AICPA AAG-COM is judgment-based. Auditors negotiate. A startup with weak history and lots of exceptions will see larger samples than a mature program with three years of clean Type 2s. Don't take a single firm's "industry standard" as gospel — sample-size discussions are part of every engagement.
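The sampling table above can be sketched as a simple lookup. These sizes are illustrative starting points for that negotiation, not AICPA-mandated minimums:

```python
# Sketch of the sampling table above. Under AICPA AAG-COM, sample
# sizes are judgment-based and negotiated per engagement; these
# values mirror the table, not a standard.

SAMPLE_SIZES = {
    # frequency:     (3-mo, 6-mo, 12-mo)
    "automated":     (1,    1,    1),   # plus change history over the period
    "daily":         (15,   25,   25),
    "weekly":        (5,    10,   15),
    "monthly":       (2,    3,    5),   # table says 4–5; upper bound taken
    "quarterly":     (1,    2,    2),
    "annual":        (1,    1,    1),   # 3-mo: 0–1, only if it occurred
}
# Event-driven controls (JML, changes, incidents) are sampled against
# the population of occurrences (e.g. 25 items or 10–15%) and don't
# fit this frequency lookup.

def sample_size(frequency: str, period_months: int) -> int:
    """Illustrative sample size for a control frequency and period."""
    col = {3: 0, 6: 1, 12: 2}[period_months]
    return SAMPLE_SIZES[frequency][col]
```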

CUECs are the chain of trust. The diagram above shows what most SOC 2 teams underestimate: every CUEC you publish becomes a control test in your customer's audit. A vague CUEC ("maintain appropriate security") leaves your customer's auditor with nothing to test, which means your customer cannot rely on your SOC 2 to support their control story. A specific CUEC ("enforce MFA on all admin accounts within 14 days of provisioning") gives them a testable control. The discipline is to write CUECs the way you would want to receive them from a vendor.
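A crude lint for CUEC wording, in the spirit of the anti-patterns above. The vague-phrase list and the timing heuristic are this sketch's assumptions, not an AICPA rule:

```python
import re

# Hypothetical CUEC wording check: flag vague qualifiers and missing
# timing, the two failure modes in the anti-patterns above.
VAGUE = re.compile(
    r"\b(appropriate|adequate|reasonable|as needed"
    r"|protect(?:ing)? their accounts?)\b",
    re.IGNORECASE,
)
TIMING = re.compile(r"\b\d+\s*(?:day|hour|month)s?\b|\bquarterly\b",
                    re.IGNORECASE)

def cuec_findings(text: str) -> list[str]:
    """Return a list of wording problems for one CUEC sentence."""
    findings = []
    if VAGUE.search(text):
        findings.append("vague qualifier — name the control, scope, timing")
    if not TIMING.search(text):
        findings.append("no timing — when must the user entity act?")
    return findings
```

A well-formed CUEC ("...implement multi-factor authentication for all administrative accounts within 14 days of account creation") comes back clean; "maintain appropriate security for their environment" trips both checks.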

Period thinking is foreign to most teams. SOX is point-in-time at year-end. SOC 2 is over a period. A control that operated three months out of six is a Type 2 exception even if it works today. The corollary: if a control was added mid-period, the auditor will only test from its implementation date — and your description must say so explicitly.

The IPE problem applies here too. Reports used as evidence (UAR exports, vulnerability scan results, employee rosters) must be tested for completeness and accuracy. SOC 2 firms vary in how rigorously they test IPE — some are lax, some are PCAOB-strict. Either way, document the source, parameters, and reconciliation for every report.
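One way to force that documentation habit is to make provenance a required part of every evidence record. A sketch with hypothetical field names; the point is the shape, not the schema:

```python
from dataclasses import dataclass

@dataclass
class IPERecord:
    """Provenance for a report used as evidence (IPE).

    Field names are illustrative. Capture at minimum the source
    system, the exact parameters used to generate the report, and an
    independent reconciliation supporting completeness and accuracy.
    """
    report_name: str       # e.g. "Q3 user access review export"
    source_system: str     # system of record the report came from
    parameters: str        # filters / date range used to generate it
    row_count: int         # rows in the export as delivered
    reconciled_count: int  # independently obtained count from the source

    @property
    def complete(self) -> bool:
        # Completeness check: export matches the source population.
        return self.row_count == self.reconciled_count
```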

Automation is a double-edged sword. Vanta, Drata, Secureframe automate evidence collection — wonderful. But auditors increasingly want to see how the automation works (the "control over the control"). If your continuous-monitoring tool flags a deviation and someone clicks "approve" without investigating, that's worse than no automation. The platform doesn't replace the discipline; it surfaces it.
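The "control over the control" can be as simple as refusing to close a flagged deviation without an investigation note. A hypothetical guard, not any vendor's actual API:

```python
# Illustrative guard over a monitoring platform's deviation queue:
# an "approve" decision without an investigation note is rejected,
# because a blind click-through is rubber-stamping, not review.
# The deviation dict and its fields are hypothetical.

def close_deviation(deviation: dict, decision: str, note: str) -> dict:
    if decision == "approved" and not note.strip():
        raise ValueError(
            "cannot approve a deviation without an investigation note")
    deviation.update(status=decision, investigation_note=note.strip())
    return deviation
```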

IV.
Layer 04 — Cross-framework

SOC 2 as the bridge between frameworks

SOC 2 evidence reused widely;
scope nuances persist
SOC 2 domain | SOX | ISO 27001:2022 | NIST CSF 2.0 | PCI DSS v4.0.1 | HIPAA | HITRUST v11 | Shared evidence
CC1 — Control Environment | ELC · COSO Comp. 1 | Cl. 5 · A.5.1–A.5.4 | GV.RR · GV.PO | Req 12.4 | §164.308(a)(2) | 00.a · 02.a | Org chart, code of conduct, board minutes, tone-at-top attestations
CC2 — Communication & Info | ELC · COSO Comp. 4 | A.5.14 · A.6.3 | GV.OC · PR.AT | Req 12.6 | §164.308(a)(5) | 02.e · 02.f | Training records, intranet posts, customer notifications
CC3 — Risk Assessment | ELC · risk assessment memo | Cl. 6.1 · A.5.7 | ID.RA · GV.RM | Req 12.3 | §164.308(a)(1)(ii)(A) | 03.a · 03.b | Risk register, threat model, RCSA, fraud-risk assessment
CC4 — Monitoring | ELC · COSO Comp. 5 | Cl. 9.1 · A.8.16 | DE.CM · ID.IM | Req 10 · Req 12.10 | §164.308(a)(1)(ii)(D) | 09.aa · 06.h | Internal audit reports, mgmt reviews, KRI dashboards
CC5 — Control Activities | BPC · ITGCs at large | A.5.36 · scattered | — | Req 1–Req 12 mapped | §164.306 | 10.b · 09.h | Policies, procedures, control-activity matrix
CC6.1–6.3 — Logical access | ITGC — Access | A.5.15 · A.5.16 · A.5.18 · A.8.2 | PR.AA-01 · PR.AA-05 | Req 7 · Req 8 | §164.308(a)(3) · §164.308(a)(4) | 01.b · 01.c · 01.v | JML tickets, UAR exports, IAM config, MFA enforcement
CC6.4–6.5 — Physical access | ITGC — Operations | A.7.1–A.7.14 | PR.AA-06 · PR.IR-02 | Req 9 | §164.310 | 08.b · 08.j | Badge logs, CCTV retention, visitor records, asset disposal
CC6.6 — Boundary protection | ITGC — Operations | A.8.20 · A.8.22 | PR.IR-01 | Req 1 | §164.312(e)(1) | 09.m | Firewall config, network diagram, segmentation tests
CC6.7 — Transmission protection | ITGC — Operations | A.8.24 | PR.DS-02 | Req 4 | §164.312(e)(2)(ii) | 09.s | TLS scan, cert inventory, VPN config
CC6.8 — Malware protection | ITGC — Operations | A.8.7 | PR.PS-05 | Req 5 | §164.308(a)(5)(ii)(B) | 09.j | EDR/AV deployment reports, scan logs, IOC alerts
CC7 — System operations | ITGC — Operations | A.8.15 · A.8.16 · A.5.24–28 | DE.CM · RS.MA | Req 10 · Req 12.10 | §164.308(a)(6) | 09.aa · 11.a–c | SIEM rules, incident tickets, post-incident reviews
CC8.1 — Change management | ITGC — Change | A.8.32 | PR.PS-06 | Req 6.5 | §164.308(a)(8) | 10.h · 09.b | Change tickets, CAB minutes, PR approvals, deploy logs
CC9.1 — BCM | BPC · resilience | A.5.29 · A.5.30 | RC.RP | Req 12.10 | §164.308(a)(7) | 12.b · 12.c | BIA, BCP/DR plans, last-test report, RTO/RPO documentation
CC9.2 — Vendor management | ITGC + BPC · TPRM | A.5.19–A.5.23 | GV.SC | Req 12.8 · Req 12.9 | §164.308(b) · BAAs | 05.k | Vendor inventory, due diligence pkg, contracts, vendor SOC 2s
A1 — Availability (elected) | ITGC — Operations | A.8.6 · A.8.13 · A.8.14 | RC.RP · PR.IR-04 | Req 12.10 | §164.308(a)(7) | 12.b | Capacity reports, uptime SLAs, DR test evidence
C1 — Confidentiality (elected) | ITGC + BPC | A.5.12 · A.5.13 · A.8.10 | PR.DS-01 · PR.DS-02 | Req 3 | §164.514 | 06.c · 10.f | Data classification policy, encryption inventory, disposal cert
PI1 — Processing Integrity (elected) | ITAC · directly | A.8.26 | PR.PS-06 | — | — | 10.b | Input validation, output recon, error log review
P1–P8 — Privacy (elected) | scope-dependent | A.5.34 · ISO 27701 | PR.DS-01 · GV.PO | Req 3 | HIPAA Privacy Rule | 06.c · 06.f | Privacy notice, consent records, DSAR log, retention schedule

SOC 2 is the bridge — and the trap

SOC 2 has become the GRC industry's lingua franca. Buyers demand it; vendors deliver it; everyone has a vague sense of what it covers. The crosswalk above shows why: SOC 2's Common Criteria map cleanly to almost every other framework's control families. Build SOC 2 right, and you're 60–80% of the way to ISO 27001 and HITRUST.

The trap: SOC 2 is the flexibility champion. The same Common Criterion can be satisfied by very different controls in different programs. ISO 27001's Annex A is more prescriptive; the HIPAA Security Rule is more prescriptive; PCI DSS is highly prescriptive. Reusing SOC 2 evidence for an ISO 27001 audit works only when your SOC 2 controls happen to address ISO's specific Annex A requirements; the overlap is partial, never complete.

SOC 2 evidence is reusable. SOC 2 conclusions are not.

The shared-evidence column is the unit of work. JML tickets, UAR exports, change tickets, vendor SOC 2s, IAM configs — these artifacts satisfy the corresponding controls across all seven frameworks simultaneously. The opportunity for GRC engineering is to build the evidence pipeline once and run framework-specific test conclusions against the same data.
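A minimal sketch of that pipeline idea: key a crosswalk by artifact type and let each framework's test read from the same collection run. The mappings shown are a small subset drawn from the table above; the artifact keys are hypothetical:

```python
# One artifact, many frameworks: a crosswalk keyed by evidence type.
# Mappings below are a subset of the table above, for illustration.

CROSSWALK = {
    "uar_export": {
        "SOC 2": "CC6.1-CC6.3", "SOX": "ITGC - Access",
        "ISO 27001:2022": "A.5.18", "PCI DSS v4": "Req 7",
    },
    "change_ticket": {
        "SOC 2": "CC8.1", "SOX": "ITGC - Change",
        "ISO 27001:2022": "A.8.32", "PCI DSS v4": "Req 6.5",
    },
    "vendor_soc2": {
        "SOC 2": "CC9.2", "ISO 27001:2022": "A.5.19-A.5.23",
        "PCI DSS v4": "Req 12.8",
    },
}

def controls_satisfied(artifact: str) -> dict:
    """Framework -> control reference for one collected artifact."""
    return CROSSWALK.get(artifact, {})
```

Collect the UAR export once, and four frameworks' testers read from it; the conclusions stay framework-specific, exactly as the prose above insists.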