
The SOC 2 audit process, phase by phase

The SOC 2 audit process in real phases with honest timelines: scoping, readiness, observation window, fieldwork, draft, management review, issued report.


Founders usually ask what a SOC 2 audit actually looks like and get a slideshow. What they want is a calendar: when does the auditor show up, what do they ask for, what does the company have to produce, and when does the PDF land.

This post is that calendar. It walks the engagement the way we actually run it, phase by phase, with realistic weeks on each.

The short answer

A SOC 2 audit is six phases: scoping and engagement letter, an optional readiness pass, the observation window (Type II only), fieldwork, exception handling with management responses, and the draft-to-issuance cycle. End to end, a first Type II from a standing start runs nine to fourteen months; a Type I runs three to five months. The sections below are what happens inside each phase.

For the bigger picture of what SOC 2 is and who issues it, start with what is SOC 2 compliance. The governing framework is the AICPA's SOC 2 examination guidance, which defines the Trust Services Criteria and the description criteria auditors test against.

Phase 1, Scoping and the engagement letter

Nothing starts until scope is written down. In the first two or three working sessions we settle four questions with the client.

Which Trust Services Criteria apply. Security is mandatory. You add Availability, Confidentiality, Processing Integrity, or Privacy based on the commitments in your customer contracts, not based on what sounds thorough. Criteria you add, you pay to test every year.

Which systems are in scope. The production environment that serves the product, the supporting cloud accounts, the identity provider, the source code and CI systems, the ticketing system that carries change records. Not the marketing site, not the HR tool, not the sales CRM unless it actually stores customer data. Scope drift is the single most expensive mistake in first-year SOC 2 work.

The report period. For a Type I, a single as-of date. For a Type II, a start date and end date, typically six to twelve months apart. The end date is the one your sales team cares about because the report lands six to ten weeks later.

Subservice organizations, carve-in versus carve-out. Your cloud provider, your payroll platform, your email vendor. Most SOC 2 reports use the carve-out method, which excludes subservice organization controls from your scope and relies on their own SOC 2 reports; the controls you expect those vendors to operate are listed in your report as complementary subservice organization controls (CSOCs). Carve-in includes their controls in yours and is rare outside specific industry situations. For almost every SaaS company, carve-out is the correct answer and carve-in is an expensive mistake.

Once those four questions are answered, the engagement letter gets signed and the clock starts. A companion piece, SOC 2 compliance requirements, lists the policies and controls this phase usually flushes out.

Phase 2, Readiness assessment

A readiness pass is optional, common, and almost always worth it. It scopes controls against the chosen TSCs, produces a gap list, sets remediation owners and dates, and names the day the observation window can realistically open.

AICPA independence rules bar a CPA firm from performing readiness consulting and then issuing the SOC 2 opinion on the same engagement. We do one or the other. If we are signing your opinion, a separate team (ours or someone else's) runs readiness. If we are running your readiness, a different CPA firm issues the report. That separation is what keeps the opinion worth reading.

Our SOC 2 readiness assessment post covers what the gap list, evidence library, and remediation windows actually look like. Four to eight weeks is typical for a growth-stage SaaS.

Phase 3, The observation window

This phase only exists for Type II. A Type I is a point-in-time design opinion and skips the window entirely; see SOC 2 Type I vs Type II for when each is the right call.

For a Type II, the observation window is the period the auditor will eventually test across. Six months is the shortest window most buyers will accept for a first report. Twelve months is the target for steady state. The testing itself happens after the window closes, but everything the auditor tests happens during it.

During the window, the client is running the controls and collecting evidence. Access reviews fire quarterly. Vulnerability scans fire on cadence. Change tickets get approvals recorded. Onboarding and offboarding checklists produce artifacts. If a control fires on the calendar and nobody captures the evidence, the control did not fire as far as the auditor is concerned.
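That last sentence is worth operationalizing. A minimal sketch of an evidence-completeness check, assuming hypothetical control cadences, window dates, and a grace period for capturing artifacts (none of these numbers come from the audit itself):

```python
# Sketch: flag scheduled control runs with no captured evidence.
# Window dates, the 91-day cadence, and the 14-day grace period are
# illustrative assumptions, not audit requirements.
from datetime import date, timedelta

def expected_runs(start, end, cadence_days):
    """Dates a control should have fired between start and end, inclusive of start."""
    runs, d = [], start
    while d <= end:
        runs.append(d)
        d += timedelta(days=cadence_days)
    return runs

def missing_evidence(expected, artifacts, grace_days=14):
    """Expected runs with no artifact dated within grace_days after the run."""
    return [run for run in expected
            if not any(0 <= (a - run).days <= grace_days for a in artifacts)]

window_start, window_end = date(2024, 1, 1), date(2024, 12, 31)
quarterly = expected_runs(window_start, window_end, 91)   # ~quarterly access reviews
captured = [date(2024, 1, 9), date(2024, 4, 5), date(2024, 10, 7)]  # hypothetical artifacts
print(missing_evidence(quarterly, captured))
```

Run something like this monthly and a missed access review surfaces while there is still time to fix the process, instead of surfacing during fieldwork as an exception.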

The auditor's job during the window is mostly to stay out of the way and to answer questions as they come up. We usually run a short mid-window check-in around month four or five to flag anything that looks like a sampling problem before the window closes, but we are not testing yet.

Phase 4, Fieldwork

Fieldwork is what most founders picture when they hear "audit." It starts after the observation window closes (Type II) or after the as-of date (Type I) and runs three to six weeks for most growth-stage SaaS engagements.

Fieldwork is four activities, usually running in parallel.

Interviews. We sit with the people who own controls: the head of engineering for change management, the IT lead for access, the person who runs the incident response rotation, finance for any controls that touch billing. Thirty to sixty minutes each. These are not ambushes; we send the questions ahead.

Walkthroughs. For each control, we watch it happen end to end. Walk me through an access request for a new engineer. Show me the last quarterly access review. Open a change ticket from last month and show me the approvals. Walkthroughs confirm the control is designed to meet the criterion.

Sampling. For controls that ran many times during the window, we pick a sample. Populations come from the client, signed off as complete; samples come from the auditor. A control that fires quarterly might get three of four quarters sampled. A control that fires on every deploy might get twenty-five of several thousand. Sample sizes follow AICPA attestation guidance under SSAE No. 18, not a platform's defaults.
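The mechanics of that split are simple to illustrate. A sketch, with hypothetical sample sizes and ticket IDs; actual sizes come from AICPA attestation guidance, not this table:

```python
# Sketch: auditor-selected sample from a client-attested population.
# The frequency-to-size table and ticket IDs are illustrative assumptions.
import random

SAMPLE_SIZES = {
    "quarterly": 3,    # e.g. 3 of 4 quarterly access reviews
    "per_deploy": 25,  # e.g. 25 of several thousand deploy changes
}

def pick_sample(population, frequency, seed=2024):
    """Draw the test sample. The client attests the population is complete;
    the auditor, not the client, chooses which items get tested."""
    size = min(SAMPLE_SIZES[frequency], len(population))
    return sorted(random.Random(seed).sample(population, size))

deploys = [f"CHG-{n:04d}" for n in range(1, 3001)]  # hypothetical change tickets
sample = pick_sample(deploys, "per_deploy")
print(len(sample))  # 25 tickets the client must pull evidence for
```

The fixed seed stands in for the auditor's documented selection method: the point is that the selection is reproducible and made outside the client's control.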

Evidence pulls. For each sampled item, the client produces the artifact: the access review with approvals, the change ticket with the review, the vulnerability scan with its remediation evidence. This is where the evidence library built in readiness earns back its cost. Clients with a tidy library finish evidence pulls in days. Clients without one finish in weeks.

Phase 5, Exceptions and management responses

Every real Type II produces exceptions. A Type II with zero exceptions either had its scope drawn too narrowly or was not tested seriously. Buyers know this; see what a SOC 2 actually tells your buyer for what procurement is actually scanning for when they flip to Section 4 of the report.

When we find an exception during fieldwork, the workflow is consistent. We document what we tested, what we expected, what we found, and the deviation. The client writes a management response that explains what happened, what the root cause was, and what changed or is changing. Good management responses are short, specific, and forward-looking. Bad ones are defensive or vague, and procurement reads them that way.

Exceptions do not by themselves change the opinion. An unqualified opinion with a handful of exceptions and clean management responses is the report well-run companies ship. A qualified or adverse opinion shows up when exceptions are pervasive enough to matter to the overall design-and-operation conclusion, and that is a conversation we have with the client long before the draft.

Phase 6, Draft, management review, issuance

After fieldwork closes, the engagement enters the reporting cycle. This is six to ten weeks for most engagements and is where first-time clients get surprised by how long "just issue the report" actually takes.

Draft report. We compile the opinion, the management assertion, the system description (Section 3), and the full controls table (Section 4). First draft to the client usually lands two to three weeks after fieldwork closes.

Management review. The client reads the draft, particularly the system description and the exceptions. Corrections to factual descriptions get made here. The opinion does not get negotiated, but the description of the system does, and it is worth reading carefully because buyers will.

Quality review. A second CPA partner inside our firm reviews the engagement file and the draft report independently of the engagement team. This is required by AICPA quality management standards, the same regime enforced through the AICPA Peer Review Program. It usually adds one to two weeks and occasionally surfaces changes.

Issuance. The report is signed, dated, and delivered. The report date is the date the last piece of evidence supporting the opinion was obtained, which is almost always later than the report period end date. The gap between the period end and the report date is why SOC 2 bridge letters exist.

What the client actually does during each phase

Founders ask us what is on their plate. By phase:

  • Scoping: answer the four scope questions, sign the engagement letter, name a point person internally. Ten to fifteen hours over two weeks.
  • Readiness: close the gap list. This is the heaviest phase for the client. Two to five hours per week per control owner for four to eight weeks.
  • Observation window: run the controls and capture evidence on cadence. Thirty minutes a month per control owner if the evidence library is working. Several hours a week if it is not.
  • Fieldwork: sit for interviews, run walkthroughs, pull evidence for samples. Five to ten hours per week for the SOC 2 owner, one to three hours for each other control owner, for three to six weeks.
  • Exceptions: write management responses as findings come in. An hour per exception.
  • Draft and review: read the draft carefully, return comments on the system description, sign the management assertion. Five to ten hours across two weeks.

The company that treats SOC 2 as bookkeeping once a week ships a clean report. The company that treats it as a fire drill every quarter ships exceptions.

Typical calendar

For a growth-stage SaaS running a first Type II:

  • Weeks 1–2: scoping and engagement letter.
  • Weeks 3–10: readiness pass, four to eight weeks.
  • Weeks 11–12: prep for window open, remediation items close.
  • Months 4–15: twelve-month observation window (months 4–9 for a six-month window).
  • Weeks after window closes, 1–5: fieldwork.
  • Weeks 6–8: draft report and management review.
  • Weeks 9–10: quality review and issuance.

For a Type I from a standing start: weeks 1–2 scoping, weeks 3–10 readiness, weeks 11–14 fieldwork, weeks 15–18 draft through issuance. Three to five months total.

For an annual Type II in steady state: roughly twelve months end to end, timed so the report date lands two to three months after the window closes, with a bridge letter covering the gap until the next window.
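When the target is a report date the sales team has promised, the calendar is easiest to build backward. A back-of-envelope sketch, assuming the steady-state durations above (the per-phase weeks are this post's estimates, not fixed rules):

```python
# Sketch: back-plan phase start dates from a target report date.
# Phase durations are the rough weeks quoted in this post, not fixed rules.
from datetime import date, timedelta

PHASES = [  # walked backward from issuance
    ("quality review and issuance", 2),
    ("draft and management review", 3),
    ("fieldwork", 5),
    ("observation window", 52),  # twelve-month window
]

def back_plan(report_date):
    """Map each phase to its start date, stepping backward from the report date."""
    plan, cursor = [], report_date
    for phase, weeks in PHASES:
        cursor -= timedelta(weeks=weeks)
        plan.append((phase, cursor))
    return plan

for phase, start in back_plan(date(2026, 3, 31)):  # hypothetical target
    print(f"{phase} starts {start}")
```

The useful output is the last line: the window has to open well over a year before the report date, which is why "we need SOC 2 by Q2" conversations in January usually end with a bridge letter.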

Our process is built around these timelines so the report is ready when the sales cycle needs it, not three months after.


If you are scoping a first SOC 2 and want a real calendar instead of a range, get in touch and we will map the phases against your target report date.
