What SOWs Actually Need to Say About Acceptance

Acceptance criteria are the most consistently under-specified part of a scope. Here's what to include — and why it matters more than you think.

SOW · acceptance criteria · scope management · delivery risk · client management

It's one of the most common phrases in statements of work: "Deliverables will be accepted by the client upon satisfactory completion."

That sentence does nothing. It looks like a contract term, but it contains no enforceable information. What is satisfactory? Who decides? By when? What happens if they don't respond?

Vague acceptance language isn't a minor oversight. It's the source of some of the most persistent, margin-destroying disputes in client services. And it's nearly always avoidable.

What Acceptance Criteria Actually Need to Specify

Good acceptance criteria answer four questions:

1. What exactly is being accepted?
Not "the website" — the specific deliverables: "Ten published pages matching approved Figma designs, passing WCAG 2.1 AA automated checks, loading in under 3 seconds on a standard broadband connection."

2. How will it be evaluated?
Is it a functional review? A checklist? A user acceptance testing (UAT) period with a defined scope? Specify the process, not just the outcome.

3. Who has authority to accept?
Named roles, not departments. "The client's project lead, as identified at project kick-off" is better than "the client." This matters when stakeholders change mid-engagement.

4. What is the timeframe?
If the client doesn't respond within [X] business days, what happens? "Deemed accepted" clauses are standard in well-drafted SOWs. Without one, you're waiting indefinitely for sign-off that may never come.

The Revision Scope Problem

Revision cycles deserve their own section. "Two rounds of revisions" is almost meaningless without defining:

  • What constitutes a revision versus a new requirement
  • Whether revisions apply per-deliverable or to the engagement overall
  • Whether feedback delivered in multiple batches on the same deliverable counts as one round or several

Without this clarity, a client who provides five separate batches of feedback on the same deliverable can reasonably argue that they've only used one round of revisions. You'll disagree. The project will stall.

A Simple Framework

When drafting or reviewing acceptance terms, run through this checklist:

  • Are the deliverables named specifically (not categorically)?
  • Is the evaluation method defined?
  • Is the accepting party named by role?
  • Is there a response window with a deemed-accepted fallback?
  • Are revision rounds defined with clear rules about what counts?
  • Are change requests — requirements that fall outside the defined deliverable — handled by a separate process?

If any of these are missing, you have a gap. The question is whether you find it before the engagement starts or after the client does.

Reviewing Your Own SOWs

One useful habit: read your acceptance terms with a hostile eye. Ask yourself: if this relationship went wrong, how would the client invoke this language against us? That framing surfaces problems that an optimistic read misses.

Preflight does this systematically. When you upload a SOW, it runs the same analysis, looking for acceptance language that lacks criteria, revision terms that are left undefined, and delivery milestones without a clear sign-off process. You get a structured report before the engagement starts — which is when fixes are still easy.


Have a complex SOW you want to review? Upload it to Preflight →