Send this email 1-2 days before UAT handoff to set expectations.
/send:uat-preframe {CODE} --workflow "{Name}"
Why this matters: Clients often treat UAT as a formality — they watch the demo, say "looks good", then panic when something breaks in production. This email reframes UAT as active stress-testing.
Part 1: Credential Swap (15 min)
Manual verification:
Open workflow in n8n
Click each integration node
Verify credential dropdown shows {CODE}-* not DEV-*
Update Variables node: environment: "production"
Update tags: status-development → status-uat
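The credential check above can also be scripted against an exported workflow JSON — a minimal sketch, assuming the export follows n8n's standard workflow format, where each node may carry a `credentials` map of `{type: {id, name}}`. The `find_dev_credentials` helper and the `DEV-` prefix convention come from this playbook; the sample workflow is illustrative.

```python
DEV_PREFIX = "DEV-"

def find_dev_credentials(workflow: dict) -> list[tuple[str, str]]:
    """Return (node_name, credential_name) pairs still pointing at DEV-* credentials."""
    leftovers = []
    for node in workflow.get("nodes", []):
        for cred in node.get("credentials", {}).values():
            name = cred.get("name", "")
            if name.startswith(DEV_PREFIX):
                leftovers.append((node["name"], name))
    return leftovers

# Example: a workflow exported before the credential swap (illustrative data)
workflow = {
    "nodes": [
        {"name": "HubSpot", "credentials": {"hubspotApi": {"id": "1", "name": "DEV-hubspot"}}},
        {"name": "Gmail", "credentials": {"gmailOAuth2": {"id": "2", "name": "ACME-gmail"}}},
    ]
}
print(find_dev_credentials(workflow))  # [('HubSpot', 'DEV-hubspot')]
```

An empty result means every integration node has been swapped — the same check you'd do manually by clicking through each node.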
Part 2: Smoke Test (15 min)
Run workflow with test data
Verify executes successfully with client credentials
Check output reaches client systems (CRM, email, etc.)
If the test fails: fix the issue and re-run. Do not move on to the demo recording until the smoke test passes with client credentials.
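The smoke-test pass/fail decision can be captured in a small helper — a sketch only; `ExecutionSummary` is an illustrative shape reduced from an execution log, not an n8n API object.

```python
from dataclasses import dataclass, field

@dataclass
class ExecutionSummary:
    status: str                          # "success" or "error"
    node_errors: list[str] = field(default_factory=list)

def smoke_test_verdict(execution: ExecutionSummary) -> str:
    """Translate an execution summary into the playbook's go/fix decision."""
    if execution.status == "success" and not execution.node_errors:
        return "PASS: proceed to demo recording"
    failed = ", ".join(execution.node_errors) or "unknown node"
    return f"FAIL: fix {failed} before recording the demo"

print(smoke_test_verdict(ExecutionSummary("success")))
print(smoke_test_verdict(ExecutionSummary("error", ["HubSpot: 401 Unauthorized"])))
```

A 401 from an integration node at this stage usually means a credential was not swapped in Part 1.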
Part 3: Demo Recording (20 min)
Tool: Loom desktop app (not Chrome extension)
Demo structure (2-5 min):

| Section   | Time    | Content                                            |
| --------- | ------- | -------------------------------------------------- |
| Intro     | 15s     | "Hi {Name}, I'm showing you the {Workflow}..."     |
| Overview  | 30s     | Show workflow canvas, explain flow                 |
| Trigger   | 30s     | What starts the workflow                           |
| Execution | 1-2 min | Run it, show data flowing                          |
| Output    | 30s     | Final result in client system                      |
| Close     | 15s     | "Your job now is to stress-test this..."           |
Video title: {Client} - {Workflow} Demo
Part 4: Handoff Email (15 min)
Email includes:
Loom video link
Testing mission ("try to break it")
What to test checklist
How to report issues
UAT Review Call details
Part 5: Client Testing Period (3-5 days)
Give the client time to actually test. Don't rush this phase.
During this period:
Be available for questions
Send /send:uat-feedback-request if no response after 2 days
Document any issues they report
Part 6: UAT Review Call (45 min)
This is NOT a demo call — it's a review of the client's testing findings.
What to Test (Client Guidance)
Give the client this checklist (included in the handoff email; you can also share it separately):

| Test Type    | What to Try                 | Example                                                  |
| ------------ | --------------------------- | -------------------------------------------------------- |
| Happy path   | Normal use case             | Standard invoice with all fields filled                  |
| Edge cases   | Unusual but valid scenarios | Very long product names, special characters              |
| Volume       | Multiple items at once      | Batch of 10 invoices instead of 1                        |
| Missing data | Empty or null fields        | Invoice without a PO number                              |
| Bad data     | Incorrect formats           | Date in wrong format, invalid email                      |
| Timing       | When things happen          | What if the trigger fires twice in quick succession?     |
| Permissions  | Access issues               | What if someone doesn't have access to the output location? |
Encourage them: The best testers are the people who use the process daily. They know the weird scenarios we can't anticipate.
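The checklist above can be turned into concrete test fixtures before the handoff — a sketch for an invoice-style workflow, matching the examples in the table. Field names like `po_number` and the sample values are illustrative, not part of any real client payload.

```python
def invoice_fixtures() -> dict[str, dict]:
    """One test payload per category of the 'What to Test' checklist."""
    base = {
        "customer": "Acme Corp",
        "product": "Widget",
        "email": "ap@acme.example",
        "date": "2024-06-01",
        "po_number": "PO-1001",
    }
    return {
        "happy_path": dict(base),
        # Edge cases: very long product name plus special characters
        "edge_case_long_name": {**base, "product": "W" * 500 + " & <spécial> chars"},
        # Missing data: empty/null field
        "missing_data": {**base, "po_number": None},
        # Bad data: wrong date format, invalid email
        "bad_data": {**base, "date": "06/01/24", "email": "not-an-email"},
    }

fixtures = invoice_fixtures()
# Volume: a batch of 10 happy-path invoices instead of 1
batch = [dict(fixtures["happy_path"], po_number=f"PO-{1000 + i}") for i in range(10)]
print(len(fixtures), len(batch))  # 4 10
```

Running the batch twice in a row also covers the "Timing" row: if duplicates appear in the output system, the trigger is not idempotent.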
UAT Review Call Agenda (45 min)

| Section             | Time   | Focus                                 |
| ------------------- | ------ | ------------------------------------- |
| Their Findings      | 15 min | What they tested, what broke          |
| Live Troubleshooting | 10 min | Reproduce issues together             |
| Fixes & Changes     | 10 min | What we'll adjust                     |
| Go/No-Go            | 5 min  | Ready for sign-off, or another round? |
| Next Steps          | 5 min  | Timeline to go-live                   |
Key shift: The client should be presenting THEIR findings, not passively watching us demo.
If Client Reports No Issues
Why this matters: "No issues" sometimes means "I watched the demo and didn't actually test." Probe gently to ensure real testing happened.
Client says "it all looks good"?
├── Did they actually test? → Ask: "What scenarios did you try?"
│   ├── Vague answer → Encourage specific testing: "Can you try [edge case]?"
│   └── Specific answer → Great! Move to sign-off
└── Genuinely no issues → Celebrate, but still run through edge cases on UAT call
Email Templates
uat-preframe
When to send: 1-2 days before UAT handoff
Command: /send:uat-preframe {CODE} --workflow "{Name}"
uat-handoff
When to send: After the demo video is recorded, before the UAT Review Call
Command: /send:uat-handoff {CODE} --workflow "{Name}" --loom-url "{URL}"
uat-feedback-request
When to send: 2 days after the handoff email if no feedback has been received
Command: /send:uat-feedback-request {CODE} --workflow "{Name}"
Subject: Next Up: Your Turn to Test {Workflow Name}
Hi {Name},
Quick heads up — we're finishing internal testing on {Workflow Name} and you'll have it in your hands within the next few days.
WHAT'S COMING
You'll get a demo video and access to test the workflow yourself.
YOUR JOB
Try to break it. Seriously.
UAT (User Acceptance Testing) is where YOU stress-test the workflow with real scenarios. The goal isn't to watch a demo and say "looks good" — it's to find the edge cases and issues BEFORE we go live.
THINGS MIGHT BREAK — AND THAT'S OKAY
If something fails during testing, that's a success. We'd much rather find issues now than after go-live. That's the whole point of this phase.
I'll send the full handoff email shortly with the demo video and testing instructions.
Get ready to put it through its paces!
Cheers,
{Sender Name}
The Entourage AI
Subject: {Workflow Name} Ready for Testing — Go Break It
Hi {Name},
Your {Workflow Name} workflow is ready for you to test.
DEMO VIDEO
{Loom URL}
(2-5 min walkthrough of how it works)
---
YOUR MISSION: TRY TO BREAK IT
This is User Acceptance Testing — your job is to stress-test this workflow before we go live. Don't just watch the demo and say "looks good." Actually use it.
WHAT TO TEST
□ Normal scenarios — Does it work as expected?
□ Edge cases — What happens with unusual inputs?
□ Volume — Does it handle multiple items at once?
□ Missing data — What if a field is empty?
□ Wrong data — What if something is formatted incorrectly?
HOW TO REPORT ISSUES
When you find something (and you probably will — that's the point):
1. Screenshot or describe what you did
2. What you expected to happen
3. What actually happened
4. Send to: {Email or Slack channel}
REMEMBER
Finding issues now = success. That's what this phase is for. We'd much rather fix things now than after go-live.
---
UAT REVIEW CALL
Date: {Date}
Time: {Time} {Timezone}
Link: {Meeting URL}
We'll walk through your findings, answer questions, and discuss any changes.
Go put it through its paces!
Cheers,
{Sender Name}
The Entourage AI
Subject: Following Up - {Workflow Name} Testing
Hi {Name},
Just checking in on your {Workflow Name} testing.
Have you had a chance to put the workflow through its paces?
QUICK REMINDER
The goal is to stress-test it before go-live:
• Try normal scenarios
• Test edge cases (unusual inputs, missing data)
• See what breaks
FOUND ISSUES?
Great! That's the point. Send them my way.
NO ISSUES?
Also great — but make sure you've actually tested, not just watched the demo.
Our UAT Review Call is coming up on {Date}. Come prepared with:
• Scenarios you tested
• Any issues you found
• Questions about edge cases
Reply with your feedback or let me know if you need more time.
Cheers,
{Sender Name}
The Entourage AI
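The {Placeholder} tokens used throughout these templates ({Name}, {Workflow Name}, {Loom URL}, and so on) can be filled with a small helper — a sketch; `fill` is an illustrative function, not part of any existing tooling, and it deliberately leaves unknown tokens intact so a missing value is visible in the draft rather than silently blank.

```python
def fill(template: str, values: dict[str, str]) -> str:
    """Replace each {Placeholder} token with its value; unknown tokens are left as-is."""
    for key, val in values.items():
        template = template.replace("{" + key + "}", val)
    return template

subject = fill(
    "Subject: {Workflow Name} Ready for Testing - Go Break It",
    {"Workflow Name": "Invoice Intake"},
)
print(subject)  # Subject: Invoice Intake Ready for Testing - Go Break It
```

Plain `str.replace` is used instead of `str.format` because several placeholders contain spaces (e.g. {Workflow Name}), which are not valid `format` field names.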