1) What this tool is for
This is a reusable pattern for turning Customer Voice surveys into actionable Customer Service work. It automatically:
- sends a survey when a Case is created
- parses the survey response JSON
- scores the results
- writes a Positive/Negative outcome directly back onto the Case record
2) When to use it (and when not to)
Use it when:
- You want a consistent way to collect feedback without manual sending
- Your support team needs results visible where they work (on the Case), not buried in a separate dashboard
Don’t use it when:
- You don’t have a reliable “survey should be sent now” moment (example: Cases are created in bulk via system alerts)
- You need heavy conditional logic that changes constantly to determine who gets a survey (that’s how flows turn into archaeology)
3) Pre-flight checklist
Before you build anything, confirm:
- Customer Voice survey is created (questions finalized enough to test)
- Customer Voice email template/invitation is ready to send
- Your Case table in Dataverse has a custom field to store the outcome (example: Survey Outcome = Positive/Negative)
- You know exactly which Case types should be excluded from receiving surveys
4) Step-by-step
Step 1: Set up Customer Voice
- Build your survey and configure the email template
- Test small first: send yourself one manual invitation to confirm:
- delivery
- branding
- link functionality
Step 2: Flow #1 — Send the survey
Create a flow that triggers when a Dataverse Case row is added.
A) Check exclusions (do this cleanly)
Consultant law #12: hardcoded GUIDs are just bugs waiting for a Friday afternoon.
- Store excluded Case categories/types in a small config table
- Look them up in the flow
- If the Case is excluded → Terminate
Do not be a hero and build a 100-line filter condition in the trigger.
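The exclusion check above can be sketched as a small lookup, here in Python for illustration. The category names and the shape of the config data are assumptions; in the flow itself this is a Dataverse list-rows step against your config table.

```python
def should_send_survey(case_category: str, excluded_categories: set) -> bool:
    """Return True if the Case should receive a survey.

    excluded_categories mirrors the config table: one row per excluded
    Case category, looked up at runtime instead of hardcoding values
    in the trigger's filter condition.
    """
    return case_category not in excluded_categories

# Illustrative config-table contents (hypothetical category names)
excluded = {"System Alert", "Bulk Import"}
should_send_survey("Billing Question", excluded)  # True  -> send invite
should_send_survey("System Alert", excluded)      # False -> Terminate
```

The point of the config table is that adding a new excluded category becomes a data change, not a flow edit and redeploy.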
B) Send the invite
- Use the Customer Voice action to send the invitation to the Case Contact
- Use the Case ID in the Regarding field
C) Test small first
- Create a Case in a sandbox with a non-excluded category
- Confirm:
- the flow fires
- the email looks right
- the survey link works
Step 3: Flow #2 — Score and update
This is where the heavy lifting happens. Customer Voice stores answers inside a JSON string, so we have to unpack them before we can score anything.
A) Trigger
Create a flow that triggers when a Customer Voice survey response is:
- Added or Modified
B) Parse JSON
- Add a Parse JSON action
- Feed it the Context Data (or the survey response details field) from your trigger
- Generate the schema:
- run the flow once
- copy the raw output of that field
- click Generate from sample
This tells Power Automate to expect an array containing questionId and response pairs.
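For reference, the raw field is roughly an array of questionId/response pairs. The sketch below shows that shape with placeholder IDs; your actual payload may carry additional properties, so always generate the schema from your own sample.

```python
import json

# Illustrative shape only: the GUIDs here are placeholders, not real question IDs
raw = (
    '[{"questionId": "q1-guid-placeholder", "response": "2"},'
    ' {"questionId": "q2-guid-placeholder", "response": "5"}]'
)

# Parse JSON turns the string into a list of dicts the flow can work with
answers = json.loads(raw)
print(answers[0]["questionId"], answers[0]["response"])
```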
C) Filter Arrays (one per scored question)
For each question you want to score, add a Filter array step (name them clearly, like Filter Array Q1, Filter Array Q2).
- From: the Body output from Parse JSON
- Condition:
questionId equals the GUID for that question
Note: for clean ALM, consider using a Text Environment Variable for each question ID instead of hardcoding GUIDs.
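A Filter array step is just a list filter over the parsed body. This Python sketch (placeholder GUIDs, assumed payload shape) shows what "Filter Array Q1" produces:

```python
Q1_ID = "q1-guid-placeholder"  # in the real flow: the question GUID, or an environment variable

body = [
    {"questionId": "q1-guid-placeholder", "response": "2"},
    {"questionId": "q2-guid-placeholder", "response": "5"},
]

# "Filter Array Q1": keep only items whose questionId matches
filter_array_q1 = [item for item in body if item["questionId"] == Q1_ID]
print(filter_array_q1)  # still a list, even with exactly one match
```

That "still a list" detail is why the next step matters.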
D) Extract the value (the secret sauce)
A Filter array always returns a list, even if there’s only one result.
If you drop that list into a Condition, Power Automate will push you into an Apply to each loop.
To avoid that, use a Compose action that:
- grabs the first match
- handles blanks safely
- converts the response to an integer
Create a Compose step with this expression:
@{int(coalesce(first(body('Filter_array_Q1'))?['response'], '999'))}
Why this works
- first() pulls the single item from the filtered list
- ?['response'] grabs the answer
- coalesce(..., '999') prevents crashes on blank responses
- int() converts the value so numeric comparisons work reliably
Repeat this Compose pattern for each scored question.
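The Compose expression behaves like this helper, sketched in Python with the same 999 sentinel for blanks:

```python
def extract_score(filtered: list) -> int:
    """Mimic int(coalesce(first(body('Filter_array_Q1'))?['response'], '999')).

    - first()            -> filtered[0] when the list is non-empty
    - ?['response']      -> safe access that yields None instead of crashing
    - coalesce(..., 999) -> fall back to the sentinel on blanks
    - int()              -> numeric comparisons work reliably
    """
    first = filtered[0] if filtered else None
    response = first.get("response") if first else None
    return int(response if response is not None else "999")

extract_score([{"response": "2"}])  # 2
extract_score([])                   # 999 (no match or blank response)
```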
E) Add the scoring condition
Add a Condition block with an OR rule and your thresholds.
Example:
- If Q1 < 3 OR Q2 < 3 OR Q3 < 7 → Negative
Note: because blanks become 999, they will fail the “less than” checks safely and won’t create false negatives.
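The OR rule and the sentinel behavior can be checked together with a small function (thresholds copied from the example above; the three-question layout is an assumption):

```python
def outcome(q1: int, q2: int, q3: int) -> str:
    """Return 'Negative' if any score falls below its threshold."""
    return "Negative" if (q1 < 3 or q2 < 3 or q3 < 7) else "Positive"

outcome(2, 5, 8)        # "Negative" -> Q1 below threshold
outcome(5, 5, 8)        # "Positive"
outcome(999, 999, 999)  # "Positive" -> blanks (999) never trip a "less than" rule
```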
F) Update the Case
- If Yes: Update the Case (or Ticket) row
- use the Regarding value to identify the Case
- set Survey Outcome = Negative
- If No: Terminate
G) Test small first
- Submit one “good” survey and one “bad” survey
- Confirm only the bad one hits the “If Yes” path and updates the Case outcome
5) Common gotchas
- Flow #2 doesn’t trigger at all: Customer Voice may create the response row first, then update it with a submit date later. If your trigger is only “Added,” you might miss it. Use Added or Modified.
- Question IDs changed after deployment: if you copy/duplicate the survey into a new environment, the questionId GUIDs can change. Your Filter arrays will silently return nothing. Always verify IDs after deployments.
6) Validation checklist
Future You will thank you if you run these checks before handing this over:
- Create a Case that should send a survey → verify the invite arrives
- Create an excluded Case → verify the flow terminates early and sends nothing
- Submit a low score → verify:
- Parse JSON works
- scoring Condition evaluates to True
- Case outcome updates to Negative
- Submit a high score → verify:
- Condition evaluates to False
- flow terminates peacefully
7) Real scenario
A tier-1 support team wants customer feedback, but they refuse to leave Dynamics to check Customer Voice dashboards.
By implementing these two flows, support leads can pin a view like:
- “Cases with Negative Survey Outcome”
…to their main dashboard and follow up immediately, without chasing survey exports or forwarded email chains.