Usability Test Script Creation
Generate comprehensive usability test scripts for specific features, including test scenarios, tasks, success criteria, and debrief questions.
v3
Last updated: November 6, 2025
Testing
Product Manager
usability
testing
# Usability Test Script Creation

Act as a Product Manager creating a usability test script for evaluating user experience.

## Context

- **Feature**: [Feature name]
- **Target Users**: [User personas/segments]
- **Test Duration**: [30/60/90 minutes]
- **Test Format**: [Moderated/Unmoderated/Remote/In-person]

## Test Objectives

1. [Primary objective - e.g., "Validate new checkout flow reduces cart abandonment"]
2. [Secondary objective - e.g., "Understand user perception of new navigation"]
3. [Tertiary objective - e.g., "Identify friction points"]

## Pre-Test Setup

### Participant Screening

- [ ] Demographics match target users
- [ ] Previous experience with [product type]
- [ ] Technical comfort level: [Beginner/Intermediate/Advanced]
- [ ] Accessibility needs: [None/Screen reader/Keyboard only/etc.]

### Prerequisites

- [ ] Test environment ready
- [ ] Prototype/mockup available
- [ ] Recording equipment set up
- [ ] Consent forms prepared
- [ ] Incentive/location confirmed

## Test Script Structure

### 1. Introduction (5 minutes)

- Welcome and thank the participant
- Explain the test purpose and format
- Set expectations: "We're testing the product, not you"
- Ask for permission to record
- Get consent

### 2. Background Questions (5 minutes)

- How familiar are you with [product category]?
- What tools do you currently use for [task]?
- What are your biggest pain points with [current solution]?

### 3. Task Scenarios (20-40 minutes)

#### Task 1: [Primary Task]

**Scenario**: [Context and background]

**Goal**: [What the user wants to accomplish]

**Observations to note**:
- [ ] How long does it take?
- [ ] Where do they struggle?
- [ ] Do they understand the interface?
- [ ] What questions do they ask?
- [ ] Do they complete the task?

**Success Criteria**: [Defined outcome]

---

#### Task 2: [Secondary Task]

[Repeat structure]

---

#### Task 3: [Edge Case or Comparison]

[Repeat structure]

### 4. Post-Task Questions (10 minutes)

- What did you think of that experience?
- What was most confusing?
- What worked well?
- Would you use this feature? Why/why not?
- How does this compare to [competitor/current solution]?

### 5. Wrap-up (5 minutes)

- System Usability Scale (SUS) questionnaire
- Overall impressions
- Suggestions for improvement
- Thank the participant
- Provide incentive/information

## Debrief Questions for Observers

- What patterns did we see?
- What surprised us?
- What are the critical issues?
- What worked better than expected?
- What should we prioritize fixing?

## Analysis Framework

After the tests, analyze:

1. **Task Completion Rate**: X/Y participants completed the task
2. **Time on Task**: Average X minutes (target: Y minutes)
3. **Error Rate**: X errors per participant
4. **Critical Issues**: [List top 3-5]
5. **Positive Findings**: [What worked well]
6. **Recommendations**: [Prioritized action items]

## Notes

- Ask users to think aloud
- Don't lead or provide hints
- Observe body language and emotions
- Note where users hesitate or ask questions
- Capture exact quotes when possible
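The SUS questionnaire collected during wrap-up is scored with a standard formula: odd-numbered (positively worded) items contribute `response − 1`, even-numbered (negatively worded) items contribute `5 − response`, and the sum is multiplied by 2.5 to give a 0-100 score. A minimal Python sketch (the function name and example responses are illustrative, not part of the template):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The total is multiplied by 2.5, yielding a score from 0 to 100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    odd = sum(r - 1 for r in responses[0::2])   # items 1, 3, 5, 7, 9
    even = sum(5 - r for r in responses[1::2])  # items 2, 4, 6, 8, 10
    return (odd + even) * 2.5

# Example: one participant's responses to items 1-10
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # → 85.0
```

Averaging `sus_score` across participants gives a single benchmark number to track between test rounds, alongside the task-level metrics in the analysis framework.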