Design Evaluation Flashcards

1
Q

Why evaluate design?

A
  • Improve usability
  • Reduce flaws discovered after launch
2
Q

What part of the design is being evaluated?

A
  • All levels, parts, and attributes
3
Q

Where is the design being evaluated?

A
  • Laboratory (for controlled scenarios and internal validity)
  • Natural Setting (for realistic scenarios and external validity)
4
Q

When to evaluate design?

A
  • Anytime in the design lifecycle
  • Formative evaluation (during design, typically qualitative)
  • Summative evaluation (after the design is complete, typically quantitative)
5
Q

What are evaluation issues?

A
  • Surroundings (the environment must be considered in the evaluation)
  • Bias (Undiversified sample group)
  • Hawthorne Effect (Reactivity due to awareness of being observed)
6
Q

What is Analytical Evaluation?

A
  • Evaluation that doesn’t involve users
  • Predict user behaviour and identify usability problems
  • Discover design issues before usability testing
7
Q

What is Heuristic Evaluation?

A
  • Fast, low-cost ("discount") usability engineering
  • Works on simple prototypes
  • 3–5 evaluators (usability experts, not end users)
8
Q

What steps are in Heuristic Evaluation?

A
  1. Evaluators evaluate the interface multiple times — compare components with usability principles
  2. Evaluators aggregate their findings
  3. Reveals usability problems
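
Step 2 above (aggregating findings) can be sketched in code. The evaluator findings below are hypothetical examples, not from the source; the idea is that problems reported independently by several evaluators surface at the top:

```python
from collections import Counter

def aggregate_findings(findings_per_evaluator):
    """Merge usability problems found by independent evaluators.

    findings_per_evaluator: list of sets, one set of problem
    descriptions per evaluator.
    Returns problems sorted by how many evaluators reported them.
    """
    counts = Counter()
    for findings in findings_per_evaluator:
        counts.update(findings)  # each evaluator counts a problem once
    return counts.most_common()

# Hypothetical findings from three evaluators
evaluators = [
    {"low-contrast labels", "no undo"},
    {"no undo", "ambiguous icons"},
    {"no undo", "low-contrast labels"},
]
for problem, n in aggregate_findings(evaluators):
    print(f"{problem}: reported by {n} of {len(evaluators)} evaluators")
```

Sorting by report frequency is one simple way to prioritize; in practice each problem would also be rated for severity.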
9
Q

What is Usability Testing?

A
  • Identify problems in a product’s design and learn about user behaviour and preferences
  • Measure performance and satisfaction with a system
  • Laboratory setting
10
Q

What components are in Usability Testing?

A
  • Participants
  • Tasks
  • Facilitator
11
Q

What steps are in Usability Testing?

A
  1. Prepare tasks, find participants, and set up test materials
  2. Invite participants, and observe and ask questions
    1. Present the scenario both verbally and in writing
    2. Give tasks one at a time
    3. Give short breaks if tasks are long
    4. Take notes and collect data (audio and video)
    5. Debrief participants after tasks
  3. Analyze data, find problems, summarize results, and make recommendations
12
Q

How do you effectively collect data?

A
  • Think-Aloud Protocol
    • Concurrent (during testing): participants verbalize their thoughts while performing tasks
    • Retrospective (after testing): participants recall their thoughts after completing a task
  • Audio and video recording
  • Questionnaires
    • Start with general demographic information
    • Use open-ended questions
13
Q

What type of questions are there?

A
  • Likert Scale
  • Semantic Scale
  • Ranking Questions
  • Open-ended Questions
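
Closed-ended responses are usually mapped to numbers before analysis. A minimal Python sketch of Likert scoring, assuming a 5-point scale and reverse-scoring for negatively worded items (both illustrative choices, not from the source):

```python
def score_likert(response, points=5, negatively_worded=False):
    """Map a 1..points Likert response so that higher = more positive.

    Negatively worded items ("The system was confusing") are
    reverse-scored so all items point in the same direction.
    """
    if not 1 <= response <= points:
        raise ValueError("response out of range")
    return points + 1 - response if negatively_worded else response

# "The system was easy to use": strong agreement (5) stays 5
print(score_likert(5))                          # 5
# "The system was confusing": strong agreement (5) becomes 1
print(score_likert(5, negatively_worded=True))  # 1
```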
14
Q

What is the UEQ?

A
  • User Experience Questionnaire: 26 pairs of contrasting attributes to be rated
  • Semantic-scale (semantic differential) questions
  • Comes with data-analysis tools
15
Q

What is the UEQ-S?

A
  • Shorter version of UEQ
  • 8 pairs instead of 26
  • Aggregates into 3 scales:
    • Pragmatic quality (practical, goal-directed)
    • Hedonic quality (appeal, non-goal-directed)
    • Overall
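
The aggregation can be sketched as below, assuming the standard UEQ-S coding (items answered on a 7-point semantic scale coded -3 to +3, items 1–4 loading on pragmatic quality, items 5–8 on hedonic quality); consult the official UEQ handbook for the exact item order:

```python
def ueq_s_scores(answers):
    """Aggregate 8 UEQ-S item scores (coded -3..+3) into scale means.

    Assumes the standard UEQ-S layout: items 1-4 measure pragmatic
    quality, items 5-8 hedonic quality; overall is the mean of all 8.
    """
    if len(answers) != 8:
        raise ValueError("UEQ-S has exactly 8 items")
    return {
        "pragmatic": sum(answers[:4]) / 4,
        "hedonic": sum(answers[4:]) / 4,
        "overall": sum(answers) / 8,
    }

# One hypothetical participant's coded answers
print(ueq_s_scores([2, 3, 1, 2, -1, 0, 1, 2]))
```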
16
Q

What is Qualitative Data/Analysis?

A

Observation-based
- Facilitator takes notes and makes recordings
- Participants write or voice comments

17
Q

What is Quantitative Data/Analysis?

A

Assessment-based
- Measures from participant tasks and questionnaires
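
Two common task measures are completion rate and time on task; a minimal Python sketch over hypothetical session data (the numbers are illustrative, not from the source):

```python
from statistics import mean

def task_metrics(sessions):
    """Summarize one task across participants.

    sessions: list of (completed: bool, seconds: float) tuples,
    one per participant. Mean time is computed over successful
    attempts only.
    """
    completion_rate = sum(1 for done, _ in sessions if done) / len(sessions)
    times_on_success = [t for done, t in sessions if done]
    return {
        "completion_rate": completion_rate,
        "mean_time_s": mean(times_on_success) if times_on_success else None,
    }

# Hypothetical results for one task, five participants
print(task_metrics([(True, 42.0), (True, 55.0), (False, 120.0),
                    (True, 38.0), (True, 61.0)]))
```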

18
Q

What should you look out for during Qualitative Analysis?

A
  • Frequently occurring problems
  • Workarounds
  • Enjoyment
19
Q

What do you do with results from Qualitative Analysis?

A
  • Prioritize problems based on severity and use usability goals to redesign
  • Highlight the good results
  • Provide recommendations, not rules (consider constraints such as branding and standards)