Block 4 Flashcards

(58 cards)

1
Q

Why evaluate

A

Business reputation; financial success; being relevant in a complex, real world; keeping up with technology

2
Q

Business reputation

A

Negative feedback can damage a company

3
Q

Financial success

A

Appropriate evaluation, carried out at the right points in the interaction design process, is a prudent investment

4
Q

Being relevant in complex, real world

A

By testing with real people in real contexts, you can pick up on cultural aspects that you might not have anticipated

5
Q

Keeping up with tech

A

Evaluating with users allows designers to experiment with prototypes of novel models of interaction, and gain fresh insights into emerging interaction paradigms

6
Q

Iterative design

A

At least 15 users are needed to discover all the usability problems in a design

It is better to distribute evaluations over smaller groups

If you have funding for 15 users, spend it on three studies with five users each (see the sketch below)
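
These figures reflect the widely cited problem-discovery model of Nielsen and Landauer. Below is a minimal sketch of that model, assuming their often-quoted average per-user discovery rate of about 31%; the rate and printed numbers are illustrative, not from this course.

```
# Proportion of usability problems found by n users, assuming each user
# independently uncovers about 31% of the problems (Nielsen & Landauer's
# often-quoted average discovery rate).
def proportion_found(n, rate=0.31):
    return 1 - (1 - rate) ** n

print(f"5 users:  {proportion_found(5):.1%}")   # ~84.4%
print(f"15 users: {proportion_found(15):.1%}")  # ~99.6%
```

Three iterated studies of five users each can therefore beat one study of fifteen: each round already finds most of the problems, and fixes can be tested in the next round.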

7
Q

Evaluation methods

A

User testing - Investigation of whether there is a need for the design

Usability testing - Evaluation of whether the system is usable by the intended users

8
Q

Choosing and combining methods

A

Many methods available in evaluation toolkit

Typically finding a fault requires a combination of methods

Not all evaluations need to be detailed

Opportunistic evaluation is done informally and gets quick feedback. This is done early in the design process

9
Q

Evaluating for accessibility

A

There are some instances in which experts may be used to provide information that cannot be ascertained from users

E.g. when evaluating for users with visual or hearing impairments

10
Q

New methods

A

New methods and variations on existing methods constantly being added to evaluation repertoire

11
Q

Ethical issues and informed consent

A

Becoming increasingly sensitive area, due to potential to gather and disseminate large amounts of personal data quickly over internet

12
Q

Adapting interviews and instructions

A

Need to consider whether any instructions given to users need to be adapted

13
Q

Adapting time allowances

A

Users with disabilities often require more time to complete an evaluation than users without disabilities

14
Q

Adapting physical arrangements

A

Need to consider whether the setting needs to be adapted

15
Q

When a helper, interpreter or advocate may be needed

A

Times when helper or user advocate needed to work alongside the participant

16
Q

Evaluating with children

A

Increasingly, children included in design and evaluation of interactive products, and each evaluation must be adapted to them

17
Q

Informed consent

A

About protecting rights of all parties involved in an evaluation by providing them with all the info they need to decide whether or not to participate

You need the participant's agreement to take part in the evaluation, their permission to record it, and their permission to use and store the data

18
Q

UK Data Protection Act

A

Controls how personal information is used by organisations, businesses or the government

Everyone responsible for using data has to follow strict rules called 'data protection principles'

19
Q

Any data given must -

A

Be used fairly and lawfully
Be used for limited, specifically stated purposes
Be used in a way that is adequate, relevant and not excessive
Be accurate
Be kept for no longer than absolutely necessary
Be handled according to people's data protection rights
Be kept safe and secure
Not be transferred outside the European Economic Area without adequate protection

20
Q

From data to info

A

Any evaluation will produce data of some kind

Whatever the nature of the data, the purpose of analysing and interpreting it is to transform it into information that is relevant and useful to the design process

21
Q

Making sense of data

A

Analysis, interpretation, presentation

22
Q

Analysis

A

Follow three steps:

  1. Collating the data - gathering all the data collected and organising it for processing
  2. Analysing and summarising the data - extracting patterns or other observations from the collated data; these patterns are the first step in making sense of the data (see the sketch below)
  3. Reviewing the data - assessing whether the usability and user experience goals for the interactive product have been met
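
As an illustration of steps 1 and 2, here is a minimal sketch using pandas, with hypothetical task-completion times invented for the example:

```
import pandas as pd

# Step 1: collate - gather the raw observations into one organised structure.
# (Hypothetical data: participant IDs, tasks, and completion times in seconds.)
data = pd.DataFrame({
    "participant": ["P1", "P1", "P2", "P2", "P3", "P3"],
    "task":        ["T1", "T2", "T1", "T2", "T1", "T2"],
    "seconds":     [34, 51, 29, 72, 41, 60],
})

# Step 2: analyse and summarise - extract a first pattern from the collated
# data, e.g. the mean completion time per task.
print(data.groupby("task")["seconds"].mean())
```
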
23
Q

Interpretation

A

Data interpretation in evaluation is the process of actively considering what caused the problems that have been identified, and what to do about them

24
Q

Three steps to interpretation

A
  1. Finding causes for the usability problems identified during analysis, and rating the seriousness of each
  2. Prioritising the issues
  3. Proposing changes to the design, or making other recommendations, to address the problems

25
Q

Presentation

A

The best way of presenting data needs to be decided. Though simple presentations are often sufficient in an iterative design process, when evaluations inform the design as it progresses, stronger evidence is sometimes needed

26
Q

Working with quantitative data

A

Three groups of methods for summarising data:

  1. Tabulations, charts and rankings
  2. Descriptive statistics
  3. Inferential statistics

27
Q

Tabulations, charts and rankings

A

Provide a visual representation of your data
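
A minimal sketch of one such visual display, using matplotlib with hypothetical completion rates; any charting tool would do.

```
import matplotlib.pyplot as plt

# Hypothetical completion rates for three evaluated tasks.
tasks = ["Task 1", "Task 2", "Task 3"]
completion = [0.90, 0.65, 0.80]

plt.bar(tasks, completion)
plt.ylabel("Completion rate")
plt.title("Task completion rates")
plt.show()
```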

28
Q

Descriptive statistics

A

Statistics such as the mean, median and mode that describe the data you have obtained
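
A minimal sketch using Python's standard statistics module, with hypothetical task-completion times:

```
import statistics

# Hypothetical task-completion times (in seconds) from one study.
times = [34, 51, 29, 72, 41, 60, 41]

print("mean:  ", statistics.mean(times))    # arithmetic average
print("median:", statistics.median(times))  # middle value
print("mode:  ", statistics.mode(times))    # most common value (41)
```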

29
Q

Inferential statistics

A

Tests of statistical significance that give the probability that a claim arising from your data can be applied to your user population as a whole
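
A minimal sketch of one such test, an independent-samples t-test from SciPy, assuming hypothetical task times for two design variants; which test is appropriate depends on your data and study design.

```
from scipy import stats

# Hypothetical task-completion times (seconds) for two design variants.
design_a = [34, 51, 29, 72, 41, 60]
design_b = [25, 38, 31, 44, 36, 40]

t, p = stats.ttest_ind(design_a, design_b)
print(f"t = {t:.2f}, p = {p:.3f}")
# A small p-value (conventionally < 0.05) suggests the observed difference
# is unlikely to be due to chance alone in the population sampled.
```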

30
Q

Analysing and interpreting quantitative data

A

If you include descriptive statistics in your report, consider whether they are generalisable

31
Q

Representing quantitative data visually

A

It is important to think about, and choose, the most appropriate data display for your results and for the audience to whom you are presenting your data

32
Q

Working with qualitative data

A

Three simple forms of qualitative analysis (see the sketch below):

  1. Identifying recurring patterns or themes
  2. Categorising data
  3. Analysing critical incidents
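
A minimal sketch of the first two forms, assuming the interview extracts have already been hand-coded with hypothetical category labels:

```
from collections import Counter

# Hypothetical category codes assigned to interview extracts.
codes = ["navigation", "terminology", "navigation", "feedback",
         "navigation", "terminology"]

# Identify recurring themes by counting how often each category occurs.
for theme, count in Counter(codes).most_common():
    print(f"{theme}: {count}")
```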

33
Q

Repeatability

A

A test of rigorous qualitative analysis: can the patterns or categories that emerge be articulated clearly enough for an independent evaluator to apply them with the same result?
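
One common way to check repeatability is inter-rater agreement between two independent coders, for example Cohen's kappa. A minimal hand-rolled sketch with hypothetical codings (kappa close to 1 indicates strong agreement):

```
def cohens_kappa(rater_a, rater_b):
    # Observed agreement: proportion of items both raters coded identically.
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if both raters coded at random, keeping their
    # own label frequencies.
    labels = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(l) / n) * (rater_b.count(l) / n)
                   for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical codings of the same six extracts by two evaluators.
a = ["nav", "term", "nav", "feedback", "nav", "term"]
b = ["nav", "term", "nav", "nav",      "nav", "term"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.70
```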

34
Q

Interpreting outcomes of data analysis

A

The outcome of data analysis should be a list of the usability problems found during the evaluation

35
Q

Usability defect

A

A usability defect:

- Irritates/confuses the user
- Makes the system hard to learn/install/use
- Causes mental overload of the user
- Causes poor user performance
- Violates design standards or guidelines
- Reduces trust in the system
- Tends to cause repeat errors
- Could make the product hard to market

36
Q

Identifying causes of usability defects

A

You need to look in greater depth at the various sources of evaluation data you have collected

37
Q

Prioritising issues

A

Prioritise the list of defects by assigning a severity rating to each one (see the sketch below)
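
A minimal sketch, assuming a hypothetical 1-4 severity scale (4 = most severe):

```
# Hypothetical defect list with severity ratings (4 = most severe).
defects = [
    ("Label text too small",         2),
    ("Checkout button unresponsive", 4),
    ("Inconsistent icon styles",     1),
    ("Search returns stale results", 3),
]

# Prioritise: most severe defects first.
for name, severity in sorted(defects, key=lambda d: d[1], reverse=True):
    print(f"severity {severity}: {name}")
```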

38
Q

Proposing changes, making recommendations

A

Recommendations are likely to contain several kinds of points:

- Successes to build upon
- Defects to fix
- Possible defects or successes that are not yet proven

39
Q

Presenting findings

A

The key is to provide useful, well-founded information concisely and accurately, in a form appropriate to the audience

40
Q

Planning an evaluation involving users

A

- Determine goals and questions
- Choose approach and methods
- Plan data collection
- Address practical issues
- Consider any ethical issues
- Plan how to analyse, interpret and present data
- Assemble materials needed to support the evaluation
- Run pilot studies

41
Q

Determine goals and questions

A

Before an evaluation, be clear about why you are doing it. What questions need to be answered? What sort of information do you need to answer those questions?

42
Q

Choose approach and methods

A

Choose appropriate evaluation approaches and methods to answer the specific questions identified

43
Q

Plan data collection

A

What data do you need to answer the evaluation questions? Is it feasible to collect that data? Usability goals are typically addressed by examining user behaviour, which can be captured qualitatively or quantitatively

44
Q

Address practical issues

A

Users (who), tasks (what), setting (where), equipment (how) and any constraints

45
Q

Users (who)

A

What is the profile of the intended users? How would you characterise them?

46
Q

Tasks (what)

A

What tasks, and how many, do you want to evaluate? Why have you chosen these tasks?

47
Q

Different tasks you could include

A

- Core tasks frequently performed by users
- Tasks that have new design features or functionality added
- Critical tasks, even though they may not be frequently used
- Tasks the design team feels must be validated with users for greater clarity and understanding

48
Q

Setting (where)

A

Where will the evaluation take place? How will you set up and lay out the session?

49
Q

Equipment (how)

A

Do you need any equipment?

50
Q

Constraints

A

Are there any practical constraints?

51
Q

Consider any ethical issues

A

What are the ethical considerations of the evaluation? (Tasks, users, location, data collection)

52
Q

Plan how to analyse, interpret and present data

A

To collect data effectively, you must understand what you intend to do with it and how you will analyse it. Always plan how you will analyse and interpret the data before you conduct an evaluation

53
Q

Assemble materials needed to support evaluation

A

- An evaluation script (it helps to have a detailed script of everything you will say to participants)
- Introductory and background information for participants
- Informed consent forms
- A task description
- Data collection forms
- A post-session interview plan or questionnaire
- An analysis plan

54
Q

Pilot studies

A

It is often appropriate to run pilot studies to ensure that the choices made are practicable. A pilot study is a small-scale trial run of the evaluation

55
Q

Evaluator/observer bias

A

Participants' behaviour can be altered just by an observer watching them

56
Q

Methodology (bias)

A

The way an evaluation is designed and conducted can introduce bias. You should counterbalance the order in which users perform the tasks (see the sketch below)
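
A minimal sketch of full counterbalancing, assuming three hypothetical tasks and assigning participants to task orders round-robin:

```
from itertools import permutations

tasks = ["A", "B", "C"]
orders = list(permutations(tasks))  # all 6 possible task orders

# Give each order to the same number of participants so that learning and
# fatigue effects cancel out on average across the study.
for participant in range(12):
    order = orders[participant % len(orders)]
    print(f"P{participant + 1}: {' -> '.join(order)}")
```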

57
Q

Reporting and analysis

A

The analysis undertaken and the results reported will be influenced to some extent by the evaluator's choices

58
Q

Cognitive walkthrough

A

- Will users know what to do?
- Will users see how to do it?
- Will users understand from the feedback whether the action was correct or not?