Block 4 - Unit 3: Analytical evaluation

Analytical evaluation - key point and examples.

Don't involve users - experts role-play as users, and models predict users' performance.

Inspection methods - heuristic evaluation and walkthroughs.

3 models - GOMS, the keystroke-level model and Fitts' law.
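The two quantitative models can be sketched briefly. The KLM operator times below are the commonly cited estimates from Card, Moran and Newell (exact figures vary by source), and the Fitts' law constants `a` and `b` are device-dependent placeholders, not values from the course:

```python
import math

# Keystroke-level model (KLM): predict expert task time as the sum of
# standard operator times. Values are commonly cited estimates
# (Card, Moran & Newell); exact figures vary by source.
KLM_OPERATORS = {
    "K": 0.2,   # press a key or button
    "P": 1.1,   # point with a mouse to a target on screen
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation for an action
}

def klm_estimate(sequence):
    """Predicted time (seconds) for an operator string, e.g. 'MPK'."""
    return sum(KLM_OPERATORS[op] for op in sequence)

def fitts_time(distance, width, a=0.1, b=0.1):
    """Fitts' law (Shannon formulation): time to point at a target of a
    given width at a given distance; a and b are device-dependent
    constants (placeholder values here)."""
    return a + b * math.log2(distance / width + 1)
```

For example, `klm_estimate("MPK")` predicts the time to mentally prepare, point at a button and click it, and `fitts_time` confirms that farther or smaller targets take longer to acquire.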



Inspection methods?

Generic name for a set of techniques involving experts, or a combination of experts and users, examining a product to predict how usable it is.

Checks whether interface complies with a set of standards, guidelines or design principles.


Heuristic evaluation?

An inspection technique in which experts, guided by a set of usability principles (heuristics), evaluate whether user interface elements (menus, dialog boxes, etc.) conform to the principles.

Heuristics closely resemble high-level design principles and guidelines, eg. consistent designs, reduce memory load, etc.


Advantages of heuristic evaluations.

Useful when users are not easily accessible, or when involving them would take too much cost / time.

Can be used at any stage of design project, including early on before well-developed prototypes are available.


Revised (2006) set of heuristics. (10)

Visibility of system status.

Match between system and real world.

User control and freedom.

Consistency and standards.

Error prevention.

Recognition rather than recall.

Flexibility and efficiency of use.

Aesthetic and minimalist design.

Help users recognise, diagnose and recover from errors.

Help and documentation.


Visibility of system status (heuristic).

Keep users informed through appropriate feedback within reasonable time.


Match between system and real world (heuristic).

Speak users' language - words/phrases/concepts familiar to the user (rather than system oriented).

Follow real-world conventions, making info appear in a natural and logical order.


User control and freedom (heuristic).

Users often choose system functions in error - need clearly marked 'emergency exit' to leave unwanted state without extended dialog.

Support undo and redo.


Consistency and standards (heuristic).

Users shouldn't have to wonder whether different words, situations or actions mean the same thing.

Follow platform conventions.


Error prevention (heuristic).

Preventing errors is better than good error messages.

Either eliminate error-prone conditions or check for them and present users with a confirmation option before commit.


Recognition rather than recall (heuristic).

Minimise memory load - make objects, actions and options visible.

Shouldn't need to remember info from one part of dialog to another.

Instructions for system use should be visible or easily retrievable when appropriate.


Flexibility and efficiency of use (heuristic).

Accelerators - unseen by novice user - can speed up interaction for the expert user, hence cater to different experience levels.

Allow users to tailor frequent actions.


Aesthetic and minimalist design (heuristic).

Dialogs shouldn't contain info that's irrelevant or rarely needed.

All extra info competes with relevant info - diminishes its relative visibility.


Help users recognise, diagnose and recover from errors (heuristic).

Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.


Help and documentation (heuristic).

Although it's better if the system can be used without documentation, it may still be necessary to provide it.

Any such info should be easy to search, focused on user's task, list concrete steps to be carried out and not be too large.


3 stages of a heuristic evaluation.

1. Briefing session.

2. Evaluation period.

3. Debriefing session.


Briefing session (heuristic evaluation).

Planning is necessary - script, choose tasks / areas of focus and experts.

Ideally expert evaluators will be usability experts, but could choose domain experts or designers with extensive design experience.

Script guides the evaluation and ensures a consistent briefing.


Approach experts may be asked to take for a heuristic evaluation. (3)

Set of tasks developed in advance for experts to try.

Expert asked to check each task and task sequence against the whole list of heuristics.

Experts asked to focus on the assessment of particular design features, or any identified usability concerns for the product.


Choosing expert evaluators.

Rare to find an expert in ID and in the product domain - usual to find 2+ experts with different backgrounds:

- Usability experts (experienced in conducting evaluations).

- Domain experts (users or their representatives, designers, developers).

- Non-experts (may be experts in own domains).


Evaluation period of heuristic evaluation.

1 - 2 hours for each expert.

1st pass - feel for flow of interaction and product scope.

2nd pass - focus on specific interface elements in the context of the whole product; identify potential usability problems (UPs).


Recording UPs in heuristic evaluation.

Data collection form:

- Location in the task description.

- Heuristic violated.

- Usability defect description.

- Expert evaluator's comments regarding the usability defect.
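The data collection form above might be represented as a simple record; the field names and sample entries are illustrative only, not from the course materials:

```python
from dataclasses import dataclass

# Hypothetical record mirroring the data collection form above;
# field names and sample data are illustrative only.
@dataclass
class UsabilityProblem:
    task_location: str       # where in the task description it occurred
    heuristic_violated: str  # which heuristic was violated
    defect: str              # description of the usability defect
    comments: str = ""       # the expert evaluator's comments
    severity: int = 0        # assigned later, at the debriefing session

problems = [
    UsabilityProblem("Step 3: checkout", "Visibility of system status",
                     "No feedback after pressing 'Pay'", severity=4),
    UsabilityProblem("Step 1: search", "Recognition rather than recall",
                     "User must remember the product code", severity=2),
]

# Debriefing outcome: a prioritised list, most severe problems first.
prioritised = sorted(problems, key=lambda p: p.severity, reverse=True)
```

Keeping severity as a separate field reflects the process: evaluators record problems individually during the evaluation period, and severity ratings are agreed later at the debriefing.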


Debriefing session of heuristic evaluation.

Experts discuss each other's findings and differences of opinion.

Outcome - prioritised list of problems, with severity ratings, and suggested solutions.


Problems while doing heuristic evaluations.

Different approaches often identify different problems, and heuristics (alone) can miss severe problems.

May also uncover 'false' problems, based on the evaluators' own biases and views.
(Having several evaluators can reduce occurrence).


HOMERUN? (purpose and each letter)

Set of heuristics for evaluating websites.

Some elements (eg. N, U) target commercial / corporate sites, but others (eg. O, E) are appropriate for many sites.

High quality content. (Info / functionality users want).

Often updated. (Importance varies, eg. news, selling, archive content).

Minimal download time.

Ease of use.

Relevant to users' needs. (Carry out tasks required).

Unique to online medium. (Benefit conventional media doesn't offer - browsing, 24/7 purchase).

Net-centric corporate culture. (Company needs to put site first in most aspects of operation).



Walkthroughs?

Alternative form of inspection to heuristic evaluation for predicting UPs without doing user testing.

Involve walking through a task with the system and noting UPs.

Most don't involve users, but pluralistic walkthroughs do.


Cognitive walkthroughs?

Simulate users' problem-solving process at each step in the human-computer dialog, checking to see if users' goals and memory for actions can be assumed to lead to the next correct action.

Defining feature - they focus on evaluating designs for ease of learning. This focus is motivated by observations that users learn by exploration.


Steps in cognitive walkthroughs (5)

1. Characteristics of typical users are identified and documented, and sample tasks are developed that focus on the aspects of the design to be evaluated.

Description / prototype of interface produced, along with a clear sequence of the actions needed for users to complete the task.

2. A designer and 1+ expert evaluators come together to do analysis.

3. Evaluators walk through the action sequences for each task, placing each within the context of a typical scenario; as they do so, they try to answer:
- Will the correct action (to achieve task) be evident to user?
- Will the user notice that the correct action is available?
- Will the user associate and interpret the response from the action correctly?

4. During walkthrough, a record of critical info is compiled in which:
- Assumptions about what would cause problems and why are recorded.
- Notes about side issues and design changes are made.
- Summary of results compiled.

5. Design revised to fix problems presented.
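One way to picture the record compiled in steps 3 and 4 is as an answer sheet per action; this is a sketch, and all names are illustrative:

```python
from dataclasses import dataclass

# Hypothetical record for one action in the walkthrough sequence,
# holding the answers to the three questions from step 3.
@dataclass
class WalkthroughAction:
    description: str
    action_evident: bool       # will the correct action be evident to the user?
    action_noticed: bool       # will the user notice the action is available?
    response_understood: bool  # will the user interpret the response correctly?
    notes: str = ""            # assumptions, side issues, design-change ideas

def problem_actions(sequence):
    """Actions with any 'no' answer -- candidates for revision in step 5."""
    return [a for a in sequence
            if not (a.action_evident and a.action_noticed
                    and a.response_understood)]

walkthrough = [
    WalkthroughAction("Open the File menu", True, True, True),
    WalkthroughAction("Choose 'Export...'", True, False, True,
                      notes="Item hidden in a submenu"),
]
```

Filtering for any 'no' answer mirrors step 5: each failing action marks a point in the design that needs revision.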


Cognitive walkthrough vs heuristic evaluation.

The focus of a walkthrough is narrower - identifying specific user problems at a high level of detail.

This narrow focus is useful for certain system types, but not others.
Eg. apps involving complex operations to perform tasks.

Very time-consuming and laborious; needs a good understanding of the cognitive processes involved.


2 problems with original cognitive walkthrough.

Takes too long answering the 3 questions in step 3 and discussing answers.

Designers are defensive - lengthy arguments to justify design; undermines efficacy of technique and social relationships.


Variation of cognitive walkthrough (to cope with 2 problems).

Fewer questions and curtailed discussions.

Analysis more coarse-grained, but much quicker.

Identify leader and usability specialist.

Strong rules - ban on defending design, debating cognitive theory or doing designs on the fly.

(Effect - a more usable technique; it directs the social interactions of the design team so that they achieve their goals).