Block 4 - Unit 3: Analytical evaluation Flashcards
(44 cards)
Analytical evaluation - key point and examples.
Don’t involve users - experts role-play as users, and models predict users’ performance.
Inspection methods - heuristic evaluation and walkthroughs.
3 models - GOMS, the keystroke-level model and Fitts' law.
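Note - as a worked illustration of one of these predictive models, here is a minimal Python sketch of Fitts' law in its common Shannon formulation, MT = a + b·log2(D/W + 1). The constants a and b are device-dependent and would normally be fitted to measured data; the defaults below are placeholder values, not real measurements.

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predict pointing time (seconds) with Fitts' law
    (Shannon formulation): MT = a + b * log2(D/W + 1).

    a and b are device-dependent constants; the defaults here
    are placeholder values for illustration only.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A distant, small target is predicted to take longer to hit
# than a near, large one.
print(fitts_movement_time(distance=800, width=20))  # small, far target
print(fitts_movement_time(distance=100, width=80))  # large, near target
```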
Inspection?
Generic name for a set of techniques involving experts, or a combination of experts and users, examining a product to predict how usable it is.
Checks whether interface complies with a set of standards, guidelines or design principles.
Heuristic evaluation?
An inspection technique in which experts, guided by a set of usability principles (heuristics), evaluate whether user interface elements (menus, dialog boxes, etc.) conform to the principles.
Heuristics closely resemble high-level design principles and guidelines, eg. consistent designs, reduce memory load, etc.
Advantages of heuristic evaluations.
Sometimes users are not easily accessible, or involving them would incur too much cost/time.
Can be used at any stage of design project, including early on before well-developed prototypes are available.
Revised (2006) set of heuristics. (10)
Visibility of system status.
Match between system and real world.
User control and freedom.
Consistency and standards.
Error prevention.
Recognition rather than recall.
Flexibility and efficiency of use.
Aesthetic and minimalist design.
Help users recognise, diagnose and recover from errors.
Help and documentation.
Visibility of system status (heuristic).
Keep users informed through appropriate feedback within reasonable time.
Match between system and real world (heuristic).
Speak users’ language - words/phrases/concepts familiar to the user (rather than system oriented).
Follow real-world conventions, making info appear in a natural and logical order.
User control and freedom (heuristic).
Users often choose system functions in error - need clearly marked ‘emergency exit’ to leave unwanted state without extended dialog.
Support undo and redo.
Consistency and standards (heuristic).
Users shouldn’t have to wonder whether different words, situations or actions mean the same thing.
Follow platform conventions.
Error prevention (heuristic).
Even better than good error messages is careful design that prevents problems occurring in the first place.
Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
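Note - a minimal sketch of the "confirm before commit" pattern using Python's standard tkinter messagebox; the destructive delete action here is hypothetical, used only for illustration.

```python
import tkinter as tk
from tkinter import messagebox

def delete_all_records():
    # Hypothetical destructive action, for illustration only.
    print("Records deleted.")

root = tk.Tk()
root.withdraw()  # no main window needed for this sketch

# Check for the error-prone condition and ask before committing.
if messagebox.askokcancel("Confirm delete",
                          "Delete all records? This cannot be undone."):
    delete_all_records()
```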
Recognition rather than recall (heuristic).
Minimise memory load - make objects, actions and options visible.
Shouldn’t need to remember info from one part of dialog to another.
Instructions for system use should be visible or easily retrievable when appropriate.
Flexibility and efficiency of use (heuristic).
Accelerators - unseen by the novice user - can speed up interaction for the expert user, hence catering for different experience levels.
Allow users to tailor frequent actions.
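Note - a minimal tkinter sketch of the accelerator idea: the menu entry is visible to novices, while the bound keyboard shortcut speeds up the same action for experts. The save handler is hypothetical.

```python
import tkinter as tk

def save(event=None):
    print("Saved.")  # hypothetical action, for illustration only

root = tk.Tk()

# Visible menu entry for novices; the accelerator label is a hint only.
menubar = tk.Menu(root)
file_menu = tk.Menu(menubar, tearoff=0)
file_menu.add_command(label="Save", command=save, accelerator="Ctrl+S")
menubar.add_cascade(label="File", menu=file_menu)
root.config(menu=menubar)

# The actual accelerator binding, usable without opening the menu.
root.bind("<Control-s>", save)

root.mainloop()
```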
Aesthetic and minimalist design (heuristic).
Dialogs shouldn’t contain info that’s irrelevant or rarely needed.
All extra info competes with relevant info, diminishing its relative visibility.
Help users recognise, diagnose and recover from errors (heuristic).
Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
Help and documentation (heuristic).
Although it is better if the system can be used without documentation, it may still be necessary to provide it.
Any such info should be easy to search, focused on user’s task, list concrete steps to be carried out and not be too large.
3 stages of a heuristic evaluation.
- Briefing session.
- Evaluation period.
- Debriefing session.
Briefing session (heuristic evaluation).
Planning is necessary - script, choose tasks / areas of focus and experts.
Ideally expert evaluators will be usability experts, but could choose domain experts or designers with extensive design experience.
Script guides the evaluation and ensures a consistent briefing.
Approach experts may be asked to take for a heuristic evaluation. (3)
Set of tasks developed in advance for experts to try.
Expert asked to check each task and task sequence against the whole list of heuristics.
Experts asked to focus on the assessment of particular design features, or any identified usability concerns for the product.
Choosing expert evaluators.
Rare to find an expert in both interaction design and the product domain - usual to find 2+ experts with different backgrounds:
- Usability experts (experienced in conducting evaluations).
- Domain experts (users or their representatives, designers, developers).
- Non-experts (may be experts in own domains).
Evaluation period of heuristic evaluation.
1–2 hours for each expert.
1st pass - feel for flow of interaction and product scope.
2nd pass - focus on specific interface elements in the context of the whole product; identify potential usability problems (UPs).
Recording UPs in heuristic evaluation.
Data collection form:
- Location in the task description.
- Heuristic violated.
- Usability defect description.
- Expert evaluator’s comments regarding the usability defect.
Debriefing session of heuristic evaluation.
Experts discuss each other's findings and differences of opinion.
Outcome - prioritised list of problems, with severity ratings, and suggested solutions.
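Note - a minimal Python sketch of how the data collection form and the debriefing outcome might be represented. The field names are illustrative, not a standard schema, and the severity field assumes Nielsen's common 0 (not a problem) to 4 (catastrophe) rating scale.

```python
from dataclasses import dataclass

@dataclass
class UsabilityProblem:
    """One row of the data collection form; field names are illustrative."""
    task_location: str       # location in the task description
    heuristic_violated: str
    description: str         # usability defect description
    evaluator_comments: str
    severity: int            # Nielsen's 0 (not a problem) to 4 (catastrophe)

problems = [
    UsabilityProblem("Step 3: checkout", "Error prevention",
                     "No confirmation before order is placed",
                     "Easy to buy by accident", severity=3),
    UsabilityProblem("Step 1: login", "Consistency and standards",
                     "'Sign in' and 'Log on' used interchangeably",
                     "Minor wording issue", severity=1),
]

# Debriefing outcome: a prioritised list, most severe problems first.
for p in sorted(problems, key=lambda p: p.severity, reverse=True):
    print(f"[{p.severity}] {p.heuristic_violated}: {p.description}")
```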
Problems while doing heuristic evaluations.
Different approaches often identify different problems, and heuristics (alone) can miss severe problems.
May also uncover 'false' problems, based on the evaluators' own biases and views.
(Having several evaluators can reduce occurrence).
HOMERUN? (purpose and each letter)
Set of heuristics for evaluating websites.
Some elements (eg. N, U) target commercial / corporate sites, but others (eg. O, E) are appropriate for many sites.
High quality content. (Info / functionality users want).
Often updated. (Importance varies, eg. news, selling, archive content).
Minimal download time.
Ease of use.
Relevant to users’ needs. (Carry out tasks required).
Unique to online medium. (Benefit conventional media doesn’t offer - browsing, 24/7 purchase).
Net-centric corporate culture. (Company needs to put site first in most aspects of operation).