Block 3 - Unit 3: Understanding users. Flashcards
What is cognition?
‘What goes on in our head when we carry out everyday activities.’
2 general modes of cognition. (sentence and examples)
Experiential cognition - state of mind in which we perceive, act, and react to events around us effectively and effortlessly.
Requires a certain level of expertise and engagement. Eg. driving, reading.
Reflective cognition - thinking, comparing, decision-making. Leads to new ideas and creativity. Eg. designing, learning, writing a book.
Each requires different kinds of tech support.
Kinds of process that cognition can be described by. (6)
Attention.
Perception and recognition.
Memory.
Learning.
Reading, speaking and listening.
Problem-solving, planning, reasoning, decision-making.
(Several may be needed for a given activity).
Attention?
The process of selecting things to concentrate on.
Auditory - eg. waiting for name to be called at the doctors.
Visual - eg. scanning football results for your team’s score.
Attention allows us to focus on info relevant to what we’re doing.
What factors affect how easy the process of attention is? (2)
(i) Our goals.
If we know exactly what we want to find out, we try to match this against the info available.
If goals are not clear, attention can drift to interesting items, eg. news articles.
(ii) Info presentation.
Easier to find info if well structured, eg. into columns of meaningful categories, even if info density is the same.
Design implications of attention. (4)
Make info salient when it needs attending to at a given stage of a task.
Use eg. animation, colour, underlining, ordering, spacing, etc. to achieve this.
Avoid cluttering with too much (distracting / annoying) info.
Use plain, simple search and form fill-in interfaces so users can find info quickly.
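A minimal TypeScript sketch (all names hypothetical, not from the course material) of two of these attention aids: grouping results into meaningful categories so they are easier to scan, and making the item the user is looking for salient.

```ts
interface Result { category: string; label: string; }

// Group a flat result list into labelled categories - structured info is
// easier to scan even when the info density is the same.
function groupByCategory(results: Result[]): Map<string, Result[]> {
  const groups = new Map<string, Result[]>();
  for (const r of results) {
    const bucket = groups.get(r.category) ?? [];
    bucket.push(r);
    groups.set(r.category, bucket);
  }
  return groups;
}

// Make items matching the user's current goal salient, here by marking them
// for bold rendering (colour, ordering, spacing, etc. would work the same way).
function markSalient(results: Result[], query: string): string[] {
  return results.map(r =>
    r.label.toLowerCase().includes(query.toLowerCase())
      ? `**${r.label}**`
      : r.label
  );
}
```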
Perception.
How info is acquired from the environment, via different senses, and transformed into experiences of objects, events, sounds and tastes.
Complex - involves other cognitive processes (memory, attention, language).
Design implications of perception.
Representation of info needs to be designed to be perceptible / recognisable across different media:
- Icons / graphics should enable users to readily distinguish their meaning.
- Bordering / spacing are effective visual ways to group info so it’s easy to perceive and locate items.
- Sounds should be audible and distinguishable so it is clear what they represent (including speech).
- Text should be legible and should contrast with the background.
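One concrete way to check the legibility point is a contrast-ratio test. The card doesn’t name a standard, but the WCAG formula sketched below is a common choice; 4.5:1 is the usual minimum for normal body text.

```ts
// Relative luminance of an sRGB colour, per the WCAG definition.
function relativeLuminance(r: number, g: number, b: number): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between foreground and background, from 1:1 up to 21:1.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number],
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [hi, lo] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05);
}

contrastRatio([0, 0, 0], [255, 255, 255]);       // 21 - black on white, legible
contrastRatio([200, 200, 200], [255, 255, 255]); // ~1.7 - light grey on white, fails
```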
Memory.
Versatile - allows us to recall many different types of knowledge.
Things are filtered to avoid overload, but we can also forget what we might not want to forget.
Filtering process - decide which info is attended to and how it is interpreted.
The more attention paid, the more likely something is to be remembered.
Processing info more deeply - reflecting, exercises, discussions, etc. - helps it to be remembered.
Context of encoding.
May be difficult to eg. recognise someone out of context.
Recognition and recall.
Easier to recognise than recall, eg. picking from a menu vs remembering a command.
Visual cues (eg. icons, thumbnails) are easier to recognise than items like phone numbers are to recall.
Personal info management (PIM) - storing files etc.
Recall-directed - using memorised info about the file.
Recognition-based scanning - looking down a list.
Should be able to narrow the search by whatever you do remember, eg. time-stamps, flagging, type, or a partial name.
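A minimal sketch of that idea (hypothetical types and field names): the user supplies whatever fragments they do remember - a partial name, a type, rough dates, whether it was flagged - and the shortened list is then scanned by recognition.

```ts
interface FileEntry {
  name: string;
  type: string;      // eg. "pdf", "jpg"
  modified: Date;    // time-stamp
  flagged: boolean;
}

// Whatever partial info the user recalls; every field is optional.
interface PartialMemory {
  namePart?: string;
  type?: string;
  after?: Date;
  before?: Date;
  flagged?: boolean;
}

// Keep only files consistent with everything the user remembers.
function narrowSearch(files: FileEntry[], m: PartialMemory): FileEntry[] {
  return files.filter(f =>
    (m.namePart === undefined ||
      f.name.toLowerCase().includes(m.namePart.toLowerCase())) &&
    (m.type === undefined || f.type === m.type) &&
    (m.after === undefined || f.modified >= m.after) &&
    (m.before === undefined || f.modified <= m.before) &&
    (m.flagged === undefined || f.flagged === m.flagged)
  );
}
```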
Problem with 7 +/- 2.
Theory (Miller) - only 7 (+/- 2) chunks of information can be held in short-term memory.
Designers may limit menu items, tabs, bullet points, etc. to 7, but this is unnecessary: such items don’t need to be recalled, only recognised by scanning.
Memory load.
Can be high, eg. security checks for phone banking.
Design implications of memory.
Don’t overload users’ memories with complicated procedures for carrying out tasks.
Design interfaces that promote ‘recognition’ rather than ‘recall’ - menus, icons, consistently placed objects.
Provide a variety of ways of encoding digital info (eg. files, emails, images) - categories, colour, flagging, time-stamping, icons, etc. - to help users remember where they’ve stored it.
Learning.
Can be considered in terms of:
(i) how to use a computer-based app, or
(ii) using a computer-based app to understand a given topic.
People find it hard to learn from instructions in a manual - prefer to ‘learn by doing’.
GUI and direct-manipulation (DM) interfaces support active learning - exploratory interaction, allowing users to ‘undo’ actions.
‘Training-wheels’ approach.
Restrict possible functions for novices so initial learning is easier, then extend with experience.
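A minimal sketch of the idea, assuming a simple three-level user model (the levels and command names are made up for illustration): novices are only offered the basic commands, and more unlock as experience grows.

```ts
type Level = "novice" | "intermediate" | "expert";

// Each command is tagged with the lowest level it should appear at.
const commandLevels: Record<string, Level> = {
  open: "novice",
  save: "novice",
  print: "novice",
  macros: "intermediate",
  scripting: "expert",
};

const rank: Record<Level, number> = { novice: 0, intermediate: 1, expert: 2 };

// Only commands at or below the user's level are shown in menus.
function visibleCommands(userLevel: Level): string[] {
  return Object.entries(commandLevels)
    .filter(([, lvl]) => rank[lvl] <= rank[userLevel])
    .map(([cmd]) => cmd);
}

visibleCommands("novice"); // ["open", "save", "print"]
```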
Dynalinking.
Abstract representations, eg. diagrams, are linked with a more concrete illustration of what they stand for, such as a simulation.
Changes in one are matched in the other - gives a better understanding of the abstraction. Eg. PondWorld - ecological concepts.
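A minimal sketch of dynalinking as a shared model with two subscribed views (names are hypothetical, loosely echoing the PondWorld example): changing the model from either view updates both.

```ts
type Listener = (population: number) => void;

// One model shared by the abstract diagram and the concrete simulation.
class SpeciesModel {
  private listeners: Listener[] = [];
  constructor(private population: number) {}

  subscribe(l: Listener): void {
    this.listeners.push(l);
    l(this.population); // show the current state immediately
  }

  set(population: number): void {
    this.population = population;
    this.listeners.forEach(l => l(population)); // both views stay in step
  }
}

const beetles = new SpeciesModel(40);
// Abstract view: a node in the food-web diagram resizes with population.
beetles.subscribe(p => console.log(`diagram: beetle node scaled to ${p}`));
// Concrete view: the simulation draws that many beetles in the pond.
beetles.subscribe(p => console.log(`simulation: drawing ${p} beetles`));
// Dragging a control in either view calls set(), so the other follows.
beetles.set(25);
```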
Evolutionary vs Revolutionary upgrading.
Where and how to place new functions?
- Keep same structure and add more buttons / menu options - no new conceptual model to learn, but can get overloaded.
- Design a new model of interaction better suited to organising and categorising increased function set.
Is the relearning cost acceptable to users relative to the gains from the new functionality?
Design implications of learning.
Design interfaces that encourage exploration.
Design interfaces that constrain and guide users to select appropriate actions when initially learning.
Dynamically link concrete representations and abstract concepts to facilitate learning of complex material.
Reading, speaking and listening.
Forms of language processing that have similar and different properties.
The meaning of a sentence / phrase is the same whether written or spoken, but the ease with which people can read, listen or speak differs depending on the person, task and context.
Eg. many people find listening easier than reading.
Differences between reading, speaking and listening. (5)
Written language is permanent.
Reading can be quicker - can scan rapidly.
Listening requires less cognitive effort than reading or speaking.
Written language tends to be grammatical, while spoken language is often ungrammatical, eg. stop midsentence to allow others to speak.
Marked differences exist between people’s abilities and preferences across the different modes of language use.
Eg. people with dyslexia find it hard to understand / recognise written words, and to write grammatical sentences and spell correctly.
Ways apps capitalise on people’s language skills, or support / replace them where skill is lacking. (6)
Interactive books / web material to help reading / learning of foreign languages.
Speech-recognition systems.
Speech-output systems.
Natural-language systems - allow users to type questions and give text responses.
Customised I/O devices.
Interactive techniques to allow blind people to read graphs etc. on the web through auditory navigation and tactile diagrams.
Design implications of reading, speaking and listening. (3)
Keep the length of speech-based menus and instructions to a minimum - 3-4 options per spoken menu (see the sketch after this list).
Accentuate the intonation of artificial speech, as it is harder to understand than human voices.
Provide options for larger text, without affecting the formatting, for people who find small text hard to read.
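A minimal sketch of the first implication (function and option names are made up): rather than reading out one long spoken menu, split the options into chunks of at most four, with later chunks reached via a ‘more options’ prompt.

```ts
// Split a long option list into spoken menus of at most `maxPerMenu` items.
function chunkSpokenMenu(options: string[], maxPerMenu = 4): string[][] {
  const menus: string[][] = [];
  for (let i = 0; i < options.length; i += maxPerMenu) {
    menus.push(options.slice(i, i + maxPerMenu));
  }
  return menus;
}

const menus = chunkSpokenMenu([
  "balance", "payments", "transfers", "statements", "cards", "help",
]);
// menus[0] -> ["balance", "payments", "transfers", "statements"]
// menus[1] -> ["cards", "help"]  (reached by saying "more options")
```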
Problem-solving, planning, reasoning and decision making.
All involve reflective cognition.
Include thinking about what to do, the options and consequences.
Often involve conscious processes, discussion with others (or with oneself), and the use of artifacts - maps, books, pen and paper, etc.