Kennipart 2 Flashcards

(27 cards)

1
Q

Judgment and decision definition

A

Judgement: calculating the likelihood of events from incomplete information, i.e. estimating probabilities when the exact likelihoods are unknown. Different from a decision, which is actively choosing between a number of possible actions; judgement supplies the probability estimate for each option.

2
Q

Disease diagnosis terms

A

Hit: a positive result when you actually have the disease (some tests are more accurate than others: low or high sensitivity). Miss: a negative result when you have the disease. Correct rejection: a negative result when you don't have the disease (a test is specific if correct rejections outnumber false alarms). False alarm: a positive result when you don't have the disease. There is a trade-off between sensitivity and specificity.
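These four outcomes can be tallied in code; a minimal Python sketch (the counts below are invented for illustration):

```python
# Hypothetical outcome counts for 1000 tested people (illustrative only)
hits = 90                 # positive result, has the disease
misses = 10               # negative result, has the disease
correct_rejections = 855  # negative result, no disease
false_alarms = 45         # positive result, no disease

# Sensitivity: how often the test catches the disease, P(+ | disease)
sensitivity = hits / (hits + misses)
# Specificity: how often it clears the healthy, P(- | no disease)
specificity = correct_rejections / (correct_rejections + false_alarms)

print(sensitivity, specificity)  # 0.9 0.95
```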

3
Q

Base rate prevalence

A

E.g. if 50% of the population has the disease and the test is 95% accurate, 5% of results are false alarms and 5% are misses, so a positive result is strong evidence. But if only 1% have the disease and the test is 95% accurate, then out of 100 people roughly one person gets a hit and most get correct rejections, but several get false alarms. A positive result therefore doesn't mean disease when prevalence is low.
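A minimal Python sketch of this frequency argument (assuming the test's hit rate and correct-rejection rate are both 95%):

```python
def positives(population, prevalence, accuracy=0.95):
    """Return (hits, false_alarms) among positive test results."""
    sick = population * prevalence
    healthy = population - sick
    hits = sick * accuracy                   # true positives
    false_alarms = healthy * (1 - accuracy)  # false positives
    return hits, false_alarms

# 50% prevalence: positives are overwhelmingly real
h, fa = positives(100, 0.50)
print(round(h, 2), round(fa, 2))  # 47.5 2.5
# 1% prevalence: false alarms outnumber hits roughly 5 to 1
h, fa = positives(100, 0.01)
print(round(h, 2), round(fa, 2))  # 0.95 4.95
```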

4
Q

Bayes theorem - definitions

A

What a rational person should do with information: the probability of an event is based on current information and prior beliefs, so you update beliefs as new evidence arrives, and context/base rate should affect the answer. Prior, P(H): the probability that the hypothesis is correct (e.g. having the disease, the base rate); your belief before the evidence. Likelihood, P(D|H): the probability of the data given the hypothesis, e.g. of a positive/negative test result given that you have the disease. Posterior, P(H|D): the probability of the hypothesis given the data, i.e. your belief that you have the disease given the test result.
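These three quantities combine via Bayes' theorem, P(H|D) = P(D|H) x P(H) / P(D); a minimal Python sketch:

```python
def posterior(prior, p_d_given_h, p_d_given_not_h):
    """P(H | D): belief in the hypothesis after seeing the data."""
    p_data = p_d_given_h * prior + p_d_given_not_h * (1 - prior)  # P(D)
    return p_d_given_h * prior / p_data

# 1% base rate, 95% sensitivity, 5% false-alarm rate
print(round(posterior(0.01, 0.95, 0.05), 3))  # 0.161
```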

5
Q

Equations - look them up

A

Odds: the probability of having the disease vs not having it. E.g. if 1/100 have the disease (base rate), 1 person has it and 99 don't, so the prior odds are 1/99. Likelihood ratio: if the test is 95% sensitive, a positive result given disease has probability 0.95, and with a 5% false-alarm rate the ratio is 0.95 / 0.05 = 19. To get the posterior odds, multiply prior odds by likelihood ratio: 1/99 x 19 gives odds of about 0.19, i.e. roughly a 16% chance of disease. If the result is negative, correct rejection = 0.95 and miss = 0.05, so the likelihood ratio in favour of no disease is again 19; the posterior odds of no disease are 19 x 99/1 = 1881. A negative result can therefore be trusted far more.
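The same worked example checked in Python (posterior odds = prior odds x likelihood ratio):

```python
prior_odds = 1 / 99            # 1 in 100 has the disease
lr_positive = 0.95 / 0.05      # hit rate / false-alarm rate, i.e. 19

# Positive result: posterior odds of disease
posterior_odds = prior_odds * lr_positive
prob_disease = posterior_odds / (1 + posterior_odds)  # odds -> probability
print(round(prob_disease, 2))  # 0.16

# Negative result: odds of NO disease (correct rejection / miss = 19 again)
posterior_odds_healthy = (99 / 1) * (0.95 / 0.05)
print(round(posterior_odds_healthy))  # 1881
```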

6
Q

Base rate neglect

A

Kahneman and Tversky 1973: participants were told that, out of 100 people, 70 were lawyers and 30 engineers (or the other way around). Then told about a man who is married, conservative, apolitical and likes numbers, and asked whether he was an engineer or a lawyer, about 90% said engineer regardless of the split. But surely if told 0 were engineers no one would say engineer, so there is a limit. Casscells 1978: asked medical students and staff from Harvard: if a test for a disease with 1/1000 prevalence has a false-alarm rate of 5%, what is the chance that a positive result actually means disease? The answer is about 2%, but 45% said 95%, ignoring the prevalence.
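The ~2% answer to the Casscells problem can be verified with frequencies (assuming, as the problem implies, that the test never misses a true case):

```python
population = 100_000
sick = population / 1000                   # 1/1000 prevalence: 100 true positives
false_alarms = (population - sick) * 0.05  # 5% of the 99,900 healthy: ~4995

p_disease_given_positive = sick / (sick + false_alarms)
print(round(p_disease_given_positive, 3))  # 0.02
```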

7
Q

Conjunction fallacy

A

Tversky and Kahneman 1983: participants were told about Linda, who is bright, outspoken and goes to protests, and asked whether she was (1) a bank teller or (2) a bank teller active in the feminist movement. Many chose 2 even though they should choose 1, since there is a higher chance of just being a bank teller. The conjunction fallacy is judging the probability of two events together as higher than that of one alone. People value specificity (e.g. liking a statement more because it is specific) and may ignore redundant information (e.g. "bank teller" appears in both options).

8
Q

Optimal judgment

A

Human judgment is not optimal, due to base rate neglect and the conjunction fallacy, but including information about causes improves performance (Krynski and Tenenbaum, e.g. explaining that a false alarm can arise because results get mixed up). Presenting frequencies (e.g. "1 out of 20") improves performance. Personal relevance also helps (e.g. "do you have disease X?"). But these still don't make judgment 100% accurate.

9
Q

Heuristics

A

Representativeness: assume an object belongs to a category because it seems representative of it (e.g. the bank teller). Availability: the frequency of an event is estimated by the ease of retrieving examples. Lichtenstein 1978: well-publicised causes of death are estimated to be more common than less publicised ones (e.g. murder over suicide). Pachur 2012: three drivers: direct experience, emotional response and media coverage. Can be reversed: Oppenheimer 2004: when asked which surname is more common, a famous one or a non-famous one, people say the non-famous one. Weaknesses: vaguely defined; doesn't specify when particular heuristics are used; heuristics are not biased in themselves but rest on poor information; a list of heuristics doesn't amount to a theory.

10
Q

Theories

A

Dual process theory (Kahneman): System 1 is fast and automatic; System 2 is slow and effortful. System 1 is heuristic and prone to errors. We are cognitive misers: System 2 could reach the correct answer, but we often use System 1 because it is easier. System 2 also makes errors and the theory doesn't explain why; it is ill-defined and it is hard to predict which system will be used.

11
Q

Fast and frugal- gigerenzer

A

How are we so smart in real life? We use heuristics because they are fairly accurate and we don't have time. Take the best, ignore the rest: asked which city is bigger, apply a search rule (e.g. do I know this city? then other cues, like having an airport), a stopping rule (as soon as one cue discriminates, stop the search) and a decision rule (choose the option favoured by that cue). Search is serial: once a discriminating cue is found, stop. Recognition heuristic: select the object you recognise.
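A minimal sketch of take-the-best in Python (the cue names, their ordering and the city data are invented for illustration):

```python
# Cues in assumed order of validity; values are True/False, or None if unknown
CUES = ["recognised", "is_capital", "has_airport"]

def take_the_best(a, b):
    """Return the option favoured by the first cue that discriminates."""
    for cue in CUES:                           # search rule: try cues in order
        va, vb = a.get(cue), b.get(cue)
        if None not in (va, vb) and va != vb:  # stopping rule: cue discriminates
            return a if va else b              # decision rule: pick the favoured one
    return None  # no cue discriminates: guess

berlin = {"name": "Berlin", "recognised": True, "is_capital": True, "has_airport": True}
herne = {"name": "Herne", "recognised": False, "is_capital": False, "has_airport": False}
print(take_the_best(berlin, herne)["name"])  # Berlin
```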

12
Q

Goldstein and gigerenzer 2002

A

US students choosing between two German cities chose the one they recognised 90% of the time. Pachur 2012: recognition is used for "which city is bigger" but not for "which is further north"; across all questions, the correlation between cue validity and usage was 0.64. Richter and Späth 2006: German students chose the recognised city 98% of the time when it had an international airport, even when they didn't know the other city, but only 82% of the time when the other city had the international airport. So recognition is not always used, only when the information is valid, but there is no definition of validity, and we don't know whether processing is serial.

13
Q

Tutorial

A

Patient EVR had medial PFC damage: normal IQ but a personality change, acting inappropriately and unable to make decisions. The somatic marker hypothesis: we unconsciously simulate the outcomes of decisions; the body reacts, producing a gut feeling (somatic marker) that allocates attention. A rough guide, not always right. Patients with medial PFC damage show no SCR (skin conductance response) to emotional stimuli, though they do show an orienting response. Iowa gambling task: decks are good or bad (low or high risk); normal participants switch to the low-risk decks but damaged patients don't.

14
Q

Utility theory

A

Assumes people act rationally to maximise expected utility = p(outcome) x utility of outcome, summed over outcomes. Utility is the subjective value we attach, so it can be irrational. We calculate the expected utility of each option and choose the greatest. The starting point is irrelevant; the emphasis is on the final amount. But e.g. people choose the safe option, like a 100% chance of winning 500, over a 50/50 win/lose gamble even when its expected value is higher.
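A minimal Python sketch of the expected-utility calculation (here utility is simply the monetary amount, purely for illustration):

```python
def expected_utility(gamble):
    """Sum of p(outcome) * utility(outcome) over all outcomes."""
    return sum(p * u for p, u in gamble)

safe = [(1.0, 500)]              # 100% chance of winning 500
risky = [(0.5, 1200), (0.5, 0)]  # 50/50: win 1200 or nothing

print(expected_utility(safe))   # 500.0
print(expected_utility(risky))  # 600.0 - higher, yet many still pick the safe option
```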

15
Q

Prospect theory

A

A better descriptive account than utility theory. Kahneman and Tversky 1984: assume people identify a reference point, representing the current state as the starting point; people are more sensitive to potential losses than to gains (loss aversion); and people overweight rare events. E.g. offered heads = win 200, tails = lose 100, people refuse. You can plot a value function linking subjective value to objective value; it is steeper for losses, reflecting loss aversion.
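The value function can be sketched with the parameterisation from Tversky and Kahneman's later (1992) work, alpha of about 0.88 and lambda of about 2.25; these numbers are assumptions imported from that paper, not from the 1984 one:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, steeper for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha  # losses are amplified by lambda

# A loss looms larger than an equal-sized gain
print(round(value(100), 1))   # 57.5
print(round(value(-100), 1))  # -129.5
```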

16
Q

Framing effect

A

Tversky and Kahneman 1981: the US is preparing for an Asian disease expected to kill 600 people. Gain frame: programme A saves 200 people, or programme B gives a 1 in 3 chance that all 600 are saved; most choose the first, even though both save the same number on average. Loss frame: 400 people will die, or there is a 1 in 3 chance no one will die and a 2 in 3 chance all 600 will; people choose the second, being loss averse and overweighting the chance that no one dies.

17
Q

Framing effect weaknesses

A

Mandel and Vartanian 2011: ambiguity, as people may read "200 saved" as "at least 200 saved"; when the programmes are fully described, the effect disappears. Almashat 2008: in medical scenarios involving cancer, when participants list strengths/weaknesses and justify their choice, the framing effect goes away (possibly two systems, where deliberate weighting removes it?). Still present in the real world: e.g. "reducing the voting age" got 37% support but "giving young people the vote" got 52%.

18
Q

Sunk cost effect

A

People pursue a goal even after it has been shown to be suboptimal, because resources have already been invested. Dawes 1988: you pay a deposit for a hotel, you get ill, but you go anyway. Weakness: Balints and Ely 2011: when MBA students were given information about investments, they showed the opposite of the sunk cost effect, e.g. being more likely to switch.

19
Q

Overweighting rare event

A

Explains the lottery. Hertwig 2004: when choices are based on descriptions, people overweight rare events, but when based on past experience, they underweight them. This is due to low sampling, e.g. never having experienced the rare event; it links to the availability heuristic.

20
Q

Loss neutrality

A

Choose between a gamble of 50% win 1 / 50% lose 1, or 50% win 5 / 50% lose 5. Prospect theory predicts loss aversion, so people should pick the first, but actually there is a 50/50 split. The effect breaks down with extreme amounts, though.

21
Q

Utility vs prospect theory

A

Prospect theory is more detailed. Both assume a value function with a non-linear mapping, but prospect theory makes more assumptions. It predicts a wider range of outcomes thanks to loss aversion, the overweighting of rare events and the reference point. Weaknesses: neither explains why the value function exists; the effects can be reversed; and there are no individual differences, e.g. people with high self-esteem prefer risk, high narcissism is linked to reward sensitivity, low narcissism to loss sensitivity.

22
Q

Emotional and social factors

A

Anticipated loss leads to negative emotions. Kermer 2006: a loss of 3 had a greater anticipated impact than a gain of 5; this is impact bias, as we overestimate the intensity of the negative emotions a loss will cause. Giorgetta 2013: for decisions made by the participant or by a computer, people felt regret when they made the choice themselves but disappointment when the computer made it.

23
Q

Agency and omission bias

A

Omission bias: prefer to do nothing instead of taking a risk through action. Brown 2010: parents were more willing to accept the risk of disease than the risk of harming their child via a vaccine. Wroe 2005: high anticipated responsibility and regret if the child reacts to the vaccine. Aberegg 2005: doctors treating lung disease were less likely to choose the best plan when given the option to do nothing (40%) than when not given that option (50%).

24
Q

Status quo bias and accountability

A

Status quo bias: prefer to keep a decision rather than change it. Samuelson and Zeckhauser 1988: people kept the same allocation of their pension despite there being no cost to changing it (although changing takes effort). Accountability: Simonson and Staw 1992: increased accountability increases the sunk cost effect; a greater need to justify the original decision means sticking with it for longer.

25
Q

Complex decision making: what we should do in real life

A

In the lab people normally pick between 2 options, but real life is different. Normatively, people should: identify the relevant attributes, decide how to weight them, list all the options under consideration, rate each option on each attribute, obtain a total utility for each, and select the option with the highest. This is unlikely in practice: Simon 1957: bounded rationality; we are bounded by environmental constraints (information costs) and cognitive constraints, but rational within them.
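The normative procedure can be sketched as a weighted-sum calculation; the attributes, weights and ratings below are invented for illustration:

```python
# Steps 1-2: relevant attributes and their weights (summing to 1)
weights = {"price": 0.5, "quality": 0.3, "convenience": 0.2}

# Steps 3-4: every option rated on every attribute (0-10)
options = {
    "A": {"price": 8, "quality": 5, "convenience": 6},
    "B": {"price": 4, "quality": 9, "convenience": 7},
}

# Steps 5-6: total utility per option, then pick the highest
totals = {name: round(sum(weights[attr] * r for attr, r in ratings.items()), 2)
          for name, ratings in options.items()}
print(totals)                       # {'A': 6.7, 'B': 6.1}
print(max(totals, key=totals.get))  # A
```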
26
Q

What actually happens in complex decision making

A

Satisficing: choosing the first option that satisfies the minimum requirements; perfection is the enemy of good enough. Elimination by aspects theory (Tversky 1972): serial elimination of options based on criteria until one remains, but this can't explain trade-offs. Kaplan 2011: two stages, elimination then detailed comparison of the remaining options. Galotti 2007: US students choosing a major focus on multiple options at a time and narrow down over time; with constrained information they focus on fewer options over time, but the number of attributes stays constant: more education means more attributes.
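Elimination by aspects can be sketched as serial filtering; the aspects, cutoffs and options below are invented for illustration:

```python
# Aspects in order of importance, each with a pass/fail cutoff
aspects = [
    ("price", lambda o: o["price"] <= 900),
    ("battery_hours", lambda o: o["battery_hours"] >= 8),
    ("weight_kg", lambda o: o["weight_kg"] <= 1.5),
]

options = {
    "laptop_a": {"price": 850, "battery_hours": 10, "weight_kg": 1.8},
    "laptop_b": {"price": 800, "battery_hours": 9, "weight_kg": 1.3},
    "laptop_c": {"price": 950, "battery_hours": 12, "weight_kg": 1.1},
}

remaining = dict(options)
for aspect_name, passes in aspects:   # serial elimination, one aspect at a time
    remaining = {k: v for k, v in remaining.items() if passes(v)}
    if len(remaining) == 1:           # stop once a single option survives
        break

print(list(remaining))  # ['laptop_b']
```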
27
Q

Memory-guided decision making

A

Klein 1998: we use past experiences to make quick decisions. Recognition-primed decision model: experts retrieve similar past situations and evaluate whether the typical response is appropriate; this is quicker and gives a role to expertise, but non-experts use elimination by aspects instead.