The prisoners and social dilemmas - Flashcards (C82COG - Thinking)

Flashcards in The prisoners and social dilemmas Deck (63):
1

Describe the ultimatum game.

Two players decide how to split a sum of money: the proposer is endowed with £10 and must share it with a responder. The proposer suggests a sum, which the responder must either accept or reject. If the responder rejects the offer, both players receive nothing. No discussion is allowed and both players are anonymous.

2

What does RCT suggest players of the ultimatum game should do?

The proposer should offer the smallest possible amount and the responder should always accept, since any offer is better than nothing; this maximises utility for both.

3

What is the problem with RCT's ultimatum game strategy?

The median offer is normally around 40-50% and offers below 20% are often rejected.

4

What does the ultimatum game demonstrate?

That people can be both more altruistic and more vindictive than RCT predicts (see Camerer, 2003, for a review).

5

Describe the dictator game.

The same situation as the ultimatum game, except that the responder is passive and cannot reject the offer.

6

What is the dictator game a behavioural model of?

Charitable giving.

7

According to a narrow view of RCT, what strategy should the proposer follow in the dictator game?

They should offer nothing and keep the entire endowment.

8

What did Kahneman, Knetsch & Thaler (1986) do?

Gave proposers the choice between a 50:50 and a 90:10 split (in the dictator game) with an anonymous responder.

9

What did Kahneman, Knetsch & Thaler (1986) find?

76% of participants chose to split the money equally.

10

What did Kahneman, Knetsch & Thaler (1986) find when the dictator game was played again and the proposers' past behaviour was known to the other players?

Altruistic punishment: participants punished proposers who had previously made unequal offers.

11

Who first described the prisoners’ dilemma?

Von Neumann and Morgenstern.

12

Describe the situation of the prisoners' dilemma.

Two people are suspected of a crime but there's insufficient evidence, so a confession is necessary. They're held in separate cells, and each must choose between staying silent (cooperating with the other prisoner) and confessing to blame the other (defecting).

13

Why does the prisoners' dilemma arise?

Because whatever the other prisoner does, it is individually optimal to defect, even though both would be better off if both cooperated.

14

What happens in each possible situation for the prisoners' dilemma?

- Both defect: 5 years each
- One defects and the other cooperates: the defector goes free, the cooperator gets 10 years
- Both cooperate: 1 year each
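
The payoff matrix above can be written out as a minimal Python sketch (the structure and names are illustrative, not from the cards) to check that defection dominates:

```python
# Payoffs from the flashcard above: sentence in years, so lower is better.
# Key is (my move, other prisoner's move); value is my sentence.
PAYOFFS = {
    ("defect", "defect"): 5,
    ("defect", "cooperate"): 0,
    ("cooperate", "defect"): 10,
    ("cooperate", "cooperate"): 1,
}

# Defection dominates: whatever the other prisoner does,
# defecting yields a shorter sentence than cooperating.
for their_move in ("cooperate", "defect"):
    assert PAYOFFS[("defect", their_move)] < PAYOFFS[("cooperate", their_move)]
```

This is why the dilemma arises: each prisoner's dominant choice is to defect, yet mutual defection (5 years each) is worse for both than mutual cooperation (1 year each).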

15

What do social dilemmas involve?

A decision in which there's a trade-off between one's own interests (to defect) and the interests of the group.

16

What does individual rationality in social dilemmas lead to?

Collective irrationality: if everyone acts in their individually rational interest and defects, the group as a whole ends up worse off than if all had forfeited some utility for the common good.

17

Is defection more common in humans or animals?

Humans.

18

Give examples of areas in life where humans often defect.

Recycling, voting, littering, eating fish, tube escalators, car sharing/public transport, MMR vaccines, low energy light bulbs, water conservation, population growth, and governments' refusal to reduce greenhouse gas emissions.

19

What proof do social dilemmas provide of humans being uniquely irrational?

We often defect, unlike other animals.

20

Give examples of situations in which animals cooperate, imposing a cost on themselves for the benefit of others.

- Blood sharing in vampire bats
- Coalitions in primate troops
- Meerkat lookouts
- Shoaling fish

21

Apply the prisoners' dilemma to countries and nuclear arms. What are the outcomes of each possibility?

- Both develop nuclear arms: moderate risk for both
- One does and one doesn't: high risk for the one who doesn't, no risk for the one who does
- Neither develops nuclear arms: low risk for both

22

What explanations have been offered for why people do what they do?

Empathy (Krebs, 1975), fairness (Hershey et al., 1994), envy (Messick, 1985), and greed (Dawes et al., 1986).

23

What did Krebs (1975) do?

Asked participants to observe a stooge taking part in a gambling experiment in which they either won money or received an electric shock.

24

What did Krebs (1975) find?

Participants exhibited higher GSR and heart rate when the stooge was perceived as similar (as determined by answers to an opinions questionnaire) rather than dissimilar.
Also, when given the opportunity to share their own reward with stooges who had done badly, more participants shared with similar stooges.

25

What do Krebs' (1975) results imply?

That we behave the way we do because we have an empathetic relationship with people we see, know, or perceive as similar.

26

What did Hershey et al. (1994) do?

Examined participants' willingness to receive vaccinations against a hypothetical illness. The vaccinations either provided immunity to transmission but not acquisition, or alleviated symptoms but didn't prevent transmission.

27

What did Hershey et al. (1994) find?

Participants' willingness to take the herd immunity option increased with the proportion of other people taking the vaccine.

28

What can be concluded from Hershey et al. (1994)?

People often make a comparison between their own behaviour and the behaviour of others in a group - 'if everyone is recycling then it would be unfair of me not to’. The corollary of this is that we often exhibit peevish behaviour to prevent others from doing better than ourselves.

29

What did Messick (1985) find?

That in an iterated game participants often choose to defect in order to avoid falling too far behind others and prevent the others from doing better than them. This occurs despite the participants' awareness that they will perform worse than necessary.

30

What did Dawes et al. (1986) do?

Gave participants a voucher that could be exchanged for $5 and either kept or donated to a pool. If a certain number donated their $5, each would receive $10.

31

What were Dawes et al. (1986)'s two conditions?

1. Money-back guarantee - if too few donated, they'd get their money back (so there was no need to fear losing the donation)
2. Enforced contribution - non-contributors would lose their $5 plus their share of the proceeds (so greed could not be a basis for defection).

32

What did Dawes et al. (1986) find?

Only enforced contribution increased cooperation, and the money-back condition didn’t increase cooperation relative to the control group (50%).

33

What can be concluded from Dawes et al. (1986)'s findings?

Greed is often the basis for defection.

34

What game is one we're 'always engaged in'?

The iterated prisoners' dilemma - it's closer to real life because, like real decisions, it is repeated.

35

What's important about the iterated form of the prisoners' dilemma?

Evolution (selfish gene theory) prima facie suggests we should look after our own backs and always defect.

36

What is the difference in rational decision making between the single and iterated prisoners' dilemma?

- The single game has a clear rational decision: always defect, since defection is the dominant strategy.
- In the iterated game, always defecting isn't optimal: the 'irrational' decision of mutual cooperation produces a net gain for both.
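
A toy Python comparison makes the iterated point concrete, using the sentence lengths from the payoff matrix earlier (years in prison, so lower is better; the round count is an arbitrary illustration):

```python
# Over many repeated rounds, two players who always cooperate each serve
# far fewer years than two players who always defect - even though
# defection dominates any single round.
ROUNDS = 100
YEARS = {("C", "C"): 1, ("D", "D"): 5}  # symmetric outcomes per round

coop_total = sum(YEARS[("C", "C")] for _ in range(ROUNDS))    # 100 years
defect_total = sum(YEARS[("D", "D")] for _ in range(ROUNDS))  # 500 years
assert coop_total < defect_total
```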

37

What causes the “Problem of Suboptimisation”?

The fact that cooperation is 'irrational' but produces a net gain in iterated PD.

38

What were Axelrod & Hamilton (1981) interested in?

Political relationships and reproductive strategies in nature.

39

What did Axelrod & Hamilton (1981) want to study?

The nature of cooperation amongst nations.

40

What did Axelrod & Hamilton (1981) want to explain?

The evolution of cooperating species from an inherently selfish gene pool.

41

What did Axelrod & Hamilton (1981) do?

Used the Prisoners' Dilemma as a model to explain cooperation and set up a computer tournament for strategies - Anatol Rapoport, a social psychologist, won.

42

What did Axelrod & Hamilton (1981) find?

Cooperation evolves as a stable strategy, showing how selection at the level of the selfish gene can nonetheless produce cooperation at the group level.

43

What are the potential strategies for the iterative Prisoners' Dilemma game?

Free rider, always cooperate, tit for tat and suspicious tit for tat.

44

Describe the free rider strategy.

- Always choose to defect no matter what the opponent’s last turn was
- A dominant strategy against an opponent who has the tendency to cooperate

45

Describe the always cooperate strategy.

- Always choose to cooperate no matter what the opponent’s last turn was
- This strategy can be horribly abused by the Free Rider strategy, or any other that tends towards defection

46

Describe the tit for tat strategy.

- The action chosen is based on the opponent’s last move
- On the first turn, the previous move isn’t known, so always cooperate
- Thereafter, always choose the opponent’s last move as your next move

47

What are the features of the tit for tat strategy?

- Nice: it cooperates on the first move
- Retaliatory: it punishes defection with defection
- Forgiving: it resumes cooperation after cooperation by the opponent
- Clear: it’s easy for the opponent to predict the next move, so mutual benefit is clearly attainable

48

Describe the suspicious tit for tat strategy.

- Always defect on the first move, thereafter replicate opponent’s move
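
The four strategies above can be sketched as Python functions, each mapping the opponent's last move ("C", "D", or None on the first turn) to a move. The point payoffs follow Axelrod's conventional values (3 for mutual cooperation, 5/0 when a defector meets a cooperator, 1 for mutual defection; higher is better); the function and variable names are illustrative:

```python
def free_rider(opp_last):
    return "D"  # always defect, whatever the opponent did

def always_cooperate(opp_last):
    return "C"  # always cooperate, whatever the opponent did

def tit_for_tat(opp_last):
    return "C" if opp_last is None else opp_last  # nice start, then copy

def suspicious_tit_for_tat(opp_last):
    return "D" if opp_last is None else opp_last  # defect first, then copy

# Axelrod's conventional point payoffs: (my points, their points).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strat_a, strat_b, rounds=200):
    """Play two strategies against each other; return each one's total score."""
    a_last = b_last = None
    score_a = score_b = 0
    for _ in range(rounds):
        a_move, b_move = strat_a(b_last), strat_b(a_last)
        pa, pb = PAYOFF[(a_move, b_move)]
        score_a, score_b = score_a + pa, score_b + pb
        a_last, b_last = a_move, b_move
    return score_a, score_b
```

For example, `play(tit_for_tat, free_rider, 10)` returns (9, 14): tit for tat is exploited only on the first round, then both defect. Against `always_cooperate` it earns the full mutual-cooperation score, which is why it did so well in round-robin play.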

49

What were Axelrod’s tournaments (1980s)?

Axelrod invited professional game theorists to submit their own programs for playing the iterated Prisoners' Dilemma game. Each strategy played every other strategy, a clone of itself, and a strategy that cooperated and defected at random, over hundreds of rounds.

50

What strategy won Axelrod's first tournament?

Tit for tat won the first tournament (and also a second, despite entrants having been told that it had won the first).

51

How can Prisoners' Dilemma strategies explain survival and the evolution of cooperation?

They can be applied to a hypothetical situation in which there's one large group of the same species, interaction takes the form of the Prisoners' Dilemma, and every organism can remember the outcomes of its interactions with other organisms.

52

What do PD strategy simulations of hypothetical species situations show?

That tit for tat is the most likely to survive and therefore evolve.

53

What are the key features of successful PD strategies?

- Nice: cooperates on the first round
- Provocable: will defect in retaliation
- Forgiving: will cooperate if the opposing player returns to cooperation

54

What social dilemma did Hardin (1968) write about?

Overpopulation - the tragedy of the commons. Hardin suggested that we should relinquish the freedom to breed.

55

What three factors affect cooperation?

Social values, communication and shared group identity.

56

What did McClintock and Liebrand (1988) state?

- Subjects with different social values behave differently
- Some players prefer to maximise the difference in outcomes between self and others
- Other players prefer to redistribute the outcomes.
(How social values affect cooperation)

57

What did Dawes et al. (1977) state?

Subjects are more likely to cooperate if they’re allowed to communicate.

58

What did Kramer and Brewer (1984) state about shared group identity?

Subjects are more likely to cooperate if they’re identified as being members of the same group as other players.

59

What is the problem of free riders?

Cooperative groups are susceptible to free-riding defectors, who accept a shared resource without returning the favour on subsequent occasions.

60

What did Enquist & Leimar (1993) do?

Modelled a population of organisms who could only reproduce after an exchange of resources

61

What did Enquist & Leimar (1993) find?

Free riders were successful in exploiting cooperators when the coalition time (the time needed to persuade the other player to invest) and the search time (the time needed to find other players) were low.

62

What must have made free riding common in hunter-gatherer societies typical of most of our evolutionary history?

The size of human groups and their dispersed nature.

63

What sorts of counter strategies to free riding were evolved by cooperating humans?

- Confine cooperation to kin (kin altruism)
- Information exchange to limit free-riders (dialect)
- Impose costs on new players (dowry)
- Cheater detection mechanism (a specific heuristic, see next lecture)