Week 4 Flashcards

(36 cards)

1
Q

Early philosophers

A
  • First big ideas about mind and science
2
Q

Enlightenment

A
  • Growing questions about mind, mechanism, empiricism
3
Q

Early psychologists

A
  • How to be experimentalists
  • Study perception, consciousness, intelligence
4
Q

Psychoanalysts

A
  • The importance of the unconscious, inner conflict
5
Q

Behaviourists

A
  • Rejected the study of the mind as unscientific ("no more mind silliness")
  • Study observable behaviour only
6
Q

Cognitive revolution

A
  • The mind is back in psychology
  • Study the mind as information processing
7
Q

Paradigm shift

A
  • Dominant schools of thought about how to study the mind scientifically have changed
  • Zeitgeists
  • Often periods of upheaval, revolution
8
Q

Reproducibility

A
  • The extent to which consistent results are observed when scientific studies are repeated
  • Major demarcation between science and pseudo-science
  • Scientific claims should not gain credence by virtue of status/authority of their originator
9
Q

Science

A
  • Systematic observation
  • Ruthless peer review
  • Considers all evidence
  • Invites criticism
  • Repeated results
  • Limited claims
  • Specific terms, operational definitions
  • Engages community
  • Changes with new evidence
  • Follows evidence where it leads
10
Q

Pseudoscience

A
  • Anecdotal evidence
  • No peer review
  • Considers only positive evidence
  • Dismisses criticism
  • Non-repeatable results
  • Grandiose claims
  • Vague terms and ideas – science-y jargon
  • Isolated
  • Dogmatic and unyielding
  • Starts with a conclusion, works back to confirm
11
Q

How to collect data

A
  • Generate a hypothesis
  • Ask whether it is interesting
  • Collect some data
  • If the first study doesn't work, "fix" it by changing some variables
  • Repeat the data collection (and "fixing") until you have enough studies to publish
12
Q

Stapel – 2010

A
  • Prolific Dutch social psychologist investigated for fraud
  • He often supplied the data to his grad students himself
  • His grad students working in the lab noticed that the statistics for different studies showed suspiciously similar means and SDs
  • After the investigation he admitted the fraud, and 25 published papers were found to be based on fabricated data
  • 58 papers were retracted
13
Q

Daryl Bem

A
  • ESP study
  • Claimed people had precognition
  • Participants picked between pictures hidden behind curtains
  • The study had reproducibility issues
14
Q

Common bad research practices

A
  • Stopping data collection as soon as p is smaller than .05 (see the simulation sketch after this card)
  • Analysing many measures but only reporting the significant ones
  • Collecting and analysing many conditions but only reporting the significant ones
  • Using covariates to get significance
  • Excluding participants
  • Transforming data to get p < .05
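A minimal simulation sketch (not from the lecture; assumes numpy and scipy are available) of the first practice above: checking the p-value repeatedly as data accumulate and stopping as soon as p < .05 inflates the false-positive rate well above the nominal 5%, even when there is no real effect.

    # Sketch: optional stopping inflates false positives under a true null
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_sims, n_min, n_max, step = 2000, 10, 100, 10
    false_positives = 0

    for _ in range(n_sims):
        # Both groups come from the same distribution: any "effect" is spurious
        a = rng.normal(0, 1, n_max)
        b = rng.normal(0, 1, n_max)
        for n in range(n_min, n_max + 1, step):
            if stats.ttest_ind(a[:n], b[:n]).pvalue < 0.05:
                false_positives += 1   # stop collecting as soon as p < .05
                break

    print(f"False-positive rate with optional stopping: {false_positives / n_sims:.2%}")
    # Typically comes out well above the nominal 5%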
15
Q

Open science collaboration

A
  • 100 replications of studies from 3 prominent psychology journals
  • Formed in 2011 with around 60 members
  • Grew to 270 scientists from over 50 countries
  • 97% of the original studies reported significant effects
  • Only 36% of the replications found significant effects in the same direction
16
Q

Replication crisis factors – how did the crisis happen

A
  • Enormous pressure and incentive to produce many papers
  • Over-interest in counter-intuitive, "sexy" findings
  • Confirmation bias among researchers, encouraging questionable research practices
  • Lack of accountability and transparency
17
Q

Confirmation bias

A
  • Tendency to seek out information that verifies your theory (validation)
  • And not seek out information that falsifies your theory (falsification)
18
Q

Drawbacks of the replication crisis

A
  • Failed replications may simply become the new trend
  • It creates a culture of paranoia and moral righteousness
19
Q

Dan Gilbert – reaction to the crisis

A
  • Called the so-called replicators "shameless little bullies" and "second stringers"
20
Q

Power pose replication crisis

A
  • Carney, Cuddy and Yap (2010)
  • Carney later stated: "As evidence has come in over these past 2+ years, my views have updated to reflect the evidence. I do not believe that power pose effects are real."
21
Q

Schnall replication crisis

A
  • One of her major papers failed to replicate
  • Schnall wrote a response commenting on the damage to her career
  • Roberts commented that the damage to her career was less important than the PhDs ruined for being honest
22
Q

Increase replication

A
  • We must ensure that findings are robust and replicable
  • Direct/conceptual replications should be part of the research pipeline
23
Q

Beware P-hacking

A
  • Exploiting researcher degrees of freedom to find a significant effect
  • Changing analysis decisions until p < .05 is reached
  • Can stem from implicit bias or explicit data manipulation
24
Q

Definitions of power

A
  • The probability of finding an effect in your study if the effect is real (estimated in the sketch after this card)
  • The probability that a test of significance will detect a deviation from the null hypothesis, should such a deviation exist
  • The probability of avoiding a Type II error
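A minimal Monte Carlo sketch (not from the lecture; the effect size and sample size are assumed values) matching the first definition above: power is estimated as the proportion of simulated studies, in which a real effect exists, that come out significant.

    # Sketch: estimate power as P(p < .05 | a real effect exists)
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_sims, n_per_group, true_effect = 5000, 30, 0.5   # Cohen's d = 0.5 (assumed)

    significant = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, 1.0, n_per_group)
        treatment = rng.normal(true_effect, 1.0, n_per_group)
        if stats.ttest_ind(treatment, control).pvalue < 0.05:
            significant += 1

    print(f"Estimated power at n={n_per_group} per group, d=0.5: {significant / n_sims:.2f}")
    # Comes out around 0.5: a real medium-sized effect is missed about half the time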
25
Q

Boost your power

A
  • Studies can be underpowered
  • Often due to a misunderstanding of power
  • Large studies are more expensive and time-consuming
  • Pressure to publish more papers, more frequently, discourages large samples
26
Q

How do you increase power

A
  • Larger sample sizes (see the sketch after this card)
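A short analytic sketch (assumes the statsmodels package and a medium effect of d = 0.5) showing how power climbs as the sample size grows, and how large a sample is needed to reach the conventional 80% power.

    # Sketch: analytic power of a two-sample t-test as sample size grows
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    for n_per_group in (20, 50, 100, 200):
        power = analysis.power(effect_size=0.5, nobs1=n_per_group, alpha=0.05)
        print(f"n = {n_per_group:3d} per group -> power = {power:.2f}")

    # Sample size per group needed to reach the conventional 80% power
    n_needed = analysis.solve_power(effect_size=0.5, power=0.80, alpha=0.05)
    print(f"~{n_needed:.0f} participants per group for 80% power")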
27
Q

Open data principles

A
  • Making data, materials and analysis available online
  • So that others can replicate, check and reproduce your work
28
Q

Confirmatory/exploratory research

A
  • Confirmatory – hypothesis testing
  • Exploratory – studying around the topic
  • Exploratory research is okay
  • But it must not be presented as confirmatory
  • It should be followed by confirmatory research
29
Q

HARKing

A
  • Hypothesising After the Results are Known
30
Q

How to conduct confirmatory research

A
  • Decide study details a priori
  • Hypotheses to test, number of subjects, conditions, DVs
  • Only then start recruiting
  • Pre-register your study
31
Q

Open science practices in teaching

A
  • Ensure the next generation moves on from the reproducibility crisis
  • Teach the importance of conducting well-powered studies
  • Encourage critical evaluation of published studies in terms of open science practices
  • Open science practice leads to more reliable, reproducible science
32
Q

Open science practices as reviewers

A
  • Signatories will not offer comprehensive review for any manuscript that does not meet the minimum requirements
  • Nor will they recommend it for publication
  • Stimuli and materials should be made publicly available
  • Data should be made publicly available
  • Documents containing details for interpreting any data files or analysis code should be made available
  • The location of all these files should be stated in the manuscript, and all files should be hosted by a reliable third party
33
Q

Incentives for pursuit of new ideas

A
  • Publications, grant income, employment, promotion, tenure, fame
34
Q

Issues with rewarding science practices

A
  • These incentives are poor motivators
  • They may prompt bad practices in pursuit of fame
35
Q

Goals for rewarding open science practices

A
  • Be tolerant of lower output when research is done correctly
  • Reward good practice in pre-registered studies regardless of outcome
  • Reward good practice in high-powered studies
  • Allow more time for research
36
Q

Solutions to the replicability crisis

A
  • More replications
  • Beware of p-hacking
  • Boost your power
  • Open data, open materials, open analysis
  • Conduct pre-registered confirmatory studies
  • Incorporate open science practices in teaching
  • Insist on open science practices as reviewers
  • Reward open science practices