PSY2001 SEMESTER 2 - WEEK 9 Flashcards

(36 cards)

1
Q

what is meta-science?

A

using science to test science

2
Q

what does replication do

A
  • protects against false positives/sampling error
  • controls for artifacts
  • addresses researcher fraud
  • tests whether findings generalise to different populations
  • tests the same hypothesis using a different procedure

3
Q

define direct replication

A

a scientific attempt to recreate the critical elements of the original study;
obtaining the same finding indicates the finding is accurate/reproducible

4
Q

define conceptual replication

A

tests the same hypothesis with different procedures;
the same result indicates the finding is robust to alternative research designs, operational definitions, and samples

5
Q

in the Open Science Collaboration (2015), what proportion of studies actually replicated?

A

36%

6
Q

name 2 findings that commonly fail to replicate

A

priming intelligence
use of spatial distance cues to prime people's feelings of emotional closeness to their families

7
Q

name 5 reasons for non-replication

A

faking,
sloppy science,
moderators,
scientist error,
publication bias

8
Q

give a well-known example of faking

A

Diederik Stapel, a Dutch social psychologist: very well known and influential. In 2011 he was exposed as a fraud, was struck off, and 50 papers were retracted

9
Q

name the 9 circles of scientific hell (sloppy science), getting worse as you go down

A

limbo
overselling
post-hoc storytelling
p-value fishing
creative outliers
plagiarism
non-publication
partial publication
inventing data

10
Q

define non-publication

A

deciding not to publish, i.e. possibly for bad reasons, e.g. because of the data collected…

11
Q

define partial publication

A

only publishing the sections of the results that you like

12
Q

what is outcome switching/p-value fishing?

A

changing the outcomes of interest depending on the observed results

13
Q

give an example of p-hacking

A

making decisions (e.g. when to stop collecting data, which outcomes to analyse) so as to maximise the likelihood of a statistically significant effect, rather than on objective or scientific grounds

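To make this concrete, here is a minimal simulation sketch (my own illustration, not from the lecture; it assumes numpy and scipy are available, and the two-outcome design is an arbitrary choice) of one p-hacking strategy: with no true effect, testing two outcome measures and reporting whichever is significant pushes the false-positive rate well above the nominal 5%.

```python
# p-hacking sketch (illustrative assumptions: two independent outcome
# measures, no true group difference). Reporting whichever outcome "works"
# roughly doubles the false-positive rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_sims, n = 10_000, 40
false_positives = 0
for _ in range(n_sims):
    group_a = rng.normal(0, 1, (n, 2))   # columns = two outcome measures
    group_b = rng.normal(0, 1, (n, 2))   # same null distribution, no effect
    p1 = stats.ttest_ind(group_a[:, 0], group_b[:, 0]).pvalue
    p2 = stats.ttest_ind(group_a[:, 1], group_b[:, 1]).pvalue
    if min(p1, p2) < 0.05:               # report whichever outcome was "significant"
        false_positives += 1

print(false_positives / n_sims)          # close to 0.10 rather than 0.05
```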
14
Q

what are the issues with small samples

A

they lack statistical power, so a significant effect that is found is more likely
to be due to chance (a false positive)

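A minimal power-simulation sketch (my own illustration, assuming numpy/scipy and an arbitrary true effect of d = 0.3) of why small samples are a problem: the small study rarely detects the effect, so the significant results it does produce are disproportionately flukes.

```python
# Power simulation sketch (illustrative parameters): a two-group comparison
# with a modest true effect (d = 0.3), analysed with an independent t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def simulated_power(n_per_group, d=0.3, n_sims=5_000, alpha=0.05):
    """Proportion of simulated experiments that reach p < alpha."""
    hits = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, 1.0, n_per_group)
        treatment = rng.normal(d, 1.0, n_per_group)
        if stats.ttest_ind(treatment, control).pvalue < alpha:
            hits += 1
    return hits / n_sims

print("n = 20 per group: ", simulated_power(20))   # roughly 0.15
print("n = 200 per group:", simulated_power(200))  # roughly 0.85
```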
15
Q

how common is sloppy science? (John, 2012)

A

a survey of 2,000 psychologists in the US on their involvement in questionable research practices, e.g.:

  • failing to report all measures/conditions
  • deciding to collect more data after looking to see whether the results are significant
  • selectively reporting studies

CONCLUSION: the percentage of respondents who had engaged in questionable research practices was surprisingly high

16
Q

what did Simmons (2011) find regarding flexibility in data collection being an issue, using songs?

A

participants listened to a Beatles song or "Kalimba"; then, in an unrelated task, they indicated their birth date and their father's age. Using father's age as a control variable, the researchers "found" the predicted effect that participants were a year and a half younger after listening to the Beatles song than after Kalimba

17
Q

what are moderators?

A

variables that influence the nature of an effect (e.g. country/culture);
identifying them improves research and the understanding of any second-generation research

18
Q

why is scientist error important to remember?

A

Doyen et al. criticised Bargh's research on priming effects, but actually Doyen et al. had not used the correct methodology, etc.

so it is a reminder that the original paper can be correct and the criticising paper incorrect!

19
Q

define publication bias

A

statistically significant findings are more likely to be published than those that are not

this can be for good reasons, however: there is ambiguity over the reason for a null finding

20
Q

what is the file drawer problem?

A

non-significant results stay in researchers' file drawers, so the published studies may represent the 5% of findings that occur by chance alone

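A minimal sketch (my own illustration, with made-up study counts; assumes numpy/scipy) of the file drawer logic: if many labs test effects that are truly null and only significant results leave the drawer, the published record consists almost entirely of chance findings.

```python
# File drawer sketch (illustrative assumption: every tested effect is truly
# null). Only studies with p < .05 are "published"; the rest stay filed away.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_studies, n_per_group = 1_000, 30

p_values = []
for _ in range(n_studies):
    a = rng.normal(0, 1, n_per_group)    # no true difference between groups
    b = rng.normal(0, 1, n_per_group)
    p_values.append(stats.ttest_ind(a, b).pvalue)

published = [p for p in p_values if p < 0.05]
print(f"{len(published)} of {n_studies} studies published")  # about 5%
# Every published "effect" here is a false positive that occurred by chance.
```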
21
Q

define open science

A

the process of making the content and process of producing evidence and claims transparent and accessible to others

22
Q

name 6 principles of open science

A

open data
open source
open access
open methodology
open peer review
open educational resources

23
Q

define open methodology

A

documenting the method, and the process of how the method was developed/decided upon

24
Q

define pre-registration

A

defining the research question (RQ), methods, and approach to analysis before observing the research outcomes = prevents HARKing

25
Q

what is HARKing?

A

hypothesising after the results are known = hindsight bias

26
Q

what does pre-registration involve

A

you have to register the plan somewhere it is date-stamped, meaning it is clear when new information was added (however, this may potentially constrain science); make clear what is exploratory and what is not

27
Q

does pre-registration improve replicability? (name stats)

A

pre-registered studies replicated the expected effects in 86% of attempts; contrast with the Open Science Collaboration (2015), who found that only 23% of findings in social psychology could be replicated

28
Q

what 2 stages are used in the registered report peer review process

A

1. reviewers/editor assess a detailed protocol (rationale, procedure, analysis plan)
2. following favourable reviews, the journal offers in-principle acceptance - guarantees publication

29
Q

define open source materials and code

A

use open-source technology (software, hardware) and open up your own technology - i.e., the code programming the questionnaire/experiment

30
Q

define open data

A

making the dataset freely available:
- allows other scientists to verify the original analysis
- facilitates research beyond the scope of the original research
- avoids duplication of data collection
data needs to be FAIR

31
Q

what is FAIR data?

A

findable, accessible, interoperable, reusable

32
Q

define gold open access

A

- researchers (or their funder/host institution) pay the journal to publish the article
- the final formatted version is freely and permanently accessible to everyone

33
Q

define green open access

A

'self-archiving': putting an unformatted version of the manuscript into a repository

34
Q

name the effects of open access publishing

A

- open access works are used more - cited between 36% and 600% more than works that are not open access
- outside of academia, open access works are given more coverage by journalists and discussed more in non-scientific settings
- open access works facilitate meta-research - they enable the use of automated text and data mining tools
- research is funded by tax - therefore the public owns the research?

35
Q

what 3 badges may be awarded to a paper for good science

A

preregistered, open data, open materials

36
Q

name the 8 transparency and openness promotion (TOP) guideline standards

A

- citation standards
- data transparency
- analytic methods (code) transparency
- research materials transparency
- design and analysis transparency
- preregistration of studies
- preregistration of analysis plans
- replication