Lectures 11 & 12, Chapters 10 & 11 Flashcards
Methodology or methodologism
Tendency to see methodological rigour as the only requirement for scientific research, at the expense of theory formation
Pseudoscience
Branch of knowledge that pretends to be scientific but violates the scientific method in essential respects, such as a lack of openness to testing by others and a reliance on confirmation rather than falsification
Hermeneutics
Approach in psychology according to which the task of the psychologist is to interpret and understand persons on the basis of their personal and socio-cultural history
Humanistic psychology
Psychological movement promoted by Rogers and Maslow as a reaction against psychoanalysis and behaviourism; stressed that people are human, inherently positive, endowed with free will and living within a socio-cultural context
Feminist psychology
Movement in psychology aimed at understanding women; it is particularly concerned with the way in which women are treated in mainstream psychology
Postcolonial psychology
Movement in psychology addressing the issues of racism and the ways in which dominant groups treat other groups
Unconscious plagiarism
Indicates how the scientific and the hermeneutic approaches in psychology have influenced each other without the proponents being aware of it
Replicability
The probability of obtaining the same finding when a scientific study is rerun (in the same way)
Replication crisis
A crisis of confidence in scientific research because many published findings cannot be repeated when the studies are rerun, calling the reliability of scientific findings into question
File drawer problem
The issue that the scientific literature poorly represents the research actually carried out, because experiments that do not find significant differences are less likely to be published
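The sketch below is a minimal simulation of this bias (my own illustrative example, not from the textbook, using an assumed true effect of 0.2 and samples of 30): when only studies reaching p < .05 are "published", the published effect sizes systematically overestimate the true effect.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_effect, n, n_studies = 0.2, 30, 5000   # small true effect, modest sample sizes

published = []
for _ in range(n_studies):
    treatment = rng.normal(true_effect, 1, n)
    control = rng.normal(0, 1, n)
    _, p = stats.ttest_ind(treatment, control)
    if p < 0.05:                             # only "significant" studies end up in the literature
        published.append(treatment.mean() - control.mean())

print(f"True effect: {true_effect}")
print(f"Mean published effect: {np.mean(published):.2f}")  # noticeably larger than the true effect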
Conceptual replication
Replication in which an effect is investigated in a different way than in the original study; useful for examining the generality of a finding, but can magnify biases in the scientific literature when combined with the file drawer problem
Questionable research practices
Research practices undermining the statistical conclusions that can be drawn from a study; they usually increase the chances of finding a predicted effect
P-hacking
Manipulating data in order to obtain a desired (significant) p-value
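As an illustration (my own sketch with assumed numbers, not the book's example), the simulation below shows one common p-hacking tactic, optional stopping: testing repeatedly while adding participants until p < .05. Even when there is no true effect at all, the false-positive rate climbs well above the nominal 5%.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def peeking_study(start_n=20, step=10, max_n=100):
    # One 'study' with NO true effect: keep adding participants and retesting
    # until the result is significant or the sample budget runs out.
    a = list(rng.normal(0, 1, start_n))
    b = list(rng.normal(0, 1, start_n))
    while True:
        _, p = stats.ttest_ind(a, b)
        if p < 0.05 or len(a) >= max_n:
            return p < 0.05
        a.extend(rng.normal(0, 1, step))
        b.extend(rng.normal(0, 1, step))

false_positive_rate = sum(peeking_study() for _ in range(2000)) / 2000
print(f"False-positive rate with optional stopping: {false_positive_rate:.1%} (nominal rate: 5%)")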
HARKing
Hypothesising After the Results are Known: an unexpected significant finding in a statistical analysis is presented as if it were the predicted effect that was the focus of the research and, therefore, addresses an important theoretical question
Registered report
A type of research article that is evaluated by scientific journals before the data are collected; the goal is to make the evaluation independent of the obtained results and dependent solely on the research question, the research design, and the proposed analyses
Bayesian statistics
Data analysis that deviates from the traditional hypothesis testing with p-values; it estimates the relative probabilities of the null hypothesis and the alternative hypothesis, and it is hoped that this will correct existing misunderstandings of statistics
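A minimal sketch of the idea (a toy coin-flipping example of my own with an assumed uniform prior under H1, not an example from the textbook): rather than reporting a single p-value, the data are scored on how well each hypothesis predicts them, giving the relative evidence for H1 over H0.

from math import comb

k, n = 70, 100                      # observed: 70 heads in 100 coin flips

# Marginal likelihood of the data under each hypothesis
m_h0 = comb(n, k) * 0.5 ** n        # H0: the coin is fair (theta = 0.5)
m_h1 = 1 / (n + 1)                  # H1: theta unknown, uniform prior on [0, 1]
                                    # (binomial likelihood integrated over a uniform prior = 1/(n+1))

bf10 = m_h1 / m_h0                  # Bayes factor: evidence for H1 relative to H0
posterior_h1 = bf10 / (1 + bf10)    # posterior probability of H1, assuming equal prior odds

print(f"Bayes factor BF10 = {bf10:.1f}")
print(f"P(H1 | data) = {posterior_h1:.3f}, P(H0 | data) = {1 - posterior_h1:.3f}")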
Pottery barn rule
In science, the moral obligation of a scientific journal to publish a failure to replicate a finding that it previously published
Open science
Science practice in which all relevant information is made easily available, so that other researchers can check the findings and integrate them into their own research
Repository
In science, a location (typically on the internet) where data and analysis programs are stored, so that others can retrieve them
Transparency and Openness Promotion (TOP) guidelines
List of criteria written by advocates of open science describing the extent to which journals adhere to the standards of open and reproducible science
Secondary data analysis
Reanalysis of existing data to address new research questions
Big data
Collection and use of large datasets for secondary data analysis
Publish or perish
Refers to the practice in academia that a person will not be appointed or promoted unless they have a strong portfolio of scientific publications
Peer review
In science, the evaluation of scientific work by research colleagues (peers) to decide whether the work is good enough to be published (or funded, in the case of grant applications)