Week 5-8 Flashcards

(65 cards)

1
Q

Name the 7 threats to Internal Validity

A
Bias from the assessor
Recall bias
Placebo effect
Hawthorne effect
Natural recovery/maturation
Regression to the mean
Process of treatment
2
Q

How do researchers manage threats to validity?

A

Control group

3
Q

Define: Blinding

A

Not knowing who is receiving the true intervention and who is receiving the control

4
Q

Which level of evidence reduces the potential for bias, increases internal validity, and has better strength of evidence?

A

The evidence further up the hierarchy of evidence

5
Q

What type of evidence is the lowest level of evidence?

A

Expert Opinion

6
Q

If the exposure is ‘known’ and the outcome is ‘unknown’, what type of evidence is this?

A

Cohort

7
Q

If the exposure is ‘unknown’ and the outcome is ‘known’, what type of evidence is this?

A

Case control

8
Q

Name the Levels of Evidence in order of best evidence to lowest level of evidence

A
  • Clinical guidelines/summaries
  • Systematic reviews
  • RCTs
  • Non-RCTs
  • Cohort studies
  • Case control studies
  • Case series/time series
  • Case reports
  • Expert opinions
9
Q

Explain: Case series

A
  • same as a case study but with more than one person
  • a group of people, usually measured before and after an intervention, but with no control group

10
Q

Explain: Case-control study

A
  • used to evaluate relationships, treatments and cause of disease
  • Is retrospective (looks back in time)
  • Compares history & exposure of people who have a condition with those that don’t
11
Q

Explain: Cohort study

A
  • Is prospective (looks forward in time)
  • compares the progress of people who are exposed to and/or receive a particular treatment with a control group
  • epidemiological research design
12
Q

Explain: RCT

A
  • most rigorous design of health research to determine whether a cause-effect relationship exists
  • not always ethical/feasible (so cohort or case control might be necessary)
13
Q

Explain: Systematic Review

A
  • combines more than one primary study addressing the same research question
  • bias is minimised here
  • critically appraises studies
  • combines studies statistically (if possible) which is called a meta-analysis
14
Q

Explain: Summaries or Guidelines

A
  • asks clinical questions
  • summarises evidence for a topic
  • provides evidence-based guidelines for practice
  • are reviews of systematic reviews
15
Q

What is an intervention?

A

Anything that can have a cause-and-effect relationship with an outcome; usually a treatment

16
Q

Define: Ethics

A

‘how we ought to live’

17
Q

What are the Universal Principles of Ethics?

A

Autonomy
Non-maleficence
Beneficence
Justice

18
Q

Name the 2 types of Ethical Theory

A

Deontology

Teleology

19
Q

Define: Deontology

A

Intrinsic ethical absolutes

20
Q

Define: Teleology

A

Relative consequentialism

21
Q

Give an example: Deontology

A

Taking a life is always wrong

22
Q

Give an example: Teleology

A

Taking a life might be right or wrong depending on the circumstances

23
Q

What are the 3 main applications of Ethical Theory

A

Altruism: best consequences for others

Egoism: best consequences for an individual

Utilitarianism: best consequences for the greatest number

24
Q

When ethical issues present in health, they are referred to as…

A

Bioethics

25
Q

What are the principles included in the Nuremberg Code?

A
  • voluntary informed consent
  • absence of coercion
  • properly formulated experimentation
  • beneficence towards experiment participants

26
Q

When was the International Ethical Guidelines for Biomedical Research Involving Human Subjects developed?

A

1982

27
Q

Who developed the ‘International Ethical Guidelines for Biomedical Research Involving Human Subjects’?

A

CIOMS (Council for International Organisations of Medical Sciences) and the WHO

28
Q

In Australia, when did ethics guidelines development begin?

A

The 1960s

29
Q

Define: HREC

A

Human Research Ethics Committee

30
Q

Define: Protocol

A

The research plan: how the research will be conducted, in detail

31
Q

Define: Representative Sample

A

All those who should be represented are reflected in the sample

32
Q

Define: Parameter

A

The measure or descriptor that applies to the population or target group

33
Q

Define: Statistic

A

The measure or descriptor that applies to the sample group
34
Q

Define: Random Sampling

A

When all members of the population have an equal chance of selection
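The equal-chance selection described above can be sketched in a few lines of Python; the population of 1,000 numbered members and the sample size of 50 are hypothetical, chosen only for illustration.

```python
import random

# Hypothetical target group: 1,000 numbered members
population = list(range(1, 1001))

random.seed(42)  # fixed seed so the illustration is reproducible
# random.sample draws without replacement, giving every member
# an equal chance of selection
sample = random.sample(population, 50)

print(len(sample))       # 50 members drawn
print(len(set(sample)))  # all 50 are distinct (no replacement)
```

Sampling without replacement is the usual choice here, since the same person should not appear in the sample twice.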
35
Q

Define: Reliability

A

The reproducibility of the results of a procedure or tool

36
Q

Define: Variable

A

The item of interest, i.e. the thing we are measuring

37
Q

Define: Measurement

A

The process of quantifying a variable (the item being measured)

38
Q

What are the 3 aspects of error in measurement?

A
  • Sources of error
  • Types of error
  • Types of reliability

39
Q

Give an example: Sources of error

A
  • Equipment
  • Patient/participant
  • Measurer/researcher

40
Q

Name the 2 Types of Measurement Error

A
  • Systematic errors
  • Random errors
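The two error types above behave differently under repetition, which a minimal simulation can show: averaging repeated measures cancels random error but not systematic error. The true value, the 2 kg offset, and the noise level below are invented for illustration.

```python
import random
import statistics

random.seed(1)
true_value = 70.0  # hypothetical true weight in kg
bias = 2.0         # systematic error: scale consistently reads 2 kg high
noise_sd = 0.5     # random error: unpredictable fluctuation per reading

readings = [true_value + bias + random.gauss(0, noise_sd) for _ in range(1000)]

# Averaging many readings removes the random error but leaves the bias:
# the mean sits near 72.0, not the true 70.0
print(round(statistics.mean(readings), 1))
```

This is why calibration (against a gold standard) targets systematic error, while taking repeated measures targets random error.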
41
Q

Give an example: Types of reliability errors

A
  • Test-retest reliability
  • Intra-rater reliability
  • Inter-rater reliability

42
Q

Define: Test-retest reliability

A

Establishes that a measurement instrument is capable of obtaining the same results with consistency

43
Q

Define: Intra-rater reliability

A

The stability of data recorded by one individual across two or more trials

44
Q

Define: Inter-rater reliability

A

The stability of data recorded by more than one individual in one trial

45
Q

Define: Correlation

A

Measures the strength of association between two variables; a common measure of association
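A common instance of the association measure above is Pearson's r, which runs from -1 to +1. A small self-contained sketch (the data values are made up):

```python
import math

def pearson_r(x, y):
    """Pearson correlation: strength of linear association, from -1 to +1."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# A perfect positive linear relationship gives r = 1.0
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 6))   # 1.0
# A perfect negative relationship gives r = -1.0
print(round(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]), 6))   # -1.0
```

Values near 0 indicate little linear association; the same computation underlies test-retest and inter-rater reliability statistics.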
46
Q

Name the 3 types of validity

A
  • Measurement validity
  • Internal validity
  • External validity

47
Q

Define: Measurement validity

A

The degree to which a test actually measures what it is meant to measure

48
Q

(Measurement Validity) Name the 3 reasons we measure things

A
  • Discriminate between individuals
  • Evaluate change in magnitude or quality
  • Predict: make useful and accurate predictions or diagnoses about a patient/client/group

49
Q

Define: Validation

A

The degree of confidence we have in the inferences (results) we draw from test measurements

50
Q

(Measurement Validity) Name the 4 ways you determine the type of validity

A
  • Face validity
  • Content validity
  • Construct validity
  • Criterion validity

51
Q

(Measurement Validity) Define: Face validity

A

People think that it measures what it is supposed to measure; the lowest form of validity (a subjective assessment based on the personal opinions of people giving or taking the measurement)

52
Q

(Measurement Validity) Define: Content validity

A

The ability of an instrument to represent all content areas of importance in a test; usually assessed by a panel of experts

53
Q

(Measurement Validity) Define: Construct validity

A

The test correlates with other tests that measure the same thing (and does not correlate with measures of other things)

54
Q

(Measurement Validity) Define: Criterion validity

A

Definition: the measurement can be used as a substitute for an established, or ‘gold standard’, measure of the same thing
Easier definition: measuring with our scale against a ‘gold standard’ scale (a proven scale for measuring a specific condition)

55
Q

Describe the difference between the 4 types of validity

A
  • Face, construct and content validity: determined by logical argument
  • Criterion validity: determined by direct testing

56
Q

Explain Reliability and Validity

A
  • A measure can be reliable but not valid
  • A measure cannot be valid if it is not reliable
57
Q

What types of research use Measurement Validity?

A
  • Aetiology studies
  • Intervention studies
  • Prognosis studies
  • Diagnostic test studies

58
Q

Define: Internal Validity

A

How much we can trust that research conclusions are correct and true

59
Q

Define: External Validity

A

The extent to which the results of research can be generalised to other samples or situations (generalisability)

60
Q

What does it mean when there is strong internal validity?

A

We can trust that the conclusions are true

61
Q

(Threats to Internal Validity) Explain: Placebo effect

A

Improvement due only to experiencing an intervention or event, whether real or not

62
Q

(Threats to Internal Validity) Explain: Hawthorne effect

A

Improvement due to participants being studied, not the intervention

63
Q

(Threats to Internal Validity) Explain: Natural Maturation/Recovery

A

The condition improves irrespective of the treatment
64
Q

(Threats to Internal Validity) Explain: Regression to the mean

A

Patients with episodic disease present for intervention when the condition is severe, but from there fluctuations are likely to be less severe
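The effect above can be demonstrated by simulation: if severity fluctuates around a stable mean and patients present only when a reading is high, the follow-up reading tends to fall back toward the mean with no treatment at all. All numbers below (mean severity 5, fluctuation 2, presentation threshold 8) are invented for illustration.

```python
import random
import statistics

random.seed(0)
TRUE_MEAN = 5.0  # hypothetical long-run severity of an episodic condition
SD = 2.0         # size of visit-to-visit fluctuation

# Each reading is the stable mean plus a random fluctuation
first_readings = [TRUE_MEAN + random.gauss(0, SD) for _ in range(10_000)]

# Patients present for treatment when severity looks high (> 8 here)
presenting = [r for r in first_readings if r > 8]

# The follow-up reading is just another draw around the same mean
followups = [TRUE_MEAN + random.gauss(0, SD) for _ in presenting]

# Presentation readings average well above 8; follow-up readings drift
# back toward 5 even though no treatment was given
print(round(statistics.mean(presenting), 1))
print(round(statistics.mean(followups), 1))
```

This is why an uncontrolled before-and-after comparison can make an ineffective treatment look effective; a control group absorbs the regression effect.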
65
Q

(Threats to Internal Validity) Explain: Process of treatment

A
  • Politeness and positivity
  • Psychological effects between healer and patient