Quiz 2 Flashcards

1
Q

Types of Inquiry in Poli Sci

A

Application of scientific methods to understanding power dynamics in politics (about resources and policies)

2
Q

Definition of Political Science

A

The scientific study of politics

H. Lasswell: "who gets what, when, and how"

The discipline is divided into several subfields

3
Q

Characteristics of normative questions? (7)

A
  1. "should, ought"
  2. how things should be done
  3. more opinionated
  4. much more difficult to measure
  5. value-based
  6. the use of general principles, persuasion, and logic
  7. a source of debate
4
Q

Characteristics of empirical questions? (7)

A
  1. fact-based
  2. observing, then explaining things as they really are
  3. based on testing
  4. descriptive
  5. focused on measurement
  6. use the scientific method
  7. cannot draw the same inferences due to moral differences
5
Q

What about bias?

A
  1. We need to be aware of our bias: in our methods, in how we answer, and in the questions we leave out
  2. It is problematic if there is no diversity of opinions among researchers
6
Q

Aims of empirical analysis in Political Science (4)

A
  1. Contextual description:
    to examine a subject and know more about it than the average level of knowledge; not engaged in any generalization; becoming the expert on that specific subject
  2. Classification and measurement:
    categorize things into groups; draw distinctions between certain types; not making causal arguments; to understand variance
  3. Hypothesis testing:
    the hypothesis needs to be as specific as possible; the origins of hypotheses: identify a problem, look at other people's research; need to be a good observer of the world
  4. Prediction:
    when you are very certain about a certain event; very rare in poli sci
7
Q

Why is Political Science a probabilistic science?

A

Because you cannot predict phenomena with 100% certainty

8
Q

What was the IV and DV in Michael Moore’s video?

A

IV: Marilyn Manson, bowling
DV: gun violence

9
Q

Basic research?

A

when we go beyond the surface; to advance knowledge

10
Q

Applied research?

A

focused on specific problems but not in-depth; aims to maximize effectiveness and efficiency in the short term

11
Q

Inductive research(broad)? (2)

A
  1. data to theory, progression from empirical evidence to generalization
  2. begin with an open mind
12
Q

Deductive research(narrow)? (2)

A
  1. general to specific, set out to test hypotheses and theory in the real world
  2. assumptions = logic or pre-existing research
13
Q

Hypothesis(def.)?

A
  1. a statement relating two variables
  2. no normative statements

14
Q

Proposition?

A

a statement that has to be either true or false

15
Q

Characteristics of a hypothesis?(5)

A
  1. relationship
  2. comparison
  3. direction(+ or -)
  4. testability
  5. unit of analysis
16
Q

causality?

A

A causes B

17
Q

Temporal order?

A

the cause occurs before the effect; one event occurs in reaction to a prior event

18
Q

Continuum?

A

Ability to classify variables that can be ordered or ranked

19
Q

How can I classify variables?(2)

A

Ideal type

Typology: different types of things (political views: socialist, communist, capitalist)

20
Q

Multivariate?

A

more than one independent variable

21
Q

Spurious relationship?

A

controlling/holding variable C constant causes the relationship between A and B to disappear
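The definition above can be illustrated with simulated (hypothetical) data: a lurking third variable C drives both A and B, so A and B correlate even though neither causes the other; removing C's linear effect ("holding C constant") makes the A-B association vanish. A minimal Python sketch:

```python
import random

random.seed(42)

def pearson(x, y):
    # plain Pearson correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def residuals(y, x):
    # remove the linear effect of x from y (simple OLS slope)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return [b - (my + slope * (a - mx)) for a, b in zip(x, y)]

# C is the lurking variable that causes both A and B
C = [random.gauss(0, 1) for _ in range(5000)]
A = [c + random.gauss(0, 0.5) for c in C]
B = [c + random.gauss(0, 0.5) for c in C]

raw = pearson(A, B)                    # looks like a strong relationship
controlled = pearson(residuals(A, C),  # hold C constant: the A-B
                     residuals(B, C))  # association disappears
print(round(raw, 2), round(controlled, 2))
```

The raw correlation is strong while the controlled one is near zero, which is exactly the signature of a spurious relationship.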

22
Q

Ecological fallacy?

A

drawing inferences about individuals from group-level (ecological) data; do not project ecological characteristics onto individual behaviours

23
Q

Intervening variables?

A

Variables that impact the causality flow/variation

24
Q

Reinforcing variables?

A

a variable that strengthens the relationship between A and B

25
Multiple independent variables?
The assumption of independence between the causal (independent) variables may not reflect the true relationship between variables in the real world.
26
How scientific is Michael Moore? (3)
1. he doesn't show his entire data (showing chosen excerpts and short clips)
2. not balanced
3. not condensing
27
Intersubjectivity(2)? And why it is needed?
1. requires more than one observation
2. scientific process = replication; knowledge cannot be created with one person/one piece of research
28
Essence of Scientific Method?(3)
1. not about common sense or intuition, but objective observation (empiricism) 2. Impartiality 3. Intersubjectivity and Replicability
29
What researchers should do? (essence of scientific method)(2)
1. researchers have to keep their own beliefs outside the research
2. they should not fear the reaction of the public
30
DA/RT Initiative?
1. data access, research transparency and analytic transparency 2. need to take other experiences into account
31
Scientific Method Graph
RQ -> Theory -> Hypotheses, Operationalization, Research Design -> Observation -> Reformulation, Generalization, Data analysis -> cycle
32
How to formulate a good research question?
The broadest stage at which you can approach a project. You have to be passionate and curious about the project (suitable); it has to be feasible.
1. Relevance/importance
2. Examining today's political developments
3. A topic that sparks curiosity
4. Awareness of outsiders
33
What is a theory?
It goes a step further, it is a potential explanation of a political phenomenon through logically related propositions(statements). It reveals the direction of the research question
34
How can a theory be formulated?
Induction: bottom-up approach, making generalizations based on observation
Deduction: top-down approach, starting from a theory and deriving empirical implications from that theory
35
How to link a research question to a theoretical framework(theory)?(5)
1. look for potential problems with the theories used: do we see other variables the theory did not include?
2. take a famous theory and apply it to a (new) set of cases for possible new insights
3. if there are outliers (cases that don't seem to fit the theory very well, e.g. the role of Nevada), ask what we learn about the theory
4. replicate an existing theory and test it with a new set of measures
5. soak, poke, and observe to find a new theory for an unexplained phenomenon
36
dummy variable
a variable whose answer is plain: yes or no (coded 1 or 0)
37
common errors in hypotheses (6)
1. the statement fails to specify how the variables are related
2. has only one variable
3. is vague
4. is incomplete or improperly specified
5. uses tautologies
6. uses proper names or value judgements
38
Correlation vs. Causation
Correlation : related | Causation : cause
39
What does the Potential Outcomes Framework do?
helps us visualize what the problems are (many variables interfere)
40
Causal inference problems? Give examples
1. Reverse causality
2. Spurious relationship (selection effects can cause it)
3. No relationship
41
Why is an experimental design so advantageous for addressing causality?
experiments are very good at excluding other factors from our consideration; this elimination procedure is controlled by us
42
Difference between a test group and a control group?
The test group is exposed to the independent variable (the treatment) while the control group is not
43
Types of Research Designs? (4)(not in quasi-experimental research)
1. Observation without control group
2. Natural experiment without pre-measurements
3. Natural experiment
4. True experiment (random and equal assignment)
44
Regression to the mean?
a dependent variable measured only once can show signs of error; extreme scores are not constant and tend to move back toward the average when measured again
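A quick simulation with hypothetical data shows the idea: cases that score extremely high on a first noisy measurement tend to score closer to the average on a second one, even though nothing about them changed.

```python
import random
from statistics import mean

random.seed(5)
true_score = [random.gauss(0, 1) for _ in range(20000)]
t1 = [t + random.gauss(0, 1) for t in true_score]  # noisy first measurement
t2 = [t + random.gauss(0, 1) for t in true_score]  # noisy second measurement

# Pick the cases that scored extremely high the first time around
high = [i for i, v in enumerate(t1) if v > 2]
first = mean(t1[i] for i in high)
second = mean(t2[i] for i in high)
print(round(first, 2), round(second, 2))  # the retest drifts back toward 0
```

This is why a pre-test measured only once is an unreliable baseline: part of any extreme score is measurement error that will not repeat.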
45
best way to eliminate a third variable that may affect the causation is?
To separate the subjects into smaller subgroups and run the same analysis within each group.
46
Types of Research Designs in Quasi-experimental research?
There's no random assignment.
1. Post-test: no comparison
2. Post-test with control group: one group is exposed to the IV
3. Pre- and post-test: compare a case to itself
4. Pre- and post-test with control group: compare 2 cases
5. Dosage design: compare cases with different manipulation strengths of the IV
47
What logic do we have to use for observational studies?
experimental logic
48
How to test causality? Steps
1. Showing a correlation
2. Excluding other factors
3. Temporal order
4. Control group (if possible) OR randomization of assignment or equivalent (matched control, if possible)
5. Need a causal mechanism
49
Special Features of the Hersh Study that ensure causality flow : (4)
1. Documented data versus subjective data (surveys and self-reports)
2. As-if randomization through 9/11 victimhood (similar to a lottery)
3. Uses pre- and post-data
4. Creation of a control group that is VERY similar (geographic linkage to victims and full matching)
50
As-if randomnization is? (3)
1. subjects do not self-select into treatment and control groups
2. assignment to treatment and control groups is plausibly uncorrelated with alternative explanations
3. lower internal validity than truly random assignment
51
Example of a Natural experiment
9/11 victimhood | the Vietnam draft (birth-date) lottery
52
How often do governments use randomization?
Rarely
53
Internal validity? (2)
1. experiments are better for this type of validity | 2. the study is properly set up to determine whether the independent variable has a causal effect on the dependent variable
54
External validity? (2)
The results of the study can be generalized to the real world or beyond a case
55
Lab experiments: advantages(5) and weaknesses(3)
Advantages:
1. Researcher in full control
2. Complete randomization into treatment and control groups
3. Good for internal validity
4. Relatively easy to replicate
5. Often cheaper and less time-consuming than field experiments
Weaknesses:
1. Artificial environment: low realism
2. Demand characteristics: participants are aware of the experiment, so behaviour may change
3. Experimenter effects: bias when the experimenter's expectations affect their behaviour
56
Field experiments: advantages(3) and weaknesses(4)
Advantages:
1. ppl behave more naturally = high realism
2. easier to generalize
3. ppl often do not know they are being studied
Weaknesses:
1. does not use consent; not ethical
2. weak control of competing variables
3. time-consuming and costly
4. participation varies
57
Types of field experiments
canvassing experiments, civic course experiments, vote compass experiments, mock elections, evaluating programs/policies
58
Why do randomized controlled experiments make it possible to isolate causal effects?
We infer causal effects from our observations.
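A sketch of the logic, using simulated potential outcomes (all numbers hypothetical): we can only ever observe each subject in one state, but random assignment makes a simple difference in group means recover the true average effect.

```python
import random

random.seed(1)
N = 10_000
TRUE_EFFECT = 2.0

# Potential outcomes: what each subject WOULD score untreated vs treated
untreated = [random.gauss(10, 3) for _ in range(N)]
treated = [y + TRUE_EFFECT for y in untreated]

# Random assignment: a fair coin decides which state we actually observe
assign = [random.random() < 0.5 for _ in range(N)]
obs_t = [treated[i] for i in range(N) if assign[i]]
obs_c = [untreated[i] for i in range(N) if not assign[i]]

# Difference in means between the randomly formed groups
estimate = sum(obs_t) / len(obs_t) - sum(obs_c) / len(obs_c)
print(round(estimate, 1))
```

Because the coin flip is unrelated to anything about the subjects, the two groups are alike on average, and the estimate lands close to the built-in effect.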
59
fundamental problem of causal inference
we cannot observe subjects in both their treated and untreated states, because each subject lives through a given moment only once
60
2 key characteristics of the experimental method
planned intervention by researcher and random assignment
61
What is the nature of observational studies?
they are passive
62
How are internal and external validity related?
Internal validity strengthens throughout the research, and external validity will follow.
63
Survey Experiments: advantages(3) and weaknesses(2)
Advantages:
1. Substantive questions (important to the study)
2. Can reach more people via the internet (heterogeneous samples)
3. Brings greater generalizability of the results
Weaknesses:
1. May differ from a real-life setting
2. Subjects' perceptions of issues/questions may differ from the researcher's
64
Practical and ethical limits of experimentation?
1. To test human nature (e.g. rationality) is unethical
2. To believe in total control is impractical
3. To withhold true information is unethical
4. To violate one's grounds of equity and fairness is unethical
65
Operationalization + concerns
Movement from an abstract concept to a concrete measure.
Concerns: potential conflicts and controversies around the measurement.
66
Concepts ?(def. + 3)
A concept is an idea or term that enables us to classify phenomena.
1. can be concrete or abstract
2. categorical concepts have different characteristics
3. continuous concepts have sequentially connected characteristics (a continuum)
67
Variables(when operationalizing)?(3)
1. transforming our conceptual idea into a quantifiable, observable phenomenon
2. unlike concepts, a variable can take on different values
3. the variable empirically captures the variation within the concept
68
Indicators?
assigning each individual case to a value of the variable
69
What do multiple variables and indicators do?
help understand the concept/variable more
70
Level of measurements(3)?
1. Nominal-level variables: categories cannot be ordered or ranked
2. Ordinal-level variables: categories are ranked relative to the position of other categories and organized along a continuum, because we do not know the distance between the categories (agree/disagree questions are useful for this level of measurement)
3. Interval-level variables: categories are placed on a continuum and separated by a standard unit
71
Issues of accuracy
Measurement Validity and Reliability
72
Measurement Validity + categories of validity (5)
Measures need to be appropriate and complete.
1. Face validity: understandable to any reader (logical on the surface)
2. Convergent validity: compares indicators designed to measure the same variable
3. Discriminant validity: compares indicators designed to measure opposite variables (the two should yield different results)
4. Predictive validity: the ability to predict an outcome of a certain variable
5. Perfect validity: an impossible ideal
73
solution to perpetual existence of measurement validity problem
use multiple variables and indicators
74
Reliability
if the measure is consistent regardless of circumstances
75
Reliability does not ensure validity
Random errors: the measure is inaccurate and the inaccuracy is not systematic (results vary unpredictably, so the measure is not reliable)
Non-random errors: the measure is inaccurate but the inaccuracy is systematic (results are consistent, so the measure can be reliable without being valid)
76
Logic behind and creation of scales and indexes
It acts as a complex of multiple indicators; quantifying the conceptual definition makes the meaning more comprehensible. Combining indicators into indexes pinpoints which indicators have the strongest reliability.
77
Cronbach's alpha
examines the elements used in the construction of an index; the alpha score runs from 0 to 1 (1 being the most reliable); researchers usually drop a measure from an index if its alpha is below 0.7
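The statistic can be computed directly from the item scores. A minimal sketch for a hypothetical 3-item index, using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score):

```python
from statistics import pvariance

def cronbach_alpha(items):
    # items: one list of scores per index item, all the same length
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]      # each case's total score
    item_var = sum(pvariance(col) for col in items)   # sum of item variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Three hypothetical survey items that track each other closely
items = [
    [4, 5, 3, 4, 2, 5, 1, 4],
    [4, 4, 3, 5, 2, 5, 2, 4],
    [5, 5, 2, 4, 1, 5, 1, 3],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))  # well above the common 0.7 retention threshold
```

Items that move together inflate the variance of the total score relative to the summed item variances, which is what pushes alpha toward 1.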
78
question design: questions have to =
1. use neutral language
2. be clear
3. avoid response sets (yea-sayers/nay-sayers will follow a pattern)
4. keep response categories mutually exclusive and exhaustive ("don't know")
5. select the highest level of measurement
6. pay close attention to question order
7. minimize defensive reactions
79
Value of the case study's thorough examination of a research topic :
Detailed analysis of a single discrete phenomenon, which begins from the observation of something counter-intuitive.
Counter-intuitive: a condition that occurs when a situation, event, or outcome differs from dominant theoretical expectations or common sense.
A qualitative approach that examines specific events on a small scale, with fewer variables.
80
Descriptive Case Study (6)
Used when a phenomenon is completely novel (fresh) or unknown; emerges from new information.
Goal: to describe the phenomenon as the basis for an emerging or future research agenda.
An open-minded researcher follows the unfolding event with a sharp eye, looking for new variables and connections between variables, and must state scope conditions.
Scope conditions: the limits within which particular research makes valid claims; they restrict insights to the phenomenon and curtail the ability to generalize to large-n research.
Because it stays so close to a single phenomenon, it is difficult to make generalized statements from it.
81
Theory Testing/Modification Case Study(4 + 2 kinds of study)
Used when a phenomenon defies a theory's expectations.
1. Failed most-likely study: a case expected to confirm the theory instead refutes it; offers the need to rethink the theory's claims
2. Successful least-likely study: a case expected to refute the theory instead confirms it; offers the need to relax/rethink the scope conditions, since the theory explains more than its proponents (the original advocates of the theory) claim
Important distinction: the role of falsification = empirical refutation of a theoretical proposition
Goal: theory modification
82
Considerations for Case Study Research (4)
1. Clear definitions of the subject and object of the case study
2. Does the case study have rigorous and clear conceptualization? Case studies are well suited to provide conceptual refinement, whereas statistical research can be at risk of conceptual stretching (using more general conceptual definitions to increase sample size)
3. Does the case study employ process tracing? Process tracing: the primary means by which case study research generates causal reasoning; a causal pathway is established that leads to the current outcome
4. Applicable generalization to a wider population; if it is undergeneralized, it can fail the "So what?" test
83
Benefits of the comparative approach :
small-n research that systematically contrasts a number of cases in order to create stronger generalizations (allowing greater explanatory power and prediction, which broadens our knowledge of the political world)
84
Issues of the comparative approach :
1. Not all political units are suitable for all research questions
2. Greater sample randomization = greater sample errors, but reducing sample errors in a smaller sample makes the selected cases less representative
3. The logic of random sampling doesn't hold for such small populations; purposive sampling allows the researcher to use specific knowledge of systems in order to choose political units that could lead to more fruitful comparisons
85
Most-similar-system design :
The similarity of the cases means we control for many explanations; one differing factor leads to different outcomes
86
Most-dissimilar-system design :
Takes vastly dissimilar systems and attempts to explain commonalities b/w them
87
Selecting an appropriate Comparative Research Design + Galton's problem
Must be careful about the operationalization of variables; must be aware of the social and political context, because the operationalization of a variable differs from one culture to another.
Goal: equivalent measures, not necessarily identical ones; the aim is to measure a concept using indicators appropriate to different contexts.
Galton's problem: the researcher must ensure that the units under observation are independent of one another; diffusion of cultural norms and experiences makes cultural comparisons more difficult, mostly for neighboring countries (US-Canada) or culturally similar countries (France-Canada (Quebec, New Brunswick))
88
Theory of sampling
Sampling: the process of drawing a sample of cases from a larger population, selecting a number of cases for further study
89
Logic of drawing representative samples from larger populations
using a sample costs less and takes less time, researchers are able to monitor data collection due to the study's smaller scale
90
Quantitative research
Seeks to measure population characteristics in numeric terms(population parameter) uses probability sampling
91
Qualitative research
Often have smaller sample, but seeks to measure population characteristics Uses non-probability sampling
92
Practical techniques for drawing samples | Representativeness is important to make generalizations; it is determined by 3 related factors:
1. Sampling frame: a list of all the units in the target population (accuracy and completeness = no missing cases or inaccurate info); a complete frame is rare, because records are incomplete and subject to change, and not all target populations have a listing (direct or indirect)
2. Sample size
3. Sampling technique, which can be divided into two categories:
Probability sampling: based on probability theory; allows researchers to use inferential statistics to test representativeness
Non-probability sampling: not based on probability theory; researchers can't use statistical analysis to make inferences
93
What does Probability sampling do:
gives confidence in regards to accurate representation of the population
94
What is simple random sampling?
process by which every case in the popu. is listed and the sample is selected randomly from this list
95
What is sampling distribution?(2)
All the sample means for a given sample size | the sampling distribution of means is created by totaling the number of combinations
96
Confidence interval :
the range of values within which the population parameter is likely to fall
97
sampling error :
the difference b/w the sample statistic (estimated value) and the population parameter (actual value); when using probability sampling techniques, sampling error is reduced
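Both ideas can be demonstrated on a simulated population (all values hypothetical): draw one sample, compute the sampling error, and build a 95% confidence interval around the sample mean.

```python
import random
from statistics import mean, stdev

random.seed(7)
population = [random.gauss(50, 10) for _ in range(100_000)]
param = mean(population)            # population parameter (actual value)

sample = random.sample(population, 400)
stat = mean(sample)                 # sample statistic (estimated value)
sampling_error = stat - param       # usually small, rarely exactly zero

se = stdev(sample) / len(sample) ** 0.5       # standard error of the mean
ci = (stat - 1.96 * se, stat + 1.96 * se)     # 95% confidence interval
print(round(sampling_error, 2), ci[0] <= param <= ci[1])
```

Across repeated samples, about 95% of such intervals would contain the true parameter, which is the sense in which the parameter is "likely to fall" inside it.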
98
3 factors of sample size:
1. The heterogeneity or homogeneity the sample has to capture
2. The number of variables
3. The desired degree of accuracy
99
Types of sampling methods : (3)
1. Systematic selection: selection interval (1/k), e.g. 5% = 1/20, so 1 case out of every 20, starting from a randomly chosen starting number; more practical and efficient than simple random sampling
2. Stratified sampling: breaking the population into mutually exclusive subgroups (strata) and randomly sampling within each group; disproportionate stratified sampling is used to deal with population variances: an oversampled stratum (n*%) is no longer representative, so we assign weights to respondents (proportionate population / sample size)
3. Cluster sampling: dividing the population into a number of subgroups (clusters) and randomly selecting clusters within which to randomly sample; considers geographic units (regions). Advantage: cost reduction, increased efficiency. Disadvantage: may seem unrepresentative on face value (to the public eye)
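The three methods can be sketched on a hypothetical sampling frame of 1000 numbered cases (the stratum labels and cluster sizes below are made up for illustration):

```python
import random

random.seed(3)
frame = list(range(1000))  # hypothetical sampling frame

# Systematic selection: interval 1/k (here 5% -> k = 20), random start
k = 20
start = random.randrange(k)
systematic = frame[start::k]

# Stratified sampling: split into strata, randomly sample within each
strata = {"urban": frame[:700], "rural": frame[700:]}
stratified = [case for s in strata.values() for case in random.sample(s, 25)]

# Cluster sampling: randomly pick clusters (e.g. regions), sample within them
clusters = [frame[i:i + 100] for i in range(0, 1000, 100)]
chosen = random.sample(clusters, 3)
cluster_sample = [case for cl in chosen for case in random.sample(cl, 20)]

print(len(systematic), len(stratified), len(cluster_sample))
```

Note the practical trade-off the card describes: systematic and stratified draws touch the whole frame, while cluster sampling only visits a few randomly chosen regions, which is what cuts cost.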
100
Types of Non-Probability Sampling (5)
1. Accidental sampling: the researcher "accidentally" encounters convenient individuals
2. Self-selection: participants are limited to those who opt in; not really representative
3. Purposive (judgmental) sampling: cases are chosen according to criteria
4. Snowball (network) sampling: often employed to study social networks; the researcher asks for further referrals until the growing sample reaches its logistical and financial limits
5. Quota sampling: accidental/purposive sampling combined with stratification
101
(Sampling)non-random selection of cases =
no margins of error or confidence intervals
102
The Problem of Cross-level Inference
an attempt to make inferences about one unit of analysis with data from another unit of analysis, e.g. the ecological fallacy
103
Most effective way to increase voter turnout
1. Face-to-face canvassing (up to 30%)
2. Volunteer phone banks (3-5%)
3. Commercial phone banks
4. Direct snail mail
5. Targeting via social media (60 million ppl, 6%) (might have ethical issues depending on the subject of the experiments)
6. E-mail
104
Unit of analysis in Empirical Research
The unit of analysis in a hypothesis specifies what type of actor the hypothesis applies to e.g. individuals, countries, etc.
105
What is a case?
(def. by John Gerring) a case study is the intensive study of a single case where the purpose of that study is, at least in part, to shed light on a larger class of cases (a population)
106
Characteristics of case studies (5)
1. explaining a complex theory
2. developing new classifications or concepts
3. picking specific "deviant cases"
4. using the case study to generate hypotheses or to look at causal mechanisms
5. inferences based on one case are less secure
107
Difference b/w descriptive case study and observational studies
1. the IV is not clear; the RQ is wide open
2. complex, named units; variables emerge during research
3. deliberate choice of cases
4. temporal sequences
5. low external validity, and internal validity depends on the case
6. probabilistic nature: cannot specify the size of the effect
108
Risk in single case study
1. greater risk of researcher bias
2. luck may validate or invalidate the study
3. results can deviate because of a faulty measurement
4. a single negative case cannot invalidate a probabilistic theory
109
Textual analysis + use
The systematic examination of the messages and meanings conveyed by texts.
Text: any form of communication that features content (words, symbols = has a message).
Use: to define/understand the ideas, goals, motivations, and activities of politicians, political organizations, and institutions; sheds light on political issues and events.
110
Content analysis and discourse analysis + disadvantage(1)
Content analysis: quantitative; used to explore message characteristics.
Discourse analysis: qualitative; seeks the text's meaning as reflected in its content (which in this case is called discourse).
Disadvantage: both leave important questions unanswered.
111
Features(2)
Structural features: focus on the communication's format and the content's presentation.
Substantive features: focus on what is said and what is meant; content conveys particular meanings, norms, and assumptions.
112
Type of content analysis(2)
Manifest content: the surface meaning of the subject (easier; quantitative)
Latent content: the underlying or implied meaning (qualitative)
113
Strengths + Weaknesses (7)
1. A methodological tool for many approaches to poli sci research
2. Used as part of mixed methods
3. The availability of text (abundance or inaccessibility) is both a strength and a weakness
4. Is objective
5. Key assets: reliability and validity
6. Quantitative analysis is rigorous, but important messages may be ignored because of reductionist tendencies (marginalized texts)
7. Qualitative analysis can't disaggregate meanings, only unified whole meanings (bias in underlying messages or in identifying patterns)
114
Ethical considerations:
It is unobtrusive and non-reactive; texts are part of the public domain. If it involves private matters, ethical concerns (consent and confidentiality) are raised.
115
Interview: definition and 5 basic steps
One person asks questions and the other responds; an effort to obtain necessary data by promoting discussion.
Requires understanding of the political process or context to be credible and to avoid missteps.
Obtains very detailed, often private, otherwise inaccessible data.
Contact developed = trust. Is time-consuming. Is reactive (desire to self-promote/self-protect), which can lead to misleading answers, lying, or presenting false information thought to be true.
Steps:
1. Select the kinds of individuals that best suit the interview
2. Contact potential respondents and request an interview (helpful people first, hostile/busy people last)
3. Clearly know the data you are seeking by making an interview framework of the questions you want to ask (positive and neutral)
4. Pre-schedule the interview to have enough time
5. Be aware of body language and take notes even if you are recording
116
Focus group: definition and one basic step
Enables researchers to probe beneath the surface of public opinion (why people dislike something or not) and provides context and a community perspective on broader issues.
Discussion at length and in depth of the same topic in a structured conversation for an extended period (1-2 hours).
A time- and money-efficient method for obtaining data from multiple participants; understanding attitudes more than measuring them.
Participants have to be knowledgeable, willing, and capable of communicating; the facilitator has to obtain their trust (skillfully) so participants won't conform to a dominant thinking in the group.
Should have enough participants to yield diversity (6-12) but not so many that people get intimidated.
The researcher does not always anticipate how the discussion will turn out.
Has a moderator (who must keep the group focused on the topic) and a note-taker; sessions are video-taped. Food facilitates pre-session conversation.
Step: the moderator gives a short introduction stating the purpose of the meeting.
Complements quantitative data.
117
Observation research:
Observes actual behavior rather than relying on reported behavior (which may be biased by the researcher or the participants); known as ethnography.
Interest: events occurring in natural circumstances; pays attention to context, cultural setting, and power relations; inductive and exploratory.
Obtrusive observation = subjects are aware that they are being observed; if participants are told they are observed, the Hawthorne effect can occur.
Participant observation: the researcher becomes part of the community being observed (context-driven over structure; time-consuming but necessary for valid data).
Association = physical risk and damage to the researcher's credibility; entrenchment in a group can deviate the researcher from the task.
Field notes = data.
118
Question types in Interviews, Focus groups and Observations:
Interviews: promote discussion, not simple yes-no questions
Focus groups: open-ended questions; the framework is more flexible
Observations: an observation schedule for more structured observation (a checklist of behaviors; indicators exclusive and exhaustive)
119
Ethical issues in a qualitative research
1. Protect the identity of participants for their safety
2. Interviews and focus groups: researchers have to state their intentions (data collection and anonymity)
3. Focus groups should not disclose sensitive information
4. Be an honest person: a good reputation and a competent, knowledgeable, energetic researcher are likely to produce something significant