Exam 1 Flashcards

(132 cards)

1
Q

What is a claim?

A

An assertion made by a speaker or writer that they want the audience to accept.

2
Q

What is evidence?

A

A reason offered in support of a claim; it is connected to the claim by a warrant.

3
Q

What is a warrant?

A

The reasoning that connects the evidence to the claim.

4
Q

What is backing?

A

Additional evidence to support the warrant when a counterargument can be made.

5
Q

What are the 5 Ways of Knowing?

A
  1. Personal Experience
  2. Intuition
  3. Authority
  4. Appeals to tradition, custom, and faith
  5. Magic, superstition, and mysticism
6
Q

Personal Experience…

A

Tends to be seen as the most trustworthy but is biased

7
Q

Intuition…

A

Knowing through perception (like seeing figures in clouds) ⇒ biased

8
Q

Authority…

A

Trust placed in other people (authorities)

9
Q

Appeals to tradition, custom, and faith…

A

Can lead to stereotypes because “it has always been like that”

10
Q

Magic, superstition, and mysticism…

A

Mysteries are used to explain the unexplainable

11
Q

What are the 6 Characteristics of Research?

A
  1. Research is based on curiosity and asking questions
  2. Research is a systematic process
  3. Research is potentially replicable
  4. Research is reflexive and self-critical ⇒ knows its limitations
  5. Research is cumulative and self-correcting ⇒ others can add
  6. Research is cyclical ⇒ continuous
12
Q

What is a systematic process?

A

5 step-by-step phases

13
Q

What are the 5 step-by-step phases?

A
  1. Conceptualization of what needs to be studied
  2. Planning and designing
  3. Methodologies
  4. Data analysis
  5. Reconceptualization of what was studied and learned
14
Q

What is proprietary research?

A

For a specific audience (e.g., a teacher for her own reflection)

15
Q

What is scholarly research?

A

For public access

16
Q

What are the 3 Academic Cultures of Research?

A
  1. Physical science
  2. Humanities
  3. Social or human science
17
Q

What is physical science?

A

biology, chemistry, physics

18
Q

humanities?

A

art, music, literature

19
Q

social or human science?

A

human behavior

20
Q

Communication overlaps with which 3 Academic Cultures of Research?

A

communication overlaps with all 3

21
Q

Positivist paradigm vs. naturalistic paradigms

A

how these paradigms approach the “ologies”: their ontological, epistemological, axiological, methodological, and rhetorical assumptions

22
Q

What is a paradigm?

A

a worldview

23
Q

positivist paradigm?

A

emphasizes the word science in social science by trying to use physical science methods to study human behavior

24
Q

positivist paradigm - ontological?

A

a singular, objective reality

25
positivist paradigm - epistemological?
there is an independent relationship between researcher and participants
26
positivist paradigm - axiological?
the researcher’s values and biases have no effect
27
positivist paradigm - methodological?
the preferred methods are deduction (from general to specific), cause-and-effect relationships, researcher-controlled settings, and quantitative methods
28
positivist paradigm - rhetorical assumption?
formal and impersonal
29
naturalistic paradigm?
emphasizes the word social in social science by trying to develop new methods to capture social behavior
30
naturalistic paradigm - ontological?
multiple realities
31
naturalistic paradigm - epistemological?
there is an interdependent relationship between researcher and participants
32
naturalistic paradigm - axiological
the researcher’s values and biases have an effect
33
naturalistic paradigm - methodological?
the preferred methods are induction (from specific to general), holistic understanding, natural settings, and qualitative methods
34
naturalistic paradigm - rhetorical assumption?
informal and personal
35
Definition of communication
the process by which verbal and nonverbal messages are used to create and share meaning. Making things common ⇒ information exchange perspective
36
Communication research
focus on messages and message creating behaviors
37
Definition of technical communication
the process of making technical messages accessible to a lay audience
38
Basic Research
-Nature of problem -Theory -Commonsense theories -Goals -Methodology
39
methodology of basic research
hypothesis testing
40
Goals of basic research
to increase knowledge of communication phenomena, because theories are ongoing and can always benefit from fine-tuning
41
Theory
a generalization made to explain why something happens
42
Nature of problem of basic research
research done to test a theory and make generalizations about communication
43
Applied Research
-Nature of problem: focus on a specific event or challenge rather than on making generalizations -Goals of action research -Social justice communication research -Methodology
44
Nature of problem for action research
research done to solve a problem
45
goals of action research
engaged not only in finding a solution but also in implementing it
46
social justice communication research
focus on the underrepresented
47
methodology of applied research
observe and test out a solution
48
Reasons for reviewing previous research
-To get an understanding of what you are studying by learning what others have said before -To find gaps in the research -To refine a research question -To design own study
49
What are scholarly research articles?
-Primary research reports -Published in journals that are run by professionals in each discipline -They have gone through the peer-review process
50
Primary research reports
the first reporting of a study by the people who conducted the study
51
How is research presented?
-Reading scholarly journal articles -They represent the most up-to-date research in the field -Meant to be read as a report of the findings from the study
52
Typical Quantitative Scholarly Journal Article
-Title -Abstract -Introduction -Literature review -Methodology -Results -Discussion -References
53
Title
present the topic and variables studied
54
Abstract
summary of the purpose of the study, methods, key findings, and contributions
55
Introduction
establishes the purpose and significance of the study
56
Literature review
an establishment of the previous work done by others
57
Research question/hypothesis
concludes the literature review
58
Methodology in an article
an explanation of how the study took place
59
Participants.
people or texts studied
60
Procedures
the step-by-step account of how the study was carried out
61
Data treatment
how data was analyzed
62
Results
summary of what data was collected
63
Discussion
interpretations of the results, problems and limitations are shared
64
References
list of sources
65
SPSS
software used to help a researcher identify patterns in data
66
data page
the page where the data are entered
67
variable page
where the labels for the data are added (e.g., age, major)
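SPSS itself is menu-driven rather than code-based, but the split between a data page and a variable page can be illustrated with a rough analog. A minimal sketch in Python/pandas (the column names, codes, and labels below are invented for illustration, not taken from the cards):

```python
import pandas as pd

# "Data page" analog: each row is a participant, each column holds a variable's raw values.
data = pd.DataFrame({
    "age":   [19, 21, 20, 22],
    "major": [1, 2, 1, 3],   # stored as numeric codes, as SPSS often does
})

# "Variable page" analog: labels describing what each column and each code means.
variable_labels = {
    "age":   "Participant age in years",
    "major": {1: "Communication", 2: "Biology", 3: "English"},
}

print(data.describe())              # a first look for patterns in the data
print(variable_labels["major"][1])  # -> "Communication"
```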
68
Conceptual definitions
-Dictionary-like definitions that describe a concept with other terms (e.g., argumentative = to debate) -abstract
69
Operational definitions
-meaning is constructed by defining what activities are needed to measure it (e.g., love = you do these nice things) -concrete
70
Measurement theory
-Determining how the variables will be observed -The process of determining changes within a variable in terms of size, characteristic, or quantity
71
Quantitative
use numerical values to determine the amount of something (e.g., 250 pounds)
72
Qualitative
use symbols to assign meaning (e.g., heavy)
73
triangulation
studying something in multiple ways within a single study
74
Methodological - triangulation
using multiple methods to study the same phenomenon
75
Data - triangulation
different sources for data collection were used
76
researcher - triangulation
multiple researchers collected and analyzed the data
77
Theoretical - triangulation
use multiple perspectives to interpret the same data
78
Levels of Measurement
NOIR
79
Nominal ⇒ classification
-mutually exclusive (can’t belong to multiple groups), equivalent, exhaustive -Examples ⇒ yes/no question, select from a checklist, open-ended questions then categorized -Pro ⇒ can lead to important findings -Con ⇒ can be limiting
80
Ordinal ⇒ rank order
-Fixed rankings from greater than to less than (like sibling ranking) -Pro ⇒ turns discrete classifications into ordered classifications -Con ⇒ can’t tell a researcher how much of a variable was measured (e.g., how much age difference there is between the siblings)
81
Interval (types) ⇒ Likert Scale
ratings by perception (e.g., -1, 0, 1, where 0 is not absence but a point on the scale)
82
Ratio ⇒ counts
0 means absence; no negative numbers
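The four NOIR levels can be made concrete with a small sketch. The people and variables below are hypothetical examples chosen to match the cards (a classification, a rank order, a Likert-type item, and a count):

```python
# Nominal => classification: codes are only labels, so arithmetic on them is meaningless.
major = {"Ana": "Communication", "Ben": "Biology", "Cy": "English"}

# Ordinal => rank order: order is known, but not how far apart the ranks are (sibling ranking).
birth_order = {"Ana": 1, "Ben": 2, "Cy": 3}

# Interval => Likert-type item: equal spacing, but 0 is a point on the scale, not absence.
likert_item = {"Ana": -1, "Ben": 0, "Cy": 1}

# Ratio => count: 0 means absence, and ratios are meaningful.
texts_sent = {"Ana": 0, "Ben": 24, "Cy": 48}

print(sum(likert_item.values()) / len(likert_item))  # 0.0; a mean makes sense at this level
print(texts_sent["Cy"] / texts_sent["Ben"])          # 2.0; Cy sent twice as many texts as Ben
```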
83
Unidimensional
indicators that can be added together toward a single, overall score
84
Multidimensional
concept is made up of independent factors
85
Measurement Methods
-Self-Reports -Other’s reports -Behavioral Acts
86
Self-Reports
-asking people to report on themselves. -Pro ⇒ This is a good way to learn people’s beliefs, attitudes, and values. -Con ⇒ people can provide inaccurate information if they can’t remember or may be biased
87
Other’s reports
-asking people to observe other people -Pro ⇒ may remove some biases (e.g., a professor will have more bias about the clarity of their own teaching than a student would) -Con ⇒ the person may not have enough knowledge of what they observe; doesn’t remove 100% of biases
88
Behavioral Acts
-the researcher observes a person’s behavior -Pro ⇒ can reveal if what they say matches what they do -Con ⇒ can’t show how people feel or think, or what interests them
89
Measurement Techniques
-Questionnaires -Interviews -Observations
90
Questionnaires
written questions that yield written responses
91
Questionnaires: What are they used for?
to measure variables
92
Interviews
verbal questions that yield verbal responses
93
Closed questions
provide participants with preselected answers
94
Open questions
participants use their own words to respond to questions
95
Directive questionnaires & interviews
predetermined set of questions
96
Nondirective questionnaires & interviews
respondent’s initial responses determine what they will be asked next
97
Observations
inspection and interpretation of behavior
98
Direct observation
researchers watch people engage with communication directly
99
Indirect observation
researchers observe communication artifacts
100
What is validity?
The degree to which a researcher is accurately measuring what they claim to be measuring
101
Internal validity
deals with the accuracy of conclusions made
102
External validity
deals with the generalizability of the findings from a study
103
The most valid studies are those that are …
high on both internal and external validity
104
What is reliability?
To be consistent and stable
105
Reliable
A reliability coefficient of .70 or higher (up to 1.0, which is perfect reliability)
106
What is measurement validity?
How well a researcher’s methods measure what they intend to measure
107
What is measurement reliability?
When what is measured is consistent and stable
108
Trust score component
the score if everything was perfect
109
Error score component
the deviation from the true score that accounts for the fact that people’s behavior changes and fluctuates
110
Techniques to assess reliability
-Multiple-administration techniques -Single-administration techniques
111
Multiple-administration techniques
test-retest method
112
Test retest method
administers the same procedures to the same people at different times
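Test-retest reliability is commonly quantified by correlating the scores from the two administrations. The card does not name a statistic, so the use of a Pearson correlation and the scores below are assumptions for illustration:

```python
from statistics import correlation  # requires Python 3.10+

# The same measure given to the same five people at time 1 and time 2 (hypothetical scores).
time1 = [12, 15, 9, 20, 17]
time2 = [13, 14, 10, 19, 18]

r = correlation(time1, time2)
print(round(r, 2))  # ~0.97 here; values near 1.0 mean the scores are stable over time
```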
113
Single-administration techniques
-split half reliability -Cronbach’s alpha -intercoder reliability
114
split half reliability
dividing the responses in half and checking that the 1st half is similar to the 2nd half
115
Cronbach’s alpha
every item is compared to every other item
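Cronbach’s alpha is usually computed from the item variances and the variance of the total score: alpha = (k / (k - 1)) * (1 - sum of item variances / variance of summed scores). A minimal sketch with made-up responses to a 3-item scale:

```python
from statistics import pvariance

# Rows = participants, columns = items of one scale (hypothetical responses).
items = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [3, 3, 2],
]

k = len(items[0])                                     # number of items
item_vars = [pvariance(col) for col in zip(*items)]   # variance of each item
total_var = pvariance([sum(row) for row in items])    # variance of the summed scores

alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 2))  # ~0.89 here; higher values mean the items hang together
```

This coefficient is what the “.70 or higher” rule of thumb on the reliability card refers to.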
116
intercoder reliability
coding is stable when the data are coded by multiple people, multiple times
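The simplest check of intercoder reliability is percent agreement between coders. The card does not name an index (chance-corrected indices such as Cohen’s kappa are often preferred), so plain agreement is shown here with invented codes:

```python
# Categories assigned to the same ten messages by two coders (hypothetical data).
coder_a = ["pos", "neg", "pos", "neu", "pos", "neg", "neu", "pos", "neg", "pos"]
coder_b = ["pos", "neg", "neu", "neu", "pos", "neg", "neu", "pos", "pos", "pos"]

agreements = sum(a == b for a, b in zip(coder_a, coder_b))
print(agreements / len(coder_a))  # 0.8 -> the coders agreed on 8 of the 10 messages
```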
117
Validity threats due to
-How the research is conducted -the research participants -The researcher effects
118
History
external factors that influence people’s behavior in the study
119
Sleeper effect
effects that take time to manifest
120
Sensitization
initial measurements influence the latter
121
How does data analysis impact validity?
improper procedures
122
How the research is conducted
-History -Sleeper effect -Sensitization -Data analysis
123
the research participants
-The Hawthorne effect -Selection -Mortality -Maturation -Interparticipant bias
124
The Hawthorne effect
if people are aware of a researcher’s intent, it will influence their behavior
125
Selection
the people selected should belong to the group for which they were selected (e.g., a 15-year-old should actually be 15)
126
Mortality
the loss of participants during the study
127
Maturation
changes within a participant that affect their behavior
128
Interparticipant bias
participants influence others in a study
129
Researcher personal attribute effect
when the researcher influences people’s behavior
130
The researcher effects
-Researcher personal attribute effect -Researcher unintentional expectancy effect -Researcher observational biases
131
Researcher unintentional expectancy effect
influence through indirectly informing people of the desired behavior
132
Researcher observational biases
when a researcher’s knowledge influences their observations by focusing on the desired outcome