Unit 3 Flashcards Preview


Flashcards in Unit 3 Deck (120):
1

Student variables that can influence the number of learn units delivered in a lesson.

IRT only applies to the fluency stage.

Response latency and IRT

2

Definition: Frequency of detectable responses that a student emits during ongoing instruction

Measure: Frequency count of academic responses per instructional period (e.g., rate/hour)

Most direct measure of student responding during instruction

Active student responding – ASR

3

“It has long been said that college
teaching is the only profession for which
there is no professional training. Would-be
doctors go to medical schools, would-be
lawyers go to law schools, would-be
engineers go to institutes of technology, and
would-be college teachers just start
teaching. Fortunately, it is recognized that
grade- and high-school teachers need to
learn to teach. The trouble is, they are not
being taught in effective ways.”
(American Psychologist, 39)

B.F. Skinner (1984)
The Shame of American Education

4

“…the advances which have recently
been made in our control of the
learning process suggest a thorough
revision of classroom practices and,
fortunately, they tell us how the revision
can be brought about…The modern
classroom does not, however, offer
much evidence that research in the
field of learning has been respected or used.”

B.F. Skinner (1968)
The Technology of Teaching

5

“We know how to build better
schools.”
(Recent Issues in the Analysis of
Behavior, p. 96)

B.F. Skinner (1989)
The School of the Future

6

“It is hard to keep your humor when
you accept the fact that you invested 25
years in developing methods that can
help your nation out of the educational
abyss into which it is racing. You made
these methods inexpensive. You made
them clear. You helped illustrate their
worth. You made them attractive. Yet
they are ignored or rejected because of
popular myth and bigotry.”
(Journal of Applied Behavior Analysis)

Ogden R. Lindsley (1968)

7

“With precision, rigor and results as
their hallmark, behavior analysts have
documented, in literally hundreds of
investigations and numerous specialty
journals, the scope and effectiveness of
their behavioral technology…In effect,
the relevance of these rigorous
methods is being seriously questioned
by many educators.”
(Journal of Applied Behavior Analysis)

Fantuzzo & Atkins (1992)

8

Was planned to be a large-scale
service delivery program
Was the largest-scale research
program on the effectiveness of
educational practices
Compared nine different approaches
to education; two were behavioral

Project Follow Through
(1968-1976)

9

The two behavioral approaches
clearly surpassed the other seven
approaches in all measures including
effectiveness
 Direct Instruction
 University of Kansas Behavior Analysis
Program
Despite all the evidence, schools are
still allowed to choose what approach
to use

Project Follow Through: Results

10

The story of the US Department of
Education “Researcher” who sat next to
me on a plane

An Example of the Failure of
Project Follow Through

11

Behavior analysis has evidence-based
technologies that result in better
educational outcomes in schools
 Educators should be trained to teach
 Educators should be required to use
evidence-based best-practices
 The public school system in the USA
has failed to adopt best-practices

Main Points

12

Available Time
Allocated Time
Instructional Time
Engaged Time
Academic Learning Time

Time and Learning

13

Total number of school days and hours
 Primarily used by teacher for
management and transition tasks, or
students’ passive attending to teacher
 Not related to academic achievement
 “Clearly, student learning depends on
how the available time is used, not just
the amount of time available.”
(Stallings, 1980)

Available Time

14

AVAILABLE TIME: 6 hours = 100%
The amount of time available for all school activities in a school day

Time and the School Day

15

Amount of time scheduled for
instruction
Not related to academic achievement

Allocated Time

16

AVAILABLE TIME: 6 hours = 100%
The amount of time available for all school activities in a school day
ALLOCATED TIME = 79%
The amount of time allocated for instruction in a content area

Time and the School Day

17

“How long does it take to get the show on the road?” (Stallings, 1980)
 Usually less than “Allocated Time”
 Number of minutes instruction is actually
delivered
 Not a measure of student behavior, only
presence during instruction
 Some impact on achievement
 Best if instruction is interactive (support and feedback)

Instructional Time

18

“Attending” to ongoing instruction
Not a measure of true learning
Increased “on-task” does not ensure
increased academic responses
(looking at teacher, silent reading,
independent seatwork, etc.)

Engaged or “On-Task” Time

19

The time that students actually spend
learning!
The amount of time successfully
engaged in academic tasks

Academic Learning Time

20

AVAILABLE TIME: 6 hours = 100%
The amount of time available for all school activities in a school day
ALLOCATED TIME = 79%
The amount of time allocated for instruction in a content area
INSTRUCTIONAL TIME = Less than 79%
The amount of time the teacher actually delivers instruction
ENGAGED TIME: Avg. = 42% Range = 25-58%
The amount of time the student is actively engaged in
learning tasks
ACADEMIC LEARNING TIME (ALT):
Avg. = 17%; Range = 10-25%
The amount of time successfully engaged in academic
tasks

Time and the School Day
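
For a concrete sense of scale, here is a minimal sketch (not from the course materials; it only assumes the 6-hour day and the average percentages shown on this card) that converts each category into minutes.

```python
# Minimal sketch: convert the average percentages on this card into minutes,
# assuming a 6-hour school day (the ranges on the card are ignored here).
AVAILABLE_MINUTES = 6 * 60  # 360 minutes = 100%

averages = {
    "Allocated Time": 0.79,          # 79%
    "Engaged Time": 0.42,            # avg. 42% (range 25-58%)
    "Academic Learning Time": 0.17,  # avg. 17% (range 10-25%)
}

for label, proportion in averages.items():
    minutes = AVAILABLE_MINUTES * proportion
    print(f"{label}: about {minutes:.0f} of {AVAILABLE_MINUTES} minutes")

# Academic Learning Time comes out to roughly 61 minutes -- about one hour of
# actual learning in a six-hour school day.
```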

21

We need to teach better
It is NOT how much time the student
spends in the classroom
It is NOT how much time the teacher
spends lecturing
It IS how much time the student
spends LEARNING

Bottom Line

22

Principles of learning
The operant as the basic unit
Interactive not passive
Measurement and evaluation of
educational outcomes
Developed and validated an effective
technology of instructional design
and instructional delivery (teaching)

The Role of Behavior Analysis
in Education

23

1. Be clear about what is being taught
2. Teach first things first
3. Stop making all students advance at essentially the same rate
4. Program the subject matter

Skinner (1984), The Shame of American Education,
American Psychologist, 39

24

5. Reconsider ABA Instructional Technology
methods in light of diverse student
populations, and suitability for the schools
and educators of today
6. Determine how to cause more durable
and extensive behavior change (e.g.,
academic performance, social skills)
7. Develop methods that teachers can and
will actually use

Fantuzzo & Atkins (1992),
Journal of Applied Behavior Analysis, 25

25

 School systems hire behavior analysts
mostly to:
 Work on behavior excesses (discipline)
 Conduct FBAs and write Bx. Plans
 Work with “Deep End” cases…
 Many in school systems do not see how
we can help with curriculum and
instruction as they have “specialists” in
these areas.

Some of the Challenges Facing Us

26

Measurable objectives and outcomes
Content and task analyses
 Critical and variable attributes
 Examples and non-examples
Criterion-based tests
 Designed prior to instruction
Goal: Mastery
Specify entry repertoire of the learner
Establish instructional sequence
Develop procedures for ongoing data
collection throughout instruction
Revise curriculum based on data

Instructional Design (ID)

27

Grouping by skill level
Present/demonstrate (short frames)
Frequent active student responding
Immediate feedback
Progress based on mastery
Ongoing measurement & evaluation
Revise instruction based on data

Instructional Delivery (Teaching)

28

 Clearly specified and behaviorally-stated
instructional objectives
 Well-designed curricular materials
 Assessment of learner’s entry skills
 Ongoing frequent direct measurement
of skills
 Focus on mastery (accuracy, fluency,
etc.)
Highly structured
Fast-paced
Active engagement
 Active Student Responding (ASR)
 High level of daily practice
Systematic use of positive and
corrective feedback
Supported by much empirical
research
 Question: What is the extent and
quality of empirical evidence
demonstrating its effectiveness?
Extensively field-tested and revised
based on data
Considers how realistic the
procedures are for classroom
practice

Elements of the ABA Approach
to Education

29

A statement of actions a student
should perform after completing one
or more instructional components
The actions must be specific,
observable and measurable

Behaviorally-stated
Instructional Objectives

30

 To:
Guide the instructional content and
tasks
Communicate to students what they will be evaluated on
Specify the standards for evaluating
ongoing and terminal performance

Behaviorally-Stated Instructional Objectives – Reasons for Writing

31

Example: State the main differences
between radical and methodological
behaviorism
Non-Example: Understand the main
differences between radical and
methodological behaviorism
Example: List all the elements in the
periodic table
Non-Example: Know the periodic
table

Instructional Objectives

32

“Level of performance that meets
accuracy and fluency criteria no
matter when assessed.” Vargas (2013)
“… performance that is accurate,
speedy, durable, smooth, and
useful.” Johnson & Street (2004)
Resistant to distractions

Mastery

33

Correct versus incorrect

Accuracy of the response

34

Short latency; high rate of correct responses

Fluency

35

Maintains across time even after instruction ends

Performance on the final exam should be no lower than unit test performance

Durable

36

Free of pauses and false starts

Smooth

37

Pop quizzes, unannounced tests, and the final exam still meet the criteria

No matter when assessed

38

Can apply in the real world; contextually meaningful; socially valid

Useful application

39

Performance consistent even when there are environmental distractions

Resistant to distractions

40

In actual practice, a criterion is set as a specific score on a test

Passing score (mastery criterion) is sometimes set at 80%

The mastery criterion is preset prior to instruction

Examples of mastery

41

Results of other students have no effect on one’s score

In some courses, students cannot access new material until the criterion is met

Criterion-based evaluations

42

Score is based on and compared to peers’ performance

Examples: curving the results of a test; standardized tests, IQ tests, aptitude tests

Norm-referenced evaluations

43

Generative learning same as…

Adduction

44

A general pattern of responding that produces effective responding to many untrained relations

The learner masters a generative set, which can combine and recombine into the universal set of all possible relations.

Untaught complex responses emerge in relation to basic elements that are taught.
Johnson


Generative learning: Adduction

45

The production of novel behavior when new combinations of stimulus properties that separately control different classes or properties of behavior engender new combinations of those classes or properties…

Novel combination of different repertoires; can be either sequential (serial coordination) or simultaneous (parallel coordination).

Most evident in verbal behavior
Catania

Adduction

46

Nick sees a red door for the first time and says “red door.” The child combined two skills: tacting the color red and tacting “door.”

After blending 40 letter-sound combinations, Kelly reads 500,000 novel words correctly

Adduction Examples

47

Teaching procedures that lead to Adduction

Developed by Johnson and Layng

Generative Instruction

48

Describes the emergence of accurate responding to untrained and nonreinforced stimulus–stimulus relations following the reinforcement of responses to some other stimulus–stimulus relations

Exists when a learner correctly identifies a symbolic relationship between two or more non-identical stimuli without specific training on that relationship.
In other words, the learner makes untrained but accurate connections between stimuli.

Stimulus equivalence

49

Reflexivity

Symmetry

Transitivity

Stimulus equivalence

50

A form of stimulus equivalence. Occurs when, in the absence of training and reinforcement, a learner selects a stimulus that is matched to itself. Example: A = A.

Generalized matching

Reflexivity

51

A learner is shown a picture of a bicycle and three comparison pictures: a car, an airplane, and a bicycle.

Reflexivity (also called generalized identity matching) has occurred if the participant, without instruction, selects the bicycle from the three comparison pictures.

Example of reflexivity

52

A form of stimulus equivalence.

Occurs when, after learning that A equals B, the learner demonstrates that B equals A without direct training on that relationship.

Symmetry

53

When the participant learns that the written word “bicycle” (sample stimulus A) matches a comparison picture of a bicycle (comparison B), and then, without additional training or reinforcement, when presented with the picture of a bicycle as the sample, selects the comparison written word “bicycle”

Example of symmetry

54

A form of stimulus equivalence. Occurs when, after learning that A = B and B = C, the learner demonstrates that A = C without direct training on that relationship.

Transitivity

55

The final and critical test for equivalence
Requires the demonstration of three untrained stimulus–stimulus relations

Transitivity

56

If A (written word “truck”) = B (picture of a truck) and

B (picture of a truck) = C (written word “camion”), then…

C (written word “camion”) = A (written word “truck”) emerges without additional instruction or reinforcement

Example of transitivity
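
A small illustration of why this works: the sketch below (the names and data structure are my own, purely illustrative, not from the course materials) starts from the two trained relations in this example and derives the untrained symmetry and transitivity relations, including C = A.

```python
# Minimal sketch (illustrative only): derive untrained relations from the two
# trained stimulus-stimulus relations in the truck/camion example.
trained = {
    ("word:truck", "picture:truck"),   # A = B (trained)
    ("picture:truck", "word:camion"),  # B = C (trained)
}

derived = set(trained)
changed = True
while changed:
    changed = False
    # Symmetry: if X = Y is in the set, Y = X emerges without training.
    for x, y in list(derived):
        if (y, x) not in derived:
            derived.add((y, x))
            changed = True
    # Transitivity: if X = Y and Y = Z are in the set, X = Z emerges.
    for x, y in list(derived):
        for y2, z in list(derived):
            if y == y2 and x != z and (x, z) not in derived:
                derived.add((x, z))
                changed = True

print(sorted(derived - trained))
# The output includes ('word:camion', 'word:truck'), i.e., C = A,
# with no direct training on that relation.
```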

57

Well-designed curricula and instructional programs take advantage of knowledge of how these relations are acquired

Learners are able to recombine a small set of previously learned relations into a much larger repertoire of equivalent relations without specific training on the new relations

Stimulus Equivalence

58

Related concept – relational frames

Relational frame theory – RFT

Acceptance and commitment therapy –ACT

Adduction and stimulus equivalence

59

Based on Skinner’s operant

Antecedent – response – consequence

Discrete trial procedures exemplify this

The basic instructional unit

60

The learning trial

The learn unit - Greer

Other terms for the instructional unit

61

Says that the learn unit represents the teacher’s opportunity to teach and the student’s opportunity to learn

Says that a good measure of a teacher’s performance is how many learn units he or she delivers per lesson or per amount of time during instruction
…The smallest divisible unit of teaching that incorporates interlocking three-term contingencies for both the teacher and the student… When used with rigorous assessment of learning objectives, learn units are direct measures of schooling effectiveness

Greer’s Learn Unit paraphrases

62

Establishing a new behavior, skill, or repertoire

Note: A repertoire (or repertory) is a response class hierarchy. These terms are commonly used in the literature of learning and education. For example, a student learns a new repertoire when she learns a group of math skills.

Acquisition stage

63

A student actively learning a new skill, such as sounding out consonant-vowel combinations

You are actively learning the definition of a new term, e.g., adduction

Acquisition Stage examples

64

 Accuracy: Correctness of the response;
correct vs. incorrect
 Fluency (speediness): Short latency;
high rate of correct responses
 Durable: Maintains across time even
after instruction ends; performance in a
final exam should be no lower than unit
test performance
 Smooth: Free of pauses & false starts

Fluency Stage

65

How many US states one can label correctly on a map in one minute
How many 2-digit times 2-digit
multiplication problems can one
perform correctly in one minute
Words read correctly per minute
Words typed correctly per minute

Examples of Fluency
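
Each of these examples boils down to the same count-per-time measure; here is a minimal sketch (the numbers are hypothetical, not from the slides).

```python
# Minimal sketch of the fluency measure behind these examples:
# correct responses divided by the timing period (all numbers hypothetical).
def rate_per_minute(correct_count: int, minutes: float) -> float:
    """Correct responses per minute during a timed practice."""
    return correct_count / minutes

print(rate_per_minute(correct_count=38, minutes=1.0))   # 38 states labeled per minute
print(rate_per_minute(correct_count=104, minutes=2.0))  # 52 words read correctly per minute
```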

66

 Using learned material in new, concrete,
and real-life situations
 Most of the tasks in the Task List are at
the application level
 For example, “Use Direct Instruction”
 Active verbs for application objectives:
 Apply, use, implement, operate, carry
out, make, manipulate, demonstrate,
praise

Application Stage

67

Instructional Antecedent: “What is the sum of 2 plus 2?” … (Wait Time and Response Latency)
Student Response … (Feedback Delay) … Feedback: “Correct”
Feedback focuses on errorless learning, response accuracy, and topography

Acquisition Stage of Learning

68

Which is a teacher controlled variable that determines how many learn units can be delivered?

A. Feedback delay
B. Intertrial interval
C. Wait time
D. All of these

D. All of these

69

Are influences on the number of
learn units
 That can be delivered (teacher’s
perspective)
 That can be experienced (learner’s
perspective)

Wait Time, Response Latency,
Feedback Delay, and Intertrial Interval

70

Teacher-controlled variables that can
be manipulated to control the number
of:
 Learning trials
 Learner performances

Wait Time, Feedback Delay, and
Intertrial Interval
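
To see why these durations bound the number of learning trials, here is a minimal sketch with hypothetical values: each trial’s length is roughly the sum of its components, so trimming any teacher-controlled component leaves room for more learn units.

```python
# Minimal sketch (all durations hypothetical): the shorter the per-trial
# components, the more learn units fit into a fixed lesson.
wait_time = 3.0            # teacher-controlled: seconds allowed before prompting
response_latency = 2.0     # student variable: seconds until responding begins
feedback_delay = 1.0       # teacher-controlled: seconds from response to feedback
intertrial_interval = 4.0  # teacher-controlled: seconds between trials

seconds_per_learn_unit = (wait_time + response_latency
                          + feedback_delay + intertrial_interval)
lesson_seconds = 20 * 60   # a 20-minute lesson

print(int(lesson_seconds // seconds_per_learn_unit))  # -> 120 learn units
```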

71

Student variables that can influence
the number of learn units delivered in
a lesson.
IRT only applies to the fluency stage.

Response Latency and IRT

72

Definition: Frequency of detectable
responses that a student emits
during ongoing instruction
Measure: Frequency count of
academic responses per instructional
period (e.g., rate/hour)
Most direct measure of student
responding during instruction

Active Student Responding (ASR)
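
The measure on this card is a simple frequency count over instructional time; a minimal sketch with hypothetical numbers:

```python
# Minimal sketch of the ASR measure: frequency count of detectable academic
# responses per instructional period, expressed as rate/hour (hypothetical data).
def asr_rate_per_hour(academic_responses: int, instructional_minutes: float) -> float:
    return academic_responses * 60.0 / instructional_minutes

print(asr_rate_per_hour(academic_responses=45, instructional_minutes=30))  # 90.0 per hour
```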

73

Pays attention
 Listens to the teacher
 Watches others respond

Passive Responding

74

How does one measure covert
learning or thinking?
Teacher may continue ineffective
instruction
Teacher may waste time on material
already mastered by students

Problems

75

“Uh-huh” (false yes)
 Gets teacher praise and smile
 Avoids teacher look of disappointment
or “Why not? Weren’t you paying
attention?”
 Escapes more or repeated instructions
 Avoids looking stupid to peers
“Uh-huh” (falsely believes he does)

Problems with
“Do you understand?”

76

Correction or remediation may be too late
 Teacher has moved on to other parts of
curriculum
 Low test results blamed on student (low
intelligence, low motivation, sensory
problems, poor sleep patterns or nutrition,
etc.)
 Contributes to Refer – Test – Place model

Problems with
Tests Given After Instruction

77

Correlated with:
 Increased academic behavior!
 Improved test scores!!
 Reduced disruptive behavior!!!
These are strong reasons that
teachers should accept and maintain
ASR methods

Active Student Responding

78

 Ensures relevant responses are occurring
 Detectable responses can get immediate
feedback from teacher, other students, or
the student herself
 Teacher can modify instruction as needed
 Consistent problems can get quick
remediation

Active Student Responding

79

Programmed Instruction (PI)
Personalized System of Instruction
(PSI)
Direct Instruction (DI)
Precision Teaching (PT)
Morningside Model

High ASR Approaches to
Instructional Technology

80

PI, PSI, DI, PT, and the Morningside
Model all use the Learn Unit and
provide lots of ASR.
But first, how can a teacher use
some simple tools to increase ASR
even without formal training and fully
using one of these technologies?

ABA Instructional Technology

81

Simple, low-cost systems are needed
so that teachers can and will actually
use ASR methods.

Low-tech ASR for Group Instruction

82

High rates of active student
responses
Immediate view of student responses
as they occur
Student-to-student interaction

Low-Tech ASR: Required Features

83

Response Cards and similar
techniques
Choral Responding
Response Cards PLUS Choral
Responding
Guided Notes
 Heward (1994)

Low-tech ASR for Group Instruction

84

Cards, signs, or items that are held
up simultaneously by all students to
display their response to a question,
item, or problem presented by the
teacher.

Response Cards

85

Easy to detect incorrect responses
Much higher response rates
compared to the typical hand-raising and
one-at-a-time answering method
Students can learn by watching
others

Advantages of Response Cards

86

1. Preprinted Selection-Based Response Cards
2. Preprinted Selection-Based “Pincher” Response Cards
3. “Write-on” Response Cards

3 Types of Response Cards

87

Answers are printed or written on a
small set of cards
Examples
 Yes-True, No-False
 Numbers
 Colors
 Parts of speech
 Cardboard clock with moveable hands

Preprinted Response Card “Sets”

88

Answers are printed or written on a
small card
Student pinches correct answer with
finger or clothespin

Examples:
 Seasons of year
 Numbers
 Colors

Preprinted “Pincher”
Response Cards

89

Advantages
 Higher ASR rates
 Can gradually add cards (fewer errors)
 Easy to see

Disadvantages
 Limited to recognition questions
 Limited to responses on cards
 Excludes questions with many answers

Preprinted Response Cards

90

Hand signals
Key pads
Polls in your online co-instructor
meetings

Techniques Similar to
Selection-Based Response Cards

91

 Small blank cards or boards (e.g., 8” x 10”)
 Laminated bathroom board, dry-erase markers
 Chalk board, chalk (messy, harder to see)
 Erased between trials (use socks on nonwriting
hand)
 Can include cues in background or
margin (e.g., musical clef scales, map)

“Write-On” Response Cards

92

 Advantages
 Allows questions with many answers
 Allows more difficult recall questions
 Can target spelling skills
 Allows creative answers
 Disadvantages
 May be lower ASR rates, more errors
 Hard to see or interpret writing
 Messy and sometimes smelly

“Write-on” Response Cards

93

 Limit language responses to 1 or 2 words
 Keep spare markers on hand
 Reduce errors by practicing spelling of
response words before the lesson, or list
possible responses for students to see
 Tell students to try their best, but misspellings will not be counted as errors
 After a good lesson, let them draw

“Write-on” Response Cards

94

 Students respond orally in unison
 Easiest ASR method
 Commonly used in lower grades
 Not frequently used by current teachers
 Effective with young children and older
students
 Students can learn by listening to others

Choral Responding

95

Good for lecture-style instruction

Guided Notes

96

Teacher responses
 Present new information
 Review information
 Demonstrate
 Illustrate
Student responses
 Listen
 Watch
 Ask
 Accurately record the most important
info for future study

Lecture Typical in
Middle and Secondary Classes

97

Students who record accurate notes
perform better on exams
Most students receive no explicit
instruction on how to take notes
Persons with disabilities are
particularly challenged

Lecture Typical in
Middle and Secondary Classes

98

Teacher-Prepared Handouts that:
 Organize content
 Guide the learner with standard cues to record key facts, concepts, and relationships
 Provide the learner with a means of actively responding to the lecture content
 Provide a take-home product for study
 Keep the teacher “on-task” during lecture

Guided Notes

99

 Developed and tested by Skinner

 Involves the presentation of small "frames" of information; each frame requires a DISCRIMINATED response

 Learner moves through the sequence of
frames at their OWN PACE Usually with FEW errors

 FREQUENT positive feedback

STRUCTURED so that each student interacts in a way that MAXIMIZES learning

Depending on the students’
RESPONSES to questions, it branches
(or loops) into either new material or
review frames

Programmed Instruction

100

Usually uses teaching machines or computers
Skinner’s Teaching Machine
Tucci’s Teaching Machine
Holland and Skinner

Programmed Instruction

101

Developed and tested by Fred Keller (e.g., Keller, 1968)
High mastery standards
Students achieve standards at their own pace
Used successfully in colleges,
universities, and the military

Structured instructional materials:
 Information – Frames – Quiz
 Frequent ASR
 Self (student)-pacing
Unit tests
100% mastery
Optional lectures
Undergraduate proctors

Personalized System of Instruction

102

Unit tests
100% mastery
Optional lectures
Undergraduate proctors

Personalized System of Instruction
(continued)

103

Results in better
learning and retention than traditional
instruction
 Kulik, Kulik, & Cohen (1979)

Personalized System of Instruction

104

Incompletes and withdrawals
Deadlines
Students can finish the course faster than the semester but must finish by the end of the semester

Problems with PSI and Solutions

105


(Engelmann & Carnine, 1991)
Follows a logical analysis of concepts
and procedures
Presents examples and non-examples
in an instructional sequence that
fosters rapid concept learning
Small group instruction
Frequent active student responding
Teacher follows SCRIPT (pre-tested)
Uses instructional objectives

Direct Instruction (DI)

106

“Teach more in less time”
Focus: General-case teaching
Examples and non-examples
Frequent monitoring: Criterion-referenced tests

Direct Instruction (continued)

107

Strong evidence that it is the most
effective form of academic
preparation and maintenance in the
elementary grades
Head Start and Follow-Through
programs
 (Hempenstall, 2004; Watkins, 1988)

Direct Instruction

108

Developed and tested by Ogden
Lindsley (e.g., Lindsley, 1992)
Focuses on learner’s performances
as a means to assess interventions
Frequency of responses tracked and
charted on a standardized chart
 Teacher can make informed choices in
instruction

Precision Teaching

109

“Student knows best”

Use of rate of response…

Fluency – not just accuracy

SAFMEDS

Standard Celeration Chart

Precision Teaching Components

110


Structured instructional materials:
 Information – Frames – Quiz
 Frequent ASR
 Self (student)-pacing
Unit tests
100% mastery
Optional lectures
Undergraduate proctors

PSI

111

The teacher follows a script in this strategy

Direct instruction

112

Follows a logical analysis of concepts
and procedures

Direct instruction

113

Involves the presentation of small
"frames" of information

Programmed instruction

114

Used successfully in colleges,
universities, and the military

Personalized system of instruction – PSI

115

FREQUENT positive feedback

STRUCTURED so that each student interacts in a way that MAXIMIZES learning

Programmed instruction

116

“Student knows best”

Precision teaching

117

Uses SAFMEDS And Standard Celeration Charts

Precision teaching

118

SAFMEDS

Say All Fast, Minute Every Day, Shuffled

119

Developed by Johnson and Layng

Combines elements of PI, PSI, DI, and PT with their own component/composite analysis of learning

Morningside model

120

Produces faster-than-average learning with populations of learners who are significantly behind grade level in academic subjects.

Some have diagnoses of ADHD or other learning disabilities such as dyslexia.

Money-back guarantee at the Seattle location

Morningside Model