ABA 100 – Schedules of Reinforcement

Chapter 10: Developing Behavioral Persistence with Schedules of Reinforcement

Learning Objectives
• Define intermittent reinforcement
• Compare intermittent and continuous reinforcement
• Define ratio schedules, interval schedules, limited hold, duration schedules, and concurrent schedules
• Explain how a common pitfall of intermittent reinforcement traps not only the uninitiated but also those with some knowledge of behavior modification

Schedules of Reinforcement
“A schedule of reinforcement is a rule specifying which occurrences of a given behavior, if any, will be reinforced.”

Continuous Reinforcement (CRF)
• EACH instance of behavior is reinforced
• Open your eyes and see; flick a switch and the light comes on
• YOUR EXAMPLES?
• Advantageous for skill acquisition (see next video)
• Can be problematic
• Continuous reinforcement for ‘social media checking’ behavior can interfere with other necessary tasks

Extinction… the OPPOSITE of CRF
• A behavior that used to be reinforced no longer receives reinforcement
• Sometimes referred to as operant extinction
• Under what conditions would extinction occur for:
• Flicking light switches?
• Checking text messages?
• YOUR EXAMPLES?

Intermittent Reinforcement
If CONTINUOUS REINFORCEMENT and EXTINCTION are the ‘bookends’, then everything in between is intermittent reinforcement.

Intermittent Reinforcement

• Intermittent schedules of reinforcement mean only some responses are reinforced
• Intermittent schedules of reinforcement are necessary for the progression to naturally occurring reinforcement during teaching
• Most of us are not reinforced for every single response we engage in… yet we continue to respond! (examples?)
• Intermittent schedules of reinforcement are used to strengthen behaviors that have already been established
• Appropriate for maintaining behaviors (e.g., a child’s manners with intermittent praise)

Ratio and Interval Schedules of Reinforcement
• RATIO schedules require a number of responses before one response produces reinforcement = DOING
• INTERVAL schedules require time to elapse before a response produces reinforcement

Remembering Ratios vs. Intervals
• Ratio = #
• Interval = time

Fixed-Ratio (FR) Schedule
• The reinforcer is delivered after every X responses, where X is the size of the ratio
• FR 2 = every 2nd response is reinforced
• FR 50 = every 50th response is reinforced
• Characterized by consistent, rapid responding interrupted by a post-reinforcement pause
• Large ratios produce longer pauses than short ratios
• Larger ratio requirements tend to yield higher rates of responding
• Amy (a good speller) will get her spelling homework done faster if she is reinforced on an FR 5 schedule than on an FR 2 schedule for words spelled correctly
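The FR rule is simple enough to express as a short simulation. The sketch below is a minimal illustration in Python (the FixedRatio class and the response counts are made up for this example, not from the textbook): every Xth response is reinforced, so the same amount of responding earns many reinforcers on FR 2 and very few on FR 50.

```python
# Minimal sketch of a fixed-ratio (FR) rule: every Xth response is reinforced.
# Class name and numbers are illustrative, not from the lecture notes.

class FixedRatio:
    def __init__(self, ratio):
        self.ratio = ratio   # X: responses required per reinforcer
        self.count = 0       # responses since the last reinforcer

    def respond(self):
        """Record one response; return True if it earns reinforcement."""
        self.count += 1
        if self.count == self.ratio:
            self.count = 0   # reset the count after reinforcement
            return True
        return False

fr2 = FixedRatio(2)
fr50 = FixedRatio(50)
responses = 100
print("FR 2 reinforcers:", sum(fr2.respond() for _ in range(responses)))    # 50
print("FR 50 reinforcers:", sum(fr50.respond() for _ in range(responses)))  # 2
```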

Fixed-Ratio (FR) Schedule
• The reinforcer is delivered after every X responses, where X is the size of the ratio
• FR 2 = every 2nd response is reinforced
• FR 50 = every 50th response is reinforced
• YOUR EXAMPLES?
• Fixed ratio: reinforcement is given after a specific number of responses – quick response rate / brief post-reinforcement pauses / quickest extinction

FR Schedules & Post-Reinforcement Pause
• Consider the post-reinforcement pause for a ditch-digger
• FR 2 vs. FR 50

• Consider the same ditch-digger’s responding:
• FR 1 to get paid $100
• FR 3 to get paid $100
• FR 50 to get paid $100
• The sudden and dramatic increase from FR 3 to FR 50 will likely result in ‘ratio strain’

Ratio Strain
• A behavioral effect associated with sharp increases in ratio requirements when moving from dense (FR 1, FR 2, FR 3…) to thin (FR 20, FR 30, FR 40…) schedules
• Effects of ratio strain typically include avoidance, aggression, and/or lengthy pauses or cessation in responding

Variable Ratio (VR) Schedule
• The reinforcer is delivered after a variable number of responses, based on an average
• VR 5 means an average of 5 responses is required
• e.g., 5 responses = reinforcement, then 6 responses = reinforcement, then 4 responses = reinforcement: (5 + 6 + 4) / 3 = 5
• This is the schedule of reinforcement used by slot machines – random and variable
• Variable ratio: high response rate / no pauses / extinction slower than fixed ratio

VR Schedule
• Produces consistent, steady rates of responding
• The learner doesn’t know when the next reinforcer will be delivered, so she keeps on responding
• Not characterized by a post-reinforcement pause, because the next response may produce reinforcement
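To see what “a variable number of responses, based on an average” means in practice, here is a minimal sketch that draws each ratio requirement at random around a mean of 5; the uniform 1–9 range and the function name are assumptions for illustration only.

```python
import random

# Minimal sketch of a variable-ratio (VR) rule: the number of responses
# required for each reinforcer varies around an average (here VR 5).
# The uniform 1-9 range is an illustrative assumption.

def vr_requirements(mean=5, n=1000, seed=0):
    rng = random.Random(seed)
    return [rng.randint(1, 2 * mean - 1) for _ in range(n)]

reqs = vr_requirements()
print("average responses per reinforcer:", sum(reqs) / len(reqs))  # close to 5
print("first few requirements:", reqs[:5])  # unpredictable from the learner's point of view
```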

Ratio Schedules
• Free-operant procedure
• The individual is free to respond at various rates – there are no limits
• Example: sitting at a slot machine putting in coins
• FREE TO RESPOND at your own pace
• Your examples?
• Discrete-trial (restricted operant) procedure
• The individual’s responding is limited by environmental constraints
• Example: answering a teacher’s questions
• RESTRICTED TO RESPOND at someone else’s pace
• Your examples?

Fixed Interval (FI) Schedules

• A reinforcer is delivered following the first instance of a response after a fixed period of time
• FI 5 min = the first response after 5 minutes is reinforced
• On an FI schedule, responding starts slowly and becomes more rapid as the end of the interval approaches
• The behaviors associated with ‘writing a term paper’ start slowly at the beginning of the term and increase as the end of the term approaches

Fixed Interval Schedules
• Responses are followed by a post-reinforcement pause
• The person learns that responses that occur immediately after reinforcement are never reinforced
• After a big assignment is done, most students take a break before starting the next one
• Compared to FR, responding is generally less consistent and less rapid with FI schedules
• Checking the letter mail is reinforced on a fixed interval schedule
• Fixed interval: slower rates / long pauses / extinction slower than for fixed ratio

Fixed Interval (FI) Schedule
• Behavior accelerates towards the end of the interval
• Post-reinforcement pause

Variable Interval (VI) Schedules
• Reinforcement is delivered for the first correct response following variable lengths of time, based on an average
• VI 4 min = reinforcement is available for the first response produced after an average of 4 minutes
• Mrs. Ho reinforces Fred when she notices that he is “paying attention” after 4, 5, 5, 4, 3, 4, 4, 3, 5, and 4 minutes
• VI schedules produce steady, low to moderate response rates with little to no post-reinforcement pause
• Produces consistent workers
• Variable interval: slow, steady responses / no pauses / slowest extinction
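Both interval rules can be sketched the same way: a reinforcer becomes available only after some time has elapsed, and the first response after that point is reinforced. The IntervalSchedule class below is an illustrative assumption, not from the textbook (FI uses the same interval every time; VI draws each interval around an average).

```python
import random

# Minimal sketch of interval schedules: a reinforcer becomes available only
# after an interval elapses, and the FIRST response after that is reinforced.
# Fixed interval (FI) repeats the same interval; variable interval (VI) draws
# each interval around an average. Details here are illustrative.

class IntervalSchedule:
    def __init__(self, mean_interval, variable=False, seed=0):
        self.mean = mean_interval
        self.variable = variable
        self.rng = random.Random(seed)
        self.next_available = self._draw_interval()

    def _draw_interval(self):
        if self.variable:
            return self.rng.uniform(0.5 * self.mean, 1.5 * self.mean)
        return self.mean

    def respond(self, t):
        """A response at time t is reinforced only if the interval has elapsed."""
        if t >= self.next_available:
            self.next_available = t + self._draw_interval()
            return True
        return False

fi5 = IntervalSchedule(5)                  # FI 5: first response after 5 min is reinforced
vi4 = IntervalSchedule(4, variable=True)   # VI 4: first response after ~4 min on average
print(fi5.respond(3), fi5.respond(6))      # False True
print(vi4.respond(1), vi4.respond(10))     # likely False True
```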

Variable Interval Schedule Examples
• A car salesman is given bonuses when his supervisor checks in. He does not know when that might occur, so he needs to stay continuously engaged in selling behaviors (e.g., walking the lot, talking to people, being on the phone) in case the supervisor shows up.
• A grocery “stock boy” should always be busy in case the manager walks by. He never knows when this might occur, so the schedule maintains steady responding without pausing.

Schedules with a Limited Hold

Limited Hold
• A deadline for meeting the response requirement of a schedule of reinforcement
• Can be used with ratio or interval schedules
• If responding doesn’t occur by the deadline, reinforcement is not delivered

Fixed Ratio with Limited Hold
• FR schedule: do the 15 problems on your worksheet and you can use the computer
• FR schedule with limited hold: do the 15 problems on your worksheet and you can use the computer for 5 minutes (FR 15/LH 5)
• FI schedule: make your bed after getting up and you can watch TV
• FI schedule with limited hold: make your bed within 5 minutes after getting up and you can watch TV for 30 min (immediately after)
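A limited hold simply attaches a deadline to whichever schedule it modifies: once the reinforcer becomes available, it stays available only for the LH window. The small function below sketches this for an FI schedule with a limited hold; the bus-style numbers (arrives every 30 minutes, waits about 1 minute) and the function name are illustrative assumptions.

```python
# Minimal sketch of an interval schedule with a limited hold (LH):
# after the interval elapses, the reinforcer stays available only for a
# short window; responses outside that window earn nothing.
# Names and numbers are illustrative, not from the notes.

def reinforced(response_time, interval, limited_hold):
    """True only if the response falls inside the LH window after the interval."""
    return interval <= response_time <= interval + limited_hold

# A bus that arrives every 30 minutes and waits 1 minute (roughly FI 30 / LH 1):
print(reinforced(29, interval=30, limited_hold=1))    # False - too early, bus not there yet
print(reinforced(30.5, interval=30, limited_hold=1))  # True  - within the 1-minute window
print(reinforced(35, interval=30, limited_hold=1))    # False - missed the bus
```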

Waiting for the Bus: FI with Limited Hold
• The bus comes once every 30 minutes (a fixed interval)
• What makes this a limited hold? The bus doesn’t wait for you – it is only at the stop for a limited time, so you have to be there within that window to be reinforced (to catch the bus).

Variable Interval with Limited Hold
• VI schedule: Dad reinforces the kids for sitting quietly after varying periods of time (VI = 5 min) during a road trip
• VI schedule with limited hold: Dad reinforces the kids only if they are sitting quietly within 1 second of the timer sounding (VI 5 min/LH 1 sec)

The Timer Game
• “Good” behavior when the timer sounds (VI 15 min/LH 1 sec)
• Why 1 second? Why not 10 seconds?

Schedules of Reinforcement & Teaching
• In ABA (behavior modification programs), ratio schedules (FR/VR) are typically preferred over interval schedules, for a number of practical reasons:
• Fixed interval (FI) schedules produce long post-reinforcement pauses
• Variable interval (VI) schedules result in lower response rates, compared to ratio schedules

• Both FI and VI schedules require continuous monitoring of behavior after the end of each interval until a response occurs

Resistance to Extinction
• A Coke machine works on continuous reinforcement – when it stops delivering, responding stops quickly
• How does a gambling example work? (Slot machines reinforce on a variable ratio, so responding persists much longer during extinction)

Concurrent Schedules of Reinforcement
• Often, we have the choice of engaging in a number of behaviors at the same time
• In the evening, YOU may:
• Watch TV – schedule?
• Watch Netflix – schedule?
• Watch YouTube – schedule?
• Watch TikTok – schedule?
• Do homework – schedule?

Concurrent Schedules of Reinforcement
• When each of two or more behaviors is reinforced on different schedules at the same time, the schedules of reinforcement are concurrent
• Behaviors that are reinforced more frequently/often are likely to occur at a higher rate than those reinforced less frequently/often
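A concurrent arrangement is just two or more of the schedules above operating at the same time on different behaviors. The sketch below runs two fixed-ratio schedules concurrently (FR 3 for one behavior, FR 15 for another, with responding split at random); the behavior names, the specific ratios, and the random choice rule are assumptions for illustration, and the denser schedule predictably yields more reinforcement.

```python
import random

# Minimal sketch of concurrent schedules: two behaviors are each reinforced
# on their own independent schedule at the same time. With roughly equal
# responding, the behavior on the denser schedule earns reinforcement more
# often. Schedule values and the random choice rule are illustrative.

def run_concurrent(trials=300, seed=0):
    rng = random.Random(seed)
    counts = {"watch TV": 0, "do homework": 0}     # responses toward each ratio
    ratios = {"watch TV": 3, "do homework": 15}    # e.g., FR 3 vs FR 15
    reinforcers = {"watch TV": 0, "do homework": 0}
    for _ in range(trials):
        behavior = rng.choice(list(counts))        # pick a behavior to emit
        counts[behavior] += 1
        if counts[behavior] == ratios[behavior]:   # ratio met -> reinforce
            counts[behavior] = 0
            reinforcers[behavior] += 1
    return reinforcers

print(run_concurrent())  # 'watch TV' earns far more reinforcers than 'do homework'
```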

Pitfalls of Intermittent Reinforcement
Unaware-Misapplication Pitfall
• Inconsistent use of extinction leads to bratty behavior!
• Initial withholding of reinforcement for problem behavior (e.g., don’t give candy when the child screams in the grocery store line-up)
• As the child persists in the problem behavior (e.g., screaming becomes more intense, escalates to hitting, continues for more than 1 minute), the parent ‘gives up’ and delivers reinforcement (e.g., candy)
• Will result in further persistent problem behavior in the future (persistence was reinforced)
• This is a misapplication of which schedule?

Let’s Play… What’s the Schedule?
• Frequent flyer program: you get a free flight after accumulating 50,000 flight miles, which must be used within 12 months – fixed ratio (with a limited hold: the reinforcer is available for a limited time)
• Fly fishing and catching fish – variable ratio (an unpredictable number of casts per catch)
• Receiving a good grade on each quiz for every chapter studied – fixed ratio (one chapter per quiz, each week)
• An athlete signs a contract whereby his salary increases are renegotiated every three years – fixed interval (the next reinforcer becomes available only when the fixed time period ends)

• Playing bingo – variable ratio (reinforcement depends on the number of games played/gambled, and not every game pays off)
• A door-to-door salesperson going door to door looking for people to sell to – variable ratio (based on the number of doors knocked on, not on time; not predictable)
• Waiting for a cab – variable interval (based on the time until a cab becomes available)
• Going up a staircase, you must go up the same number of stairs to get to the landing – fixed ratio (based on a fixed number of stairs)
• Returning a specific number of cans and bottles for payment – fixed ratio (not time-based; payment depends on how many cans or bottles are returned)
• Number of throws to get a strike when bowling – variable ratio (depends on the number of responses, and the number needed is unpredictable)
• Boarding a WestJet flight – fixed interval (the boarding/take-off time is fixed by a published schedule, so it is time-based)

