Title: Operant Conditioning Schedules
1. Chapter 6
- Operant Conditioning Schedules
2. Schedule of Reinforcement
- Appetitive outcome → reinforcement
- As a shorthand, we call the appetitive outcome the reinforcer
- Assume that we've got something appetitive and motivating for each individual subject
- Fairly consistent patterns of behaviour
- Cumulative recorder
3. Cumulative Record
- Cumulative recorder
- Flat line = no responding
- Slope = response rate
4. Cumulative Recorder
[Diagram of the cumulative recorder: pen and paper strip]
5. Recording Responses
6. The Accumulation of the Cumulative Record
[Example cumulative record from a VI 25 schedule; see the sketch below]
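Slides 3-6 describe the cumulative record only by its parts (pen, paper strip, flat line, slope). As a rough illustration only, here is a minimal Python sketch of how such a record is built from a list of response times; the response times below are invented for the example.

```python
import matplotlib.pyplot as plt

# Hypothetical response times in seconds (illustrative only).
response_times = [2, 3, 4, 9, 10, 11, 12, 20, 21, 22, 23, 24]

# Cumulative count: the pen moves up one unit at each response and is
# flat in between, so pauses show as flat lines and fast responding
# as steep slopes.
cumulative = list(range(1, len(response_times) + 1))

plt.step(response_times, cumulative, where="post")
plt.xlabel("Time (s)")
plt.ylabel("Cumulative responses")
plt.title("Cumulative record (hypothetical data)")
plt.show()
```

Pauses appear as flat segments and bursts of responding as steep segments, which is what makes the record useful for comparing response patterns across schedules.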
7. Fixed Ratio (FR)
- N responses required, e.g., FR 25
- CRF = FR 1
- Rise-and-run pattern
- Postreinforcement pause
- Ratio strain
8. Variable Ratio (VR)
- Varies around a mean number of responses, e.g., VR 25
- Short, if any, postreinforcement pause
- Never know which response will be reinforced
9. Fixed Interval (FI)
- Depends on time, e.g., FI 25
- Postreinforcement pause; scalloping
- Clock doesn't start until the reinforcer is given
10. Variable Interval (VI)
- Varies around a mean time, e.g., VI 25
- Don't know when the time has elapsed
- Clock doesn't start until the reinforcer is given (see the sketch below)
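To make the four basic schedules concrete, here is a minimal Python sketch of the rule each one uses to decide whether a given response produces the reinforcer. The class names, the use of an exponential distribution for the variable schedules, and the parameter values are illustrative assumptions, not part of the original material.

```python
import random

class FixedRatio:
    """FR N: every Nth response produces the reinforcer."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self, t):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True          # reinforcer delivered
        return False

class VariableRatio:
    """VR N: the required number of responses varies around a mean of N."""
    def __init__(self, n):
        self.n, self.count = n, 0
        self.required = max(1, round(random.expovariate(1 / n)))
    def respond(self, t):
        self.count += 1
        if self.count >= self.required:
            self.count = 0
            self.required = max(1, round(random.expovariate(1 / self.n)))
            return True
        return False

class FixedInterval:
    """FI T: the first response after T seconds since the last reinforcer."""
    def __init__(self, t):
        self.t, self.last = t, 0.0
    def respond(self, t):
        if t - self.last >= self.t:
            self.last = t        # the clock restarts when the reinforcer is given
            return True
        return False

class VariableInterval:
    """VI T: like FI, but the required interval varies around a mean of T."""
    def __init__(self, t):
        self.t, self.last = t, 0.0
        self.interval = random.expovariate(1 / t)
    def respond(self, t):
        if t - self.last >= self.interval:
            self.last = t
            self.interval = random.expovariate(1 / self.t)
            return True
        return False

# Usage example: count reinforcers earned by 1000 responses, one per second.
schedule = VariableRatio(25)     # VR 25
print(sum(schedule.respond(i) for i in range(1000)), "reinforcers")
```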
11. Response Rates
12. Duration Schedules
- Continuous responding for some time period to receive reinforcement
- Fixed duration (FD)
- Set time period
- Variable duration (VD)
- Varies around a mean
13. Differential Rate Schedules
- Differential reinforcement of low rates (DRL)
- Reinforcement only if X amount of time has passed since the last response
- Sometimes superstitious behaviours
- Differential reinforcement of high rates (DRH)
- Reinforcement only if more than X responses occur in a set time (see the sketch below)
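A minimal sketch of the two differential-rate rules, in the same illustrative style as above (the names x_seconds, x_responses, and window are assumptions for the example, not terms from the slides).

```python
class DRL:
    """Differential reinforcement of low rates: a response is reinforced
    only if at least x_seconds have passed since the previous response."""
    def __init__(self, x_seconds):
        self.x = x_seconds
        self.last_response = None
    def respond(self, t):
        # The first response of the session is not reinforced in this sketch.
        ok = self.last_response is not None and (t - self.last_response) >= self.x
        self.last_response = t
        return ok

class DRH:
    """Differential reinforcement of high rates: reinforcement only if more
    than x_responses occur within the last `window` seconds."""
    def __init__(self, x_responses, window):
        self.x, self.window = x_responses, window
        self.times = []
    def respond(self, t):
        self.times.append(t)
        # Keep only the responses that fall inside the current window.
        self.times = [r for r in self.times if t - r <= self.window]
        return len(self.times) > self.x
```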
14. Noncontingent Schedules
- Reinforcement delivery is not contingent on responding; it depends only on the passage of time
- Fixed time (FT)
- After a set time elapses
- Variable time (VT)
- After a variable time elapses
15. Choice Behaviour
16. Choice
- Two-key procedure
- Concurrent schedules of reinforcement
- Each key associated with separate schedule
- Distribution of time and behaviour
17. Concurrent Ratio Schedules
- Two ratio schedules
- Schedule that gives the most rapid reinforcement is chosen exclusively
18. Concurrent Interval Schedules
- Maximize reinforcement
- Must shift between alternatives
- Allows for study of choice behaviour
19. Interval Schedules
- FI-FI
- Steady-state responding
- Less useful/interesting
- VI-VI
- Not steady-state responding
- Respond to both alternatives
- Sensitive to rate of reinforcement
- Most commonly used to study choice
20. Alternation and the Changeover Response
- Maximize reinforcers from both alternatives
- Frequent shifting becomes reinforcing
- Simple alternation
- Concurrent superstition
21. Changeover Delay
- COD
- Prevents rapid switching
- Time delay after a changeover before reinforcement is possible (see the sketch below)
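A minimal sketch of the changeover-delay rule (illustrative only; the `delay` parameter and method names are assumptions): after a switch to the other alternative, no response can be reinforced until the delay has elapsed.

```python
class ChangeoverDelay:
    """After a changeover, reinforcement is not possible until `delay` seconds pass."""
    def __init__(self, delay):
        self.delay = delay
        self.current_key = None
        self.changeover_time = None
    def respond(self, key, t):
        if key != self.current_key:        # changeover response
            self.current_key = key
            self.changeover_time = t
            return False                   # reinforcement blocked during the COD
        # Reinforcement is possible (the key's own schedule still decides delivery)
        # only once the delay since the last switch has elapsed.
        return (t - self.changeover_time) >= self.delay
```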
22. Herrnstein's (1961) Experiment
- Concurrent VI-VI schedules
- Overall rates of reinforcement held constant
- 40 reinforcers/hour split between the two alternatives
23. [Table columns: Key | Schedule | Rft/hr | Rsp/hr | Rft rate | Rsp rate]
24. The Matching Law
- The proportion of responses directed toward one alternative should equal the proportion of reinforcers delivered by that alternative (see the equation below).
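Written out, with B1 and B2 as the responses to each alternative and R1 and R2 as the reinforcers obtained from each, the matching law stated above is:

```latex
\frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2}
```

As a hypothetical illustration (not Herrnstein's actual data): if the 40 reinforcers/hour were split so that alternative 1 delivered R1 = 30 and alternative 2 delivered R2 = 10, matching predicts that 30/40 = 75% of responses go to alternative 1.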
25. Bias
- Spend more time on one alternative than predicted (see the equation below)
- Side preferences
- Biological predispositions
- Quality and amount
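Bias is often quantified with Baum's generalized form of the matching law; this equation is not on the slide, but it is the standard way to express the idea, with a bias parameter b multiplying the reinforcement ratio and an exponent s capturing sensitivity to that ratio.

```latex
\frac{B_1}{B_2} = b\left(\frac{R_1}{R_2}\right)^{s}
```

With b = 1 and s = 1 this reduces to strict matching; b > 1 indicates a preference for alternative 1 (e.g., a side preference) beyond what its reinforcement rate alone predicts.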
26. Varying Quality of Reinforcers
- Q1 = quality of the first reinforcer
- Q2 = quality of the second reinforcer
27. Varying Amount of Reinforcers
- A1 = amount of the first reinforcer
- A2 = amount of the second reinforcer
28. Combining Qualities and Amounts
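Slides 26-28 list the quality (Q) and amount (A) terms without a formula; a common textbook way to combine them with reinforcement rate, offered here as one plausible reading of "Combining Qualities and Amounts", is the concatenated matching law:

```latex
\frac{B_1}{B_2} = \frac{R_1}{R_2}\cdot\frac{Q_1}{Q_2}\cdot\frac{A_1}{A_2}
```

A larger or higher-quality reinforcer on one side shifts behaviour toward that side even when the reinforcement rates R1 and R2 are equal.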
29. Extinction
30. Extinction
- Disrupt the three-term contingency
- Response rate decreases
31. Stretching the Ratio/Interval
- Increasing the number of responses (or the time) required
- e.g., FR 5 → FR 50, VI 4 sec. → VI 30 sec.
- Extinction problem
- Shaping: gradual increments
- Low or high schedules
32. Extinction
- Continuous reinforcement (CRF) = FR 1
- Intermittent schedule = everything else
- CRF is easier to extinguish than any intermittent schedule
- Partial reinforcement effect (PRE)
- Generally:
- High vs. low
- Variable vs. fixed
33. Discrimination Hypothesis
- Difficult to discriminate between extinction and an intermittent schedule
- High schedules are more like extinction than low schedules
- e.g., CRF vs. FR 50
34. Frustration Hypothesis
- Non-reinforcement of a response is frustrating
- On CRF every response is reinforced → no frustration
- Intermittent schedules always have some non-reinforced responses
- Responding leads to the reinforcer (pos. reinf.)
- Frustration becomes an S^D (cue) for reinforcement
- Frustration grows continually during extinction
- Stopping responding → stops frustration (neg. reinf.)
35. Sequential Hypothesis
- A response is followed by reinforcement or nonreinforcement
- On intermittent schedules, nonreinforced responses are an S^D (cue) for the eventual delivery of the reinforcer
- High schedules increase resistance to extinction because many nonreinforced responses in a row lead to a reinforced one
- Extinction is similar to a high schedule
36. Response Unit Hypothesis
- Think in terms of behavioural units
- FR 1: 1 response = 1 unit → reinforcement
- FR 2: 2 responses = 1 unit → reinforcement
- Not response → failure, response → reinforcer, but response-response → reinforcer
- Says the PRE is an artifact
37. Mowrer & Jones (1945)
- Response unit hypothesis
- More responses in extinction on higher schedules; the difference disappears when responses are counted as behavioural units (worked example below)
[Figure: number of responses/units during extinction for FR 1 to FR 4, comparing the absolute number of responses with the number of behavioural units]
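A hypothetical arithmetic example of the response-unit correction (the counts below are invented for illustration, not Mowrer & Jones's data): divide the absolute number of responses emitted in extinction by the number of responses per unit.

```latex
\text{units} = \frac{\text{responses in extinction}}{\text{responses per unit}},
\qquad \text{e.g.}\;
\frac{50}{1} = 50 \;(\text{FR }1),\quad
\frac{100}{2} = 50 \;(\text{FR }2),\quad
\frac{200}{4} = 50 \;(\text{FR }4)
```

The absolute counts rise with the ratio requirement, but the number of behavioural units stays roughly the same, which is the sense in which the PRE is an artifact of counting responses rather than units.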
38. Economic Concepts and Operant Behaviour
- Similarities
- Application of economic theories to behavioural
conditions
39. The Economic Analogy
- Responses or time = money
- Total responses or time possible = income
- Schedule = price
40. Consumer Demand
- Demand curve
- Price of something and how much of it is purchased
- Elasticity of demand (defined below)
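Elasticity of demand has a standard definition, not spelled out on the slide: the percentage change in consumption produced by a percentage change in price.

```latex
\text{elasticity} = \frac{\%\,\Delta\,\text{consumption}}{\%\,\Delta\,\text{price}}
```

Demand is called elastic when the magnitude of this ratio exceeds 1 (consumption drops sharply as the price, i.e., the schedule requirement, rises) and inelastic when it is below 1, which is the sense used on the following slides.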
41. Three Factors in Elasticity of Demand
- 1. Availability of substitutes
- Can't substitute complementary reinforcers
- e.g., food and water
- Can substitute non-complementary reinforcers
- e.g., Coke and Pepsi
- 2. Price range
- e.g., FR 3 to FR 5 vs. FR 30 to FR 50
42. (continued)
- 3. Income level
- The higher the total responses/time available, the less effect cost increases have
- Increased income → purchase of luxury items
- Shurtleff et al. (1987)
- Two VI schedules: food vs. saccharin water
- High schedules: rats spend most of their time on the food lever
- Low schedules: rats increase time on the saccharin lever
43. Behavioural Economics and Drug Abuse
- Addictive drugs
- Nonhuman animal models
- Elasticity
- Work for drug reinforcer on FR schedule
- Inelastic...up to a point
44. (continued)
- Elsmore et al. (1980)
- Baboons
- Food and heroin
- Availability of substitutes