Interval Schedules of Reinforcement

There are two basic types of interval schedules.

A Fixed Interval Schedule provides a reward at consistent times. For example, a child may be rewarded once a week if their room is cleaned up. A problem with this type of reinforcement schedule is that individuals tend to wait until the time when reinforcement will occur and only then begin their responses (Nye, 1992). Because of this, output does not remain constant. In the example above, the child's room may be a mess all week but is cleaned up for the "inspection".

Examples

1. A salaried worker is not completely controlled by the salary because of the many other conditions in the job environment.

2. A teacher schedules exams or projects at regular intervals and the grade is the reinforcer, but the work is inconsistent during the interval between tests.

A Variable Interval Schedule provides reinforcement after random time intervals. This enforces persistence in the behavior over a long period of time. Because rewards are dispensed over a period of time, they average out, but within that period rewards are dispensed unevenly (Carpenter, 1974). For example, you might check the child's room on a random schedule; she would never know when you would check, so the room would stay picked up.

Examples

1. A teacher who gives surprise quizzes or who calls on students to answer oral questions on the average of once every third day.

2. A pigeon will maintain a constant rate of pecking, with little pausing to consume its reinforcers.
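The contrast between the two schedules can be sketched as a small simulation. This is only an illustrative sketch, not something drawn from Nye (1992) or Carpenter (1974): the function names, time units, and the choice of a uniform random gap are all assumptions made here for clarity.

```python
import random

def fixed_interval(interval, horizon):
    # Fixed interval: reinforcement becomes available at the same,
    # fully predictable times (e.g. every 7 days for a weekly room check).
    return list(range(interval, horizon + 1, interval))

def variable_interval(mean_interval, horizon, rng=random):
    # Variable interval: each gap is drawn at random, but the gaps
    # average out to mean_interval over the long run, so any single
    # check is unpredictable (illustrative uniform distribution).
    times, t = [], 0
    while True:
        t += rng.randint(1, 2 * mean_interval - 1)  # mean gap == mean_interval
        if t > horizon:
            break
        times.append(t)
    return times

# A weekly "room inspection" is perfectly predictable...
print(fixed_interval(7, 28))  # [7, 14, 21, 28]
# ...while random checks averaging one per week are not.
print(variable_interval(7, 28, random.Random(0)))
```

The predictability of the first list is exactly what lets the child leave the room a mess until just before the inspection; the unevenness of the second is what keeps the behavior steady.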