The description refers to a schedule of reinforcement that delivers reinforcement after unpredictable periods of time. On a Variable Interval (VI) schedule, the amount of time that must pass before a response is reinforced varies from one reinforcement to the next but averages out to a consistent value. Because the individual cannot predict when reinforcement will become available, the schedule keeps them engaged and encourages steady, persistent responding.
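To make the mechanism concrete, here is a minimal simulation sketch in Python. It assumes responses and intervals are drawn from exponential distributions, and the function name and parameters (mean_interval, response_rate, and so on) are illustrative choices rather than anything taken from the original description.

```python
import random

def simulate_vi_schedule(mean_interval=60.0, session_length=600.0,
                         response_rate=0.2, seed=0):
    """Sketch of a variable-interval (VI) schedule.

    Reinforcement becomes available after a randomly drawn interval whose
    mean is `mean_interval` seconds; the first response emitted after that
    point is reinforced, and a fresh interval is then drawn.
    """
    rng = random.Random(seed)
    t = 0.0
    available_at = rng.expovariate(1.0 / mean_interval)  # first interval
    reinforcer_times = []
    while t < session_length:
        # Responses occur at random moments (roughly response_rate per second).
        t += rng.expovariate(response_rate)
        if t >= available_at:
            reinforcer_times.append(t)  # this response is reinforced
            # Draw the next unpredictable interval with the same long-run average.
            available_at = t + rng.expovariate(1.0 / mean_interval)
    return reinforcer_times

print(simulate_vi_schedule())  # a handful of reinforcer times over a 10-minute session
```

Because a new interval is drawn after each reinforcer, the number of reinforcers earned depends mainly on mean_interval rather than on how quickly the individual responds, which is the defining property of interval schedules.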
For example, checking email follows a VI schedule: new messages arrive at unpredictable times, so the first check after a message has arrived is rewarded while earlier checks are not. Because the person never knows exactly when a reward will be available, this unpredictability sustains regular checking behavior.
In contrast, Fixed Interval (FI) schedules reinforce the first response after a set amount of time has passed, Fixed Ratio (FR) schedules deliver reinforcement after a set number of responses, and Variable Ratio (VR) schedules deliver reinforcement after a varying number of responses; the ratio schedules are based on response count rather than elapsed time. Each of these schedules produces a characteristic pattern of behavior distinct from that of the Variable Interval schedule, as the sketch below illustrates.
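For comparison, the following sketch (a hypothetical helper, not from the original text) shows how the criterion for the next reinforcer differs across the four basic schedules: ratio schedules count responses, interval schedules count elapsed time, and the "variable" versions draw the criterion at random around a fixed average.

```python
import random

def next_requirement(schedule, n=10, t=60.0, rng=None):
    """Draw the criterion that must be met before the next reinforcer.

    Returns a (kind, amount) pair: 'responses' for ratio schedules,
    'seconds' for interval schedules. Parameter names are illustrative.
    """
    rng = rng or random.Random()
    if schedule == "FR":   # fixed ratio: always exactly n responses
        return ("responses", n)
    if schedule == "VR":   # variable ratio: count varies, averaging n
        return ("responses", rng.randint(1, 2 * n - 1))
    if schedule == "FI":   # fixed interval: always exactly t seconds
        return ("seconds", t)
    if schedule == "VI":   # variable interval: time varies, averaging t
        return ("seconds", rng.uniform(0.0, 2.0 * t))
    raise ValueError(f"unknown schedule: {schedule}")
```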