In operant conditioning, a variable-ratio schedule is a schedule of reinforcement where a response is reinforced after an unpredictable number of responses. This schedule creates a steady, high rate of responding. Gambling and lottery games are good examples of a reward based on a variable-ratio schedule.

Schedules of reinforcement play a central role in the operant conditioning process. The frequency with which a behavior is reinforced can help determine how quickly a response is learned as well as how strong the response might be. Each schedule of reinforcement has its own unique set of characteristics.

Illustration by Brianna Gilmartin, Verywell

Characteristics

The variable-ratio schedule has three well-known characteristics:

Leads to a high, steady response rate

Results in only a brief pause after reinforcement

Rewards are provided after an unpredictable number of responses

When identifying different schedules of reinforcement, it can be very helpful to start by looking at the name of the individual schedule itself. In the case of variable-ratio schedules, the term variable indicates that reinforcement is delivered after an unpredictable number of responses. Ratio indicates that reinforcement depends on the number of responses performed, rather than on the passage of time. So together, the term means that reinforcement is delivered after a varied number of responses.

It might also be helpful to contrast the variable-ratio schedule of reinforcement with the fixed-ratio schedule of reinforcement. In a fixed-ratio schedule, reinforcement is provided after a set number of responses.

So, for example, on a VR 5 schedule, an animal might receive a reward after every fifth response, on average. This means that sometimes the reward comes after three responses, sometimes after seven responses, sometimes after five responses, and so on. The schedule averages out to one reward per five responses, but the actual delivery remains completely unpredictable.

In a fixed-ratio schedule, on the other hand, the reinforcement schedule might be set at FR 5. This would mean that a reward is presented after every fifth response. Where the variable-ratio schedule is unpredictable, the fixed-ratio schedule is set at a fixed rate.
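The difference between the two schedules can be sketched in a short simulation. This is a minimal illustration, not drawn from the article: the `simulate_schedule` function and the uniform 3–7 range used to approximate a VR 5 requirement are assumptions chosen for demonstration.

```python
import random

def simulate_schedule(num_responses, next_requirement):
    """Count rewards earned over a run of responses, given a function
    that returns how many responses the next reward requires."""
    rewards = 0
    count = 0
    required = next_requirement()
    for _ in range(num_responses):
        count += 1
        if count >= required:          # requirement met: deliver reward
            rewards += 1
            count = 0
            required = next_requirement()  # draw the next requirement
    return rewards

# FR 5: always exactly five responses per reward (predictable).
fixed_ratio_5 = lambda: 5

# VR 5: requirement varies unpredictably but averages five
# (drawn uniformly from 3-7 here; the range is an illustrative choice).
variable_ratio_5 = lambda: random.randint(3, 7)

random.seed(0)
print(simulate_schedule(1000, fixed_ratio_5))     # exactly 200 rewards
print(simulate_schedule(1000, variable_ratio_5))  # near 200, but varies run to run
```

Over many responses both schedules pay off at about the same overall rate; what differs is that the fixed-ratio learner can predict exactly when the next reward arrives, while the variable-ratio learner cannot.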

Variable-Ratio Schedule

Reinforcement provided after a varying number of responses

Delivery schedule unpredictable

Examples include slot machines, door-to-door sales, video games

Fixed-Ratio Schedule

Reinforcement provided after a set number of responses

Delivery schedule predictable

Examples include production line work, grade card rewards, sales commissions

Examples