Skinner’s Box Experiment (Behaviorism Study)


We receive rewards and punishments for many behaviors. More importantly, once we experience that reward or punishment, we are likely to perform (or not perform) that behavior again in anticipation of the result. 

Psychologists in the late 1800s and early 1900s believed that rewards and punishments were crucial to shaping and encouraging voluntary behavior. But they needed a way to test it. And they needed a name for how rewards and punishments shaped voluntary behaviors. Along came Burrhus Frederic Skinner, the creator of Skinner's Box, and the rest is history.


What Is Skinner's Box?

The "Skinner box" is a setup used in animal experiments. An animal is isolated in a box equipped with a lever or other devices. The animal learns that pressing the lever or displaying specific behaviors can lead to rewards or punishments.

This setup was crucial to how behavioral psychologist B.F. Skinner developed his theories on operant conditioning. It also aided in understanding the concept of reinforcement schedules.

Here, "schedules" refer to the timing and frequency of rewards or punishments, which play a key role in shaping behavior. Skinner's research showed how different schedules impact how animals learn and respond to stimuli.

Who is B.F. Skinner?

Burrhus Frederic Skinner, also known as B.F. Skinner, is considered the “father of Operant Conditioning.” His experiments, conducted in what is known as “Skinner’s box,” are some of the most well-known experiments in psychology. They helped shape the ideas of operant conditioning in behaviorism.

Law of Effect (Thorndike vs. Skinner) 

At the time, classical conditioning was the top theory in behaviorism. However, Skinner knew that research showed that voluntary behaviors could be part of the conditioning process. In the late 1800s, a psychologist named Edward Thorndike wrote about “The Law of Effect.” He said, “Responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation.”

Thorndike tested out The Law of Effect with a box of his own. The box contained a maze and a lever. He placed cats inside the box and fish outside it, then recorded how the cats got out of the box to eat the fish.

Thorndike noticed that the cats would explore the maze and eventually find the lever. The lever would let them out of the box, leading them to the fish faster. Once they discovered this, the cats were more likely to use the lever when they wanted to get to the fish.

Skinner took this idea and ran with it, and the box used in these animal experiments came to be called "Skinner's box."

Why Do We Call This Box the "Skinner Box?"

Edward Thorndike used a box to train animals to perform behaviors for rewards. Later, psychologists like Martin Seligman used this kind of apparatus to observe "learned helplessness." So why is this setup called a "Skinner box"? Skinner not only used Skinner box experiments to show the existence of operant conditioning; he also identified the schedules under which operant conditioning was more or less effective, depending on your goals. And that is why he is called the Father of Operant Conditioning.


How Skinner's Box Worked

Inspired by Thorndike, Skinner created a box to test his theory of Operant Conditioning. (This box is also known as an “operant conditioning chamber.”)

The box was typically very simple. Skinner would place a rat in the box with neutral stimuli (which produced neither reinforcement nor punishment) and a lever that would dispense food. As the rat explored the box, it would eventually stumble upon the lever, activate it, and get food. Skinner observed that the rat was then likely to engage in this behavior again, anticipating food. In some boxes, punishments would also be administered. Martin Seligman's learned helplessness experiments are a great example of using punishments to observe or shape an animal's behavior.

Skinner usually worked with animals like rats or pigeons, and he took his research beyond what Thorndike did: he looked at how reinforcements and schedules of reinforcement influence behavior.

About Reinforcements

Reinforcements are the rewards that satisfy your needs. The fish that cats received outside of Thorndike’s box was positive reinforcement. In Skinner box experiments, pigeons or rats also received food. But positive reinforcements can be anything added after a behavior is performed: money, praise, candy, you name it. Operant conditioning certainly becomes more complicated when it comes to human reinforcements.

Positive vs. Negative Reinforcements 

Skinner also looked at negative reinforcement. Whereas positive reinforcement adds something rewarding, negative reinforcement rewards the subject by taking something unpleasant away. In some Skinner box experiments, he would send an electric current through the box that would shock the rats. If a rat pushed the lever, the shocks would stop. The removal of that pain was a negative reinforcement: the rat gained nothing new when the shocks ended, but the relief itself reinforced the lever pressing. Skinner saw that the rats quickly learned to turn off the shocks by pushing the lever.

About Punishments

Skinner's box was also used to experiment with positive and negative punishments, in which unpleasant things were added or pleasant things were taken away in response to "bad behavior." For now, let's focus on the schedules of reinforcement.

Schedules of Reinforcement 


We know that not every behavior has the same reinforcement every single time. Think about tipping as a rideshare driver or a barista at a coffee shop. You may have a string of customers who tip you generously after conversing with them. At this point, you’re likely to converse with your next customer. But what happens if they don’t tip you after you have a conversation with them? What happens if you stay silent for one ride and get a big tip? 

Psychologists like Skinner wanted to know how quickly a behavior becomes a habit after receiving reinforcement. In other words, how many trips will it take before you converse with passengers every time? They also wanted to know how fast you would stop conversing with passengers if the tips stopped coming. If the rat pulls the lever and doesn't get food, will it stop pulling the lever altogether?

Skinner attempted to answer these questions by looking at different schedules of reinforcement. He would offer positive reinforcement on different schedules, like offering it every time the behavior was performed (continuous reinforcement) or at random (variable-ratio reinforcement). Based on his experiments, he would measure the following:

  • Response rate (how quickly the behavior was performed)
  • Extinction rate (how quickly the behavior would stop) 

He found that there are multiple schedules of reinforcement, and they all yield different results. These schedules explain why your dog may not respond to the treats you sometimes give him, or why gambling can be so addictive. Not every schedule is practical in every situation, and that's okay, too.

Continuous Reinforcement

If you reinforce a behavior every single time it occurs, the response rate is slow and the extinction rate is fast. The behavior is performed only as long as the reinforcement keeps coming. As soon as you stop reinforcing a behavior on this schedule, the behavior stops being performed.

Fixed-Ratio Reinforcement

Let’s say you reinforce the behavior every fourth or fifth time. The response rate is fast, and the extinction rate is medium. The behavior will be performed quickly to reach the reinforcement. 

Fixed-Interval Reinforcement

In the above cases, the reinforcement was given immediately after the behavior was performed. But what if the reinforcement was given at a fixed interval, provided that the behavior was performed at some point? Skinner found that the response rate is medium, and the extinction rate is medium. 

Variable-Ratio Reinforcement

Here's how gambling becomes so unpredictable and addictive. In gambling, you experience occasional wins, but you often face losses. This uncertainty keeps you hooked, not knowing when the next big win, or dopamine hit, will come. The behavior gets reinforced randomly. When gambling, your response is quick, but it takes a long time to stop wanting to gamble. This randomness is a key reason why gambling is highly addictive.

Variable-Interval Reinforcement

Lastly, in variable-interval reinforcement, the reinforcement is given out at random intervals, provided that the behavior is performed. Health inspections and secret shoppers are commonly used examples. The reinforcement could be administered five minutes after the behavior is performed or seven hours after. Skinner found that the response rate for this schedule is fast, and the extinction rate is slow.
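As a rough illustration of how these schedules differ (the function names, ratios, and intervals below are invented for this sketch, not taken from Skinner's work), each schedule can be written as a simple rule that decides whether a given response earns a reward:

```python
import random

def make_schedule(kind, n=4, interval=10.0):
    """Return a function reward(t) -> bool deciding whether the
    response at time t is reinforced. Illustrative only."""
    state = {"count": 0, "next_time": interval}
    def reward(t):
        if kind == "continuous":
            return True                       # every response reinforced
        if kind == "fixed_ratio":
            state["count"] += 1
            if state["count"] >= n:           # every nth response
                state["count"] = 0
                return True
            return False
        if kind == "variable_ratio":
            return random.random() < 1.0 / n  # on average every nth response
        if kind == "fixed_interval":
            if t >= state["next_time"]:       # first response after the interval
                state["next_time"] = t + interval
                return True
            return False
        if kind == "variable_interval":
            if t >= state["next_time"]:       # interval varies around a mean
                state["next_time"] = t + random.uniform(0, 2 * interval)
                return True
            return False
        raise ValueError(kind)
    return reward

# A fixed-ratio-4 schedule pays off on every 4th lever press:
fr4 = make_schedule("fixed_ratio", n=4)
presses = [fr4(t) for t in range(1, 9)]
print(presses)  # [False, False, False, True, False, False, False, True]
```

The two variable schedules are the unpredictable ones, which is exactly why (as described above) they are the hardest to extinguish.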

Skinner's Box and Pigeon Pilots in World War II

Yes, you read that right. Skinner's work with pigeons and other animals in his boxes had real-life effects. After some time training pigeons, B.F. Skinner got an idea. Pigeons were easy to train, could see very well in flight, and were calm creatures that didn't panic in intense situations. Their skills could be applied to the war raging around him.

B.F. Skinner decided to create a missile that pigeons would operate. That's right. The U.S. military was having trouble accurately targeting missiles, and B.F. Skinner believed pigeons could help. He believed he could train the pigeons to recognize a target and peck when they saw it. As the pigeons pecked, Skinner's specially designed cockpit would navigate appropriately. Pigeons could be pilots in World War II missions, fighting Nazi Germany.

When Skinner proposed this idea to the military, he was met with skepticism. Yet he received $25,000 to start his work on "Project Pigeon." The device worked: operant conditioning trained pigeons to guide missiles accurately to their targets. Unfortunately, there was one problem. Each mission would kill its pigeons when the missile struck, so the program would require a lot of pigeons! The military eventually passed on the project, but cockpit prototypes are on display at the American History Museum. Pretty cool, huh?

Examples of Operant Conditioning in Everyday Life

Not every example of operant conditioning has to end in dropping missiles. Nor does it have to happen in a box in a laboratory! You might find that you have used operant conditioning on yourself, a pet, or a child whose behavior changes with rewards and punishments. These operant conditioning examples will look into what this process can do for behavior and personality.

Hot Stove: If you put your hand on a hot stove, you will get burned. More importantly, you are very unlikely to put your hand on that hot stove again. Even though no one has made that stove hot as a punishment, the process still works.

Tips: If you converse with a passenger while driving for Uber, you might get an extra tip at the end of your ride. That's certainly a great reward! You will likely keep conversing with passengers as you drive for Uber. The same type of behavior applies to any service worker who gets tips!

Training a Dog: If your dog sits when you say “sit,” you might give him a treat. More importantly, he is then likely to sit the next time you say “sit.” (This can become a form of variable-ratio reinforcement: you likely only treat your dog 50-90% of the time he sits. If you gave a dog a treat every time he sat, he probably wouldn't have room for breakfast or dinner!)

Operant Conditioning Is Everywhere!

We see operant conditioning training us everywhere, intentionally or unintentionally! Game makers and app developers design their products around the "reward" our brains feel when we see notifications or check into an app. Schoolteachers use rewards to manage unruly classes. Dog training isn't always that different from training your child to do chores. We know why this happens thanks to experiments like the ones performed in Skinner's box.


Operant Conditioning: What It Is, How It Works, and Examples

Saul McLeod, PhD, Editor-in-Chief for Simply Psychology

Olivia Guy-Evans, MSc, Associate Editor for Simply Psychology

Operant conditioning, or instrumental conditioning, is a theory of learning where behavior is influenced by its consequences. Behavior that is reinforced (rewarded) will likely be repeated, and behavior that is punished will occur less frequently.

By the 1920s, John B. Watson had left academic psychology, and other behaviorists were becoming influential, proposing new forms of learning other than classical conditioning. Perhaps the most important of these was Burrhus Frederic Skinner. Although, for obvious reasons, he is more commonly known as B.F. Skinner.

Skinner’s views were slightly less extreme than Watson’s (1913). Skinner believed that we do have such a thing as a mind, but that it is simply more productive to study observable behavior rather than internal mental events.

Skinner’s work was rooted in the view that classical conditioning was far too simplistic to fully explain complex human behavior. He believed that the best way to understand behavior is to examine its causes and consequences. He called this approach operant conditioning.


How It Works

Skinner is regarded as the father of Operant Conditioning, but his work was based on Thorndike’s (1898) Law of Effect. According to this principle, behavior that is followed by pleasant consequences is likely to be repeated, and behavior followed by unpleasant consequences is less likely to be repeated.

Skinner introduced a new term into the Law of Effect – Reinforcement. Behavior that is reinforced tends to be repeated (i.e., strengthened); behavior that is not reinforced tends to die out or be extinguished (i.e., weakened).

Skinner (1948) studied operant conditioning by conducting experiments using animals, which he placed in a “Skinner box” similar to Thorndike’s puzzle box.

[Diagram: a Skinner box, or operant conditioning chamber]

A Skinner box, also known as an operant conditioning chamber, is a device used to objectively record an animal’s behavior in a compressed time frame. An animal can be rewarded or punished for engaging in certain behaviors, such as lever pressing (for rats) or key pecking (for pigeons).

Skinner identified three types of responses, or operants, that can follow behavior.

  • Neutral operants : Responses from the environment that neither increase nor decrease the probability of a behavior being repeated.
  • Reinforcers : Responses from the environment that increase the probability of a behavior being repeated. Reinforcers can be either positive or negative.
  • Punishers : Responses from the environment that decrease the likelihood of a behavior being repeated. Punishment weakens behavior.

We can all think of examples of how reinforcers and punishers have affected our behavior. As a child, you probably tried out a number of behaviors and learned from their consequences.

For example, when you were younger, if you tried smoking at school, and the chief consequence was that you got in with the crowd you always wanted to hang out with, you would have been positively reinforced (i.e., rewarded) and would be likely to repeat the behavior.

If, however, the main consequence was that you were caught, caned, suspended from school, and your parents became involved, you would most certainly have been punished, and you would consequently be much less likely to smoke now.

Positive Reinforcement

B. F. Skinner’s theory of operant conditioning describes positive reinforcement. In positive reinforcement, a response or behavior is strengthened by rewards, leading to the repetition of the desired behavior. The reward is a reinforcing stimulus.

Primary reinforcers are stimuli that are naturally reinforcing because they are not learned and directly satisfy a need, such as food or water.

Secondary reinforcers are stimuli that become reinforcing through their association with a primary reinforcer, such as money or school grades. They do not directly satisfy an innate need, but they provide the means to obtain primary reinforcers. A secondary reinforcer can therefore be just as powerful a motivator as a primary reinforcer.

Skinner showed how positive reinforcement worked by placing a hungry rat in his Skinner box. The box contained a lever on the side, and as the rat moved about the box, it would accidentally knock the lever. As soon as it did so, a food pellet would drop into a container next to the lever.

After being put in the box a few times, the rats quickly learned to go straight to the lever. The consequence of receiving food if they pressed the lever ensured that they would repeat the action again and again.

Positive reinforcement strengthens a behavior by providing a consequence an individual finds rewarding. For example, if your teacher gives you £5 each time you complete your homework (i.e., a reward), you will be more likely to repeat this behavior in the future, thus strengthening the behavior of completing your homework.

The Premack principle is a form of positive reinforcement in operant conditioning. It suggests using a preferred activity (high-probability behavior) as a reward for completing a less preferred one (low-probability behavior).

This method incentivizes the less desirable behavior by associating it with a desirable outcome, thus strengthening the less favored behavior.


Negative Reinforcement

Negative reinforcement is the termination of an unpleasant state following a response.

This is known as negative reinforcement because it is the removal of an adverse stimulus which is ‘rewarding’ to the animal or person. Negative reinforcement strengthens behavior because it stops or removes an unpleasant experience.

For example, if you do not complete your homework, you give your teacher £5. You will complete your homework to avoid paying £5, thus strengthening the behavior of completing your homework.

Skinner showed how negative reinforcement worked by placing a rat in his Skinner box and then subjecting it to an unpleasant electric current which caused it some discomfort. As the rat moved about the box it would accidentally knock the lever.

As soon as it did so, the electric current would be switched off. After being put in the box a few times, the rats quickly learned to go straight to the lever. The consequence of escaping the electric current ensured that they would repeat the action again and again.

In fact, Skinner even taught the rats to avoid the electric current by turning on a light just before the electric current came on. The rats soon learned to press the lever when the light came on because they knew that this would stop the electric current from being switched on.

These two learned responses are known as Escape Learning and Avoidance Learning.

Punishment

Punishment is the opposite of reinforcement since it is designed to weaken or eliminate a response rather than increase it. It is an aversive event that decreases the behavior that it follows.

Like reinforcement, punishment can work either by directly applying an unpleasant stimulus like a shock after a response or by removing a potentially rewarding stimulus, for instance, deducting someone’s pocket money to punish undesirable behavior.

Note : It is not always easy to distinguish between punishment and negative reinforcement.

There are two distinct methods of punishment used to decrease the likelihood of a specific behavior occurring again, and they involve different types of consequences:

Positive Punishment :

  • Positive punishment involves adding an aversive stimulus or something unpleasant immediately following a behavior to decrease the likelihood of that behavior happening in the future.
  • It aims to weaken the target behavior by associating it with an undesirable consequence.
  • Example : A child receives a scolding (an aversive stimulus) from their parent immediately after hitting their sibling. This is intended to decrease the likelihood of the child hitting their sibling again.

Negative Punishment :

  • Negative punishment involves removing a desirable stimulus or something rewarding immediately following a behavior to decrease the likelihood of that behavior happening in the future.
  • It aims to weaken the target behavior by taking away something the individual values or enjoys.
  • Example : A teenager loses their video game privileges (a desirable stimulus) for not completing their chores. This is intended to decrease the likelihood of the teenager neglecting their chores in the future.

There are many problems with using punishment, such as:
  • Punished behavior is not forgotten, it’s suppressed – behavior returns when punishment is no longer present.
  • Causes increased aggression – shows that aggression is a way to cope with problems.
  • Creates fear that can generalize to undesirable behaviors, e.g., fear of school.
  • Does not necessarily guide you toward desired behavior – reinforcement tells you what to do, and punishment only tells you what not to do.

Examples of Operant Conditioning

Positive Reinforcement : Suppose you are a coach and want your team to improve their passing accuracy in soccer. When the players execute accurate passes during training, you praise their technique. This positive feedback encourages them to repeat the correct passing behavior.

Negative Reinforcement : If you notice your team working together effectively and exhibiting excellent team spirit during a tough training session, you might end the training session earlier than planned, which the team perceives as a relief. They understand that teamwork leads to positive outcomes, reinforcing team behavior.

Negative Punishment : If an office worker continually arrives late, their manager might revoke the privilege of flexible working hours. This removal of a positive stimulus encourages the employee to be punctual.

Positive Reinforcement : Training a cat to use a litter box can be achieved by giving it a treat each time it uses it correctly. The cat will associate the behavior with the reward and will likely repeat it.

Negative Punishment : If teenagers stay out past their curfew, their parents might take away their gaming console for a week. This makes the teenager more likely to respect their curfew in the future to avoid losing something they value.

Ineffective Punishment : Your child refuses to finish their vegetables at dinner. You punish them by not allowing dessert, but the child still refuses to eat vegetables next time. The punishment seems ineffective.

Premack Principle Application : You could motivate your child to eat vegetables by offering an activity they love after they finish their meal. For instance, for every vegetable eaten, they get an extra five minutes of video game time. They value video game time, which might encourage them to eat vegetables.

Other Premack Principle Examples :

  • A student who dislikes history but loves art might earn extra time in the art studio for each history chapter reviewed.
  • For every 10 minutes a person spends on household chores, they can spend 5 minutes on a favorite hobby.
  • For each successful day of healthy eating, an individual allows themselves a small piece of dark chocolate at the end of the day.
  • A child can choose between taking out the trash or washing the dishes. Giving them the choice makes them more likely to complete the chore willingly.

Skinner’s Pigeon Experiment

B.F. Skinner conducted several experiments with pigeons to demonstrate the principles of operant conditioning.

One of the most famous of these experiments is often colloquially referred to as “Superstition in the Pigeon.”

This experiment was conducted to explore the effects of non-contingent reinforcement on pigeons, leading to some fascinating observations that can be likened to human superstitions.

Non-contingent reinforcement (NCR) refers to a method in which rewards (or reinforcements) are delivered independently of the individual’s behavior. In other words, the reinforcement is given at set times or intervals, regardless of what the individual is doing.

The Experiment:

  • Pigeons were brought to a state of hunger, reduced to 75% of their well-fed weight.
  • They were placed in a cage with a food hopper that could be presented for five seconds at a time.
  • Instead of the food being given as a result of any specific action by the pigeon, it was presented at regular intervals, regardless of the pigeon’s behavior.

Observation:

  • Over time, Skinner observed that the pigeons began to associate whatever random action they were doing when food was delivered with the delivery of the food itself.
  • This led the pigeons to repeat these actions, believing (in anthropomorphic terms) that their behavior was causing the food to appear.
  • In most cases, pigeons developed different “superstitious” behaviors or rituals. For instance, one pigeon would turn counter-clockwise between food presentations, while another would thrust its head into a cage corner.
  • These behaviors did not appear until the food hopper was introduced and presented periodically.
  • These behaviors were not initially related to the food delivery but became linked in the pigeon’s mind due to the coincidental timing of the food dispensing.
  • The behaviors seemed to be associated with the environment, suggesting the pigeons were responding to certain aspects of their surroundings.
  • The rate of reinforcement (how often the food was presented) played a significant role. Shorter intervals between food presentations led to more rapid and defined conditioning.
  • Once a behavior was established, the interval between reinforcements could be increased without diminishing the behavior.

Superstitious Behavior:

The pigeons began to act as if their behaviors had a direct effect on the presentation of food, even though there was no such connection. This is likened to human superstitions, where rituals are believed to change outcomes, even if they have no real effect.

For example, a card player might have rituals to change their luck, or a bowler might make gestures believing they can influence a ball already in motion.

Conclusion:

This experiment demonstrates that behaviors can be conditioned even without a direct cause-and-effect relationship. Just like humans, pigeons can develop “superstitious” behaviors based on coincidental occurrences.

This study not only illuminates the intricacies of operant conditioning but also draws parallels between animal and human behaviors in the face of random reinforcements.
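The mechanism Skinner described, in which accidental pairings strengthen whatever the bird happened to be doing when food arrived, can be caricatured in a few lines of code. This is only a toy model; the action labels, weights, and 15-tick food interval are invented for the sketch:

```python
import random

random.seed(42)

ACTIONS = ["turn", "peck", "head_thrust", "hop"]  # illustrative labels

# Every action starts equally likely. Food arrives every 15 ticks
# regardless of behavior (non-contingent reinforcement).
weights = {a: 1.0 for a in ACTIONS}

def choose():
    """Pick an action with probability proportional to its weight."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    for action, w in weights.items():
        r -= w
        if r <= 0:
            return action
    return ACTIONS[-1]

for tick in range(1, 301):
    action = choose()
    if tick % 15 == 0:
        # Whatever the pigeon happened to be doing gets "credited".
        weights[action] += 1.0

print(max(weights, key=weights.get))  # one arbitrary ritual dominates
```

Because a credited action becomes more likely to be performed, it also becomes more likely to be credited again, so one arbitrary ritual tends to dominate, mirroring the counter-clockwise turns and head thrusts Skinner observed.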

Schedules of Reinforcement

Imagine a rat in a “Skinner box.” In operant conditioning, if no food pellet is delivered immediately after the lever is pressed, then after several attempts, the rat stops pressing the lever (how long would someone continue to go to work if their employer stopped paying them?). The behavior has been extinguished.

Behaviorists discovered that different patterns (or schedules) of reinforcement had different effects on the speed of learning and extinction. Ferster and Skinner (1957) devised different ways of delivering reinforcement and found that these schedules affected:

1. The Response Rate – The rate at which the rat pressed the lever (i.e., how hard the rat worked).

2. The Extinction Rate – The rate at which lever pressing dies out (i.e., how soon the rat gave up).

How Reinforcement Schedules Work

Skinner found that variable-ratio reinforcement produces the slowest rate of extinction (i.e., people will continue repeating the behavior for the longest time without reinforcement). The type of reinforcement with the quickest rate of extinction is continuous reinforcement.

(A) Continuous Reinforcement

An animal or human is positively reinforced every time a specific behavior occurs, e.g., every time the lever is pressed, a pellet is delivered (until food delivery is shut off to test extinction).

  • Response rate is SLOW
  • Extinction rate is FAST

(B) Fixed Ratio Reinforcement

Behavior is reinforced only after it occurs a specified number of times, e.g., one reinforcement is given after every 5th response. For example, a child receives a star for every five words spelled correctly.

  • Response rate is FAST
  • Extinction rate is MEDIUM

(C) Fixed Interval Reinforcement

One reinforcement is given after a fixed time interval, provided at least one correct response has been made. An example is being paid by the hour. Another example: every 15 minutes (half hour, hour, etc.), a pellet is delivered (provided at least one lever press has been made), then food delivery is shut off.

  • Response rate is MEDIUM
  • Extinction rate is MEDIUM

(D) Variable Ratio Reinforcement

Behavior is reinforced after an unpredictable number of responses, e.g., gambling or fishing.

  • Response rate is FAST
  • Extinction rate is SLOW (very hard to extinguish because of unpredictability)

(E) Variable Interval Reinforcement

Provided one correct response has been made, reinforcement is given after an unpredictable amount of time has passed, e.g., on average every 5 minutes. An example is a self-employed person being paid at unpredictable times.

  • Response rate is FAST
  • Extinction rate is SLOW

Applications In Psychology

1. Behavior Modification Therapy

Behavior modification is a set of therapeutic techniques based on operant conditioning (Skinner, 1938, 1953). The main principle comprises changing environmental events that are related to a person’s behavior. For example, the reinforcement of desired behaviors and ignoring or punishing undesired ones.

This is not as simple as it sounds — always reinforcing desired behavior, for example, is basically bribery.

There are different types of positive reinforcement. Primary reinforcement is when a reward strengthens a behavior by itself. Secondary reinforcement is when something strengthens a behavior because it leads to a primary reinforcer.

Examples of behavior modification therapy include token economy and behavior shaping.

Token Economy

Token economy is a system in which targeted behaviors are reinforced with tokens (secondary reinforcers) and later exchanged for rewards (primary reinforcers).

Tokens can be in the form of fake money, buttons, poker chips, stickers, etc. While the rewards can range anywhere from snacks to privileges or activities. For example, teachers use token economy at primary school by giving young children stickers to reward good behavior.

Token economies have been found to be very effective in managing psychiatric patients. However, patients can become over-reliant on the tokens, making it difficult for them to adjust to society once they leave the hospital, prison, etc.

Staff implementing a token economy program have a lot of power. It is important that staff do not favor or ignore certain individuals if the program is to work. Therefore, staff need to be trained to give tokens fairly and consistently even when there are shift changes such as in prisons or in a psychiatric hospital.
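A token economy of the kind described above can be sketched as a small ledger. This is a minimal illustrative sketch (the class and method names are hypothetical): tokens act as secondary reinforcers that are later exchanged for primary reinforcers.

```python
class TokenEconomy:
    """Minimal token-economy ledger: tokens (secondary reinforcers)
    are earned for target behaviors and exchanged for rewards
    (primary reinforcers)."""

    def __init__(self, rewards):
        # rewards: mapping of reward name -> token cost,
        # e.g., {"snack": 3, "extra playtime": 10}
        self.rewards = rewards
        self.tokens = {}

    def reinforce(self, person, n=1):
        # Give n tokens immediately after the target behavior occurs.
        self.tokens[person] = self.tokens.get(person, 0) + n

    def exchange(self, person, reward):
        # Trade tokens for a primary reinforcer, if affordable.
        cost = self.rewards[reward]
        if self.tokens.get(person, 0) >= cost:
            self.tokens[person] -= cost
            return reward
        return None
```

Consistency matters in the code just as it does for staff running a real program: tokens must be given fairly and immediately after the target behavior for them to retain their reinforcing value.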

Behavior Shaping

A further important contribution made by Skinner (1951) is the notion of behavior shaping through successive approximation.

Skinner argues that the principles of operant conditioning can be used to produce extremely complex behavior if rewards and punishments are delivered in such a way as to move an organism closer and closer to the desired behavior each time.

In shaping, the form of an existing response is gradually changed across successive trials by rewarding ever-closer approximations of the desired target behavior.

To do this, the conditions (or contingencies) required to receive the reward should shift each time the organism moves a step closer to the desired behavior.

According to Skinner, most animal and human behavior (including language) can be explained as a product of this type of successive approximation.

2. Educational Applications

In the conventional learning situation, operant conditioning applies largely to issues of class and student management, rather than to learning content. It is very relevant to shaping skill performance.

A simple way to shape behavior is to provide feedback on learner performance, e.g., compliments, approval, encouragement, and affirmation.

A variable-ratio schedule produces the highest response rate for students learning a new task: initial reinforcement (e.g., praise) occurs at frequent intervals, and as performance improves, reinforcement occurs less frequently, until eventually only exceptional outcomes are reinforced.

For example, if a teacher wanted to encourage students to answer questions in class they should praise them for every attempt (regardless of whether their answer is correct). Gradually the teacher will only praise the students when their answer is correct, and over time only exceptional answers will be praised.

Unwanted behaviors, such as tardiness and dominating class discussion, can be extinguished by being ignored by the teacher (rather than being reinforced by having attention drawn to them). This is not an easy task, as the teacher may appear insincere if he or she thinks too much about the way to behave.

Knowledge of success is also important as it motivates future learning. However, it is important to vary the type of reinforcement given so that the behavior is maintained.


Operant Conditioning vs. Classical Conditioning

Learning Type

While both types of conditioning involve learning, classical conditioning is passive (automatic response to stimuli), while operant conditioning is active (behavior is influenced by consequences).

  • Classical conditioning links an involuntary response with a stimulus. It happens passively on the part of the learner, without rewards or punishments. An example is a dog salivating at the sound of a bell associated with food.
  • Operant conditioning connects voluntary behavior with a consequence. Operant conditioning requires the learner to actively participate and perform some type of action to be rewarded or punished. It’s active, with the learner’s behavior influenced by rewards or punishments. An example is a dog sitting on command to get a treat.

Learning Process

Classical conditioning involves learning through associating stimuli resulting in involuntary responses, while operant conditioning focuses on learning through consequences, shaping voluntary behaviors.

In classical conditioning, a neutral stimulus is repeatedly paired with an unconditioned stimulus. Over time, the person responds to the neutral stimulus as if it were the unconditioned stimulus, even when it is presented alone. The response is involuntary and automatic.

An example is a dog salivating (response) at the sound of a bell (neutral stimulus) after it has been repeatedly paired with food (unconditioned stimulus).

In operant conditioning, behavior followed by pleasant consequences (rewards) is more likely to be repeated, while behavior followed by unpleasant consequences (punishments) is less likely to be repeated.

For instance, if a child gets praised (pleasant consequence) for cleaning their room (behavior), they’re more likely to clean their room in the future.

Conversely, if they get scolded (unpleasant consequence) for not doing their homework, they’re more likely to complete it next time to avoid the scolding.

Timing of Stimulus & Response

The timing of the response relative to the stimulus differs between classical and operant conditioning:

Classical conditioning (response after the stimulus): In this form of conditioning, the response occurs after the stimulus. The behavior (response) is determined by what precedes it (stimulus).

For example, in Pavlov’s classic experiment, the dogs started to salivate (response) after they heard the bell (stimulus) because they associated it with food.

Operant conditioning (behavior before the consequence): Here, the anticipated consequence influences the behavior that precedes it. It is a more active form of learning, in which behaviors are reinforced or punished, thus influencing their likelihood of repetition.

For example, a child might behave well (behavior) in anticipation of a reward (consequence), or avoid a certain behavior to prevent a potential punishment.

Looking at Skinner’s classic studies on pigeons’  and rats’ behavior, we can identify some of the major assumptions of the behaviorist approach .

  • Psychology should be seen as a science, to be studied in a scientific manner. Skinner's study of behavior in rats was conducted under carefully controlled laboratory conditions.
  • Behaviorism is primarily concerned with observable behavior, as opposed to internal events like thinking and emotion. Note that Skinner did not say that the rats learned to press a lever because they wanted food. He instead concentrated on describing the easily observed behavior that the rats acquired.
  • The major influence on human behavior is learning from our environment. In the Skinner study, because food followed a particular behavior, the rats learned to repeat that behavior, e.g., operant conditioning.
  • There is little difference between the learning that takes place in humans and that in other animals. Therefore, research (e.g., on operant conditioning) can be carried out on animals (rats/pigeons) as well as on humans. Skinner proposed that the way humans learn behavior is much the same as the way the rats learned to press a lever.

So, if your layperson’s idea of psychology has always been of people in laboratories wearing white coats and watching hapless rats try to negotiate mazes to get to their dinner, then you are probably thinking of behavioral psychology.

Behaviorism and its offshoots tend to be among the most scientific of the psychological perspectives . The emphasis of behavioral psychology is on how we learn to behave in certain ways.

We are all constantly learning new behaviors and how to modify our existing behavior. Behavioral psychology is the psychological approach that focuses on how this learning takes place.

Critical Evaluation

Operant conditioning can explain a wide variety of behaviors, from the learning process to addiction and language acquisition. It also has practical applications (such as token economies) that can be used in classrooms, prisons, and psychiatric hospitals.

Researchers have found innovative ways to apply operant conditioning principles to promote health and habit change in humans.

In a recent study, operant conditioning using virtual reality (VR) helped stroke patients use their weakened limb more often during rehabilitation. Patients shifted their weight in VR games by maneuvering a virtual object. When they increased weight on their weakened side, they received rewards like stars. This positive reinforcement conditioned greater paretic limb use (Kumar et al., 2019).

Another study utilized operant conditioning to assist smoking cessation. Participants earned vouchers exchangeable for goods and services for reducing smoking. This reward system reinforced decreasing cigarette use. Many participants achieved long-term abstinence (Dallery et al., 2017).

Through repeated reinforcement, operant conditioning can facilitate forming exercise and eating habits. A person trying to exercise more might earn TV time for every 10 minutes spent working out. An individual aiming to eat healthier may allow themselves a daily dark chocolate square for sticking to nutritious meals. Providing consistent rewards for desired actions can instill new habits (Michie et al., 2009).

Apps like Habitica apply operant conditioning by gamifying habit tracking. Users earn points and collect rewards in a fantasy game for completing real-life habits. This virtual reinforcement helps ingrain positive behaviors (Eckerstorfer et al., 2019).

Operant conditioning also shows promise for managing ADHD and OCD. Rewarding concentration and focus in children with ADHD, for example, can strengthen their attention skills (Rosén et al., 2018). Similarly, reinforcing OCD patients for resisting compulsions may diminish obsessive behaviors (Twohig et al., 2010).

However, operant conditioning fails to take into account the role of inherited and cognitive factors in learning, and thus is an incomplete explanation of the learning process in humans and animals.

For example, Kohler (1924) found that primates often seem to solve problems in a flash of insight rather than by trial-and-error learning. Also, social learning theory (Bandura, 1977) suggests that humans can learn through observation rather than through personal experience.

The use of animal research in operant conditioning studies also raises the issue of extrapolation. Some psychologists argue we cannot generalize from studies on animals to humans as their anatomy and physiology are different from humans, and they cannot think about their experiences and invoke reason, patience, memory or self-comfort.

Frequently Asked Questions

Who discovered operant conditioning?

Operant conditioning was discovered by B.F. Skinner, an American psychologist, in the mid-20th century. Skinner is often regarded as the father of operant conditioning, and his work extensively dealt with the mechanism of reward and punishment for behaviors, with the concept being that behaviors followed by positive outcomes are reinforced, while those followed by negative outcomes are discouraged.

How does operant conditioning differ from classical conditioning?

Operant conditioning differs from classical conditioning in focusing on how voluntary behavior is shaped and maintained by consequences, such as rewards and punishments.

In operant conditioning, a behavior is strengthened or weakened based on the consequences that follow it. In contrast, classical conditioning involves associating a neutral stimulus with a naturally occurring stimulus, creating a new learned response.

While both types of conditioning involve learning and behavior modification, operant conditioning emphasizes the role of reinforcement and punishment in shaping voluntary behavior.

How does operant conditioning relate to social learning theory?

Operant conditioning is a core component of social learning theory , which emphasizes the importance of observational learning and modeling in acquiring and modifying behavior.

Social learning theory suggests that individuals can learn new behaviors by observing others and the consequences of their actions, which is similar to the reinforcement and punishment processes in operant conditioning.

By observing and imitating models, individuals can acquire new skills and behaviors and modify their own behavior based on the outcomes they observe in others.

Overall, both operant conditioning and social learning theory highlight the importance of environmental factors in shaping behavior and learning.

What are the downsides of operant conditioning?

The downsides of using operant conditioning on individuals include the potential for unintended negative consequences, particularly with the use of punishment. Punishment may lead to increased aggression or avoidance behaviors.

Additionally, some behaviors may be difficult to shape or modify using operant conditioning techniques, particularly when they are highly ingrained or tied to complex internal states.

Furthermore, individuals may resist changing their behaviors to meet the expectations of others, particularly if they perceive the demands or consequences of the reinforcement or punishment to be undesirable or unjust.

What is an application of bf skinner’s operant conditioning theory?

An application of B.F. Skinner’s operant conditioning theory is seen in education and classroom management. Teachers use positive reinforcement (rewards) to encourage good behavior and academic achievement, and negative reinforcement or punishment to discourage disruptive behavior.

For example, a student may earn extra recess time (positive reinforcement) for completing homework on time, or lose the privilege to use class computers (negative punishment) for misbehavior.

Further Reading

  • Ayllon, T., & Michael, J. (1959). The psychiatric nurse as a behavioral engineer. Journal of the Experimental Analysis of Behavior, 2(4), 323-334.
  • Bandura, A. (1977). Social learning theory . Englewood Cliffs, NJ: Prentice Hall.
  • Dallery, J., Meredith, S., & Glenn, I. M. (2017). A deposit contract method to deliver abstinence reinforcement for cigarette smoking. Journal of Applied Behavior Analysis, 50 (2), 234–248.
  • Eckerstorfer, L., Tanzer, N. K., Vogrincic-Haselbacher, C., Kedia, G., Brohmer, H., Dinslaken, I., & Corbasson, R. (2019). Key elements of mHealth interventions to successfully increase physical activity: Meta-regression. JMIR mHealth and uHealth, 7 (11), e12100.
  • Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement . New York: Appleton-Century-Crofts.
  • Kohler, W. (1924). The mentality of apes. London: Routledge & Kegan Paul.
  • Kumar, D., Sinha, N., Dutta, A., & Lahiri, U. (2019). Virtual reality-based balance training system augmented with operant conditioning paradigm.  Biomedical Engineering Online ,  18 (1), 1-23.
  • Michie, S., Abraham, C., Whittington, C., McAteer, J., & Gupta, S. (2009). Effective techniques in healthy eating and physical activity interventions: A meta-regression. Health Psychology, 28 (6), 690–701.
  • Rosén, E., Westerlund, J., Rolseth, V., Johnson R. M., Viken Fusen, A., Årmann, E., Ommundsen, R., Lunde, L.-K., Ulleberg, P., Daae Zachrisson, H., & Jahnsen, H. (2018). Effects of QbTest-guided ADHD treatment: A randomized controlled trial. European Child & Adolescent Psychiatry, 27 (4), 447–459.
  • Schunk, D. (2016).  Learning theories: An educational perspective . Pearson.
  • Skinner, B. F. (1938). The behavior of organisms: An experimental analysis . New York: Appleton-Century.
  • Skinner, B. F. (1948). 'Superstition' in the pigeon. Journal of Experimental Psychology, 38(2), 168-172.
  • Skinner, B. F. (1951). How to teach animals . Freeman.
  • Skinner, B. F. (1953). Science and human behavior . Macmillan.
  • Thorndike, E. L. (1898). Animal intelligence: An experimental study of the associative processes in animals. Psychological Monographs: General and Applied, 2(4), i-109.
  • Twohig, M. P., Whittal, M. L., Cox, J. M., & Gunter, R. (2010). An initial investigation into the processes of change in ACT, CT, and ERP for OCD. International Journal of Behavioral Consultation and Therapy, 6 (2), 67–83.
  • Watson, J. B. (1913). Psychology as the behaviorist views it . Psychological Review, 20 , 158–177.


Skinner’s theory on Operant Conditioning


After John B. Watson's retirement from academic psychology, psychologists and behaviorists were eager to propose new forms of learning other than classical conditioning. The most important among these theories was Operant Conditioning, proposed by Burrhus Frederic Skinner, commonly known as B.F. Skinner.

Burrhus Frederic Skinner

Skinner based his theory on the simple fact that studying observable behavior is much easier than trying to study internal mental events. His work was far less extreme than Watson's (1913), but he deemed classical conditioning too simplistic to be a complete explanation of complex human behavior.

B.F. Skinner is famous for his pioneering research in the field of learning and behavior. He proposed studying complex human behavior through the voluntary responses an organism shows when placed in a certain environment, and he named these behaviors or responses "operants." He is also called the father of Operant Conditioning Learning, but he based his theory on the "Law of Effect" discovered by Edward Thorndike.

Operant Conditioning Learning

B.F. Skinner proposed his theory on operant conditioning by conducting various experiments on animals. He used a special box known as “Skinner Box” for his experiment on rats.

As the first step of his experiment, he placed a hungry rat inside the Skinner box. The rat was initially inactive, but as it began to adapt to the environment of the box, it started to explore. Eventually, the rat discovered a lever which, when pressed, released food into the box. Once its hunger was satisfied, the rat started exploring the box again, and after a while it pressed the lever a second time as it grew hungry again. This continued for the third, fourth, and fifth presses, until eventually the rat pressed the lever immediately upon being placed in the box. At that point the conditioning was deemed complete.

Here, the action of pressing the lever is an operant response/behavior, and the food released inside the chamber is the reward. The experiment is also known as  Instrumental Conditioning Learning  as the response is instrumental in getting food.

This experiment also demonstrates the effects of positive reinforcement. Upon pressing the lever, the hungry rat was served food, which satisfied its hunger; the food therefore acted as a positive reinforcer.


B.F. Skinner’s Second Experiment

B.F. Skinner also conducted an experiment that explained negative reinforcement. He placed a rat in a chamber in a similar manner, but instead of keeping it hungry, he subjected the chamber to an unpleasant electric current. The rat, experiencing the discomfort, moved desperately around the box and accidentally knocked the lever. Pressing the lever immediately stopped the unpleasant current. After a few trials, the rat had learned to go directly to the lever to escape the discomfort.

The termination of the electric current acted as the negative reinforcement, and the consequence of escaping the current ensured that the rat repeated the action again and again. Here too, pressing the lever is the operant response, and the cessation of the electric current is its reward.

Both experiments clearly illustrate the workings of operant conditioning. The important part of any operant conditioning learning is to recognize the operant behavior and the consequence it produces in that particular environment.


B. F. Skinner


“To say that a reinforcement is contingent upon a response may mean nothing more than that it follows the response. It may follow because of some mechanical connection or because of the mediation of another organism; but conditioning takes place presumably because of the temporal relation only, expressed in terms of the order and proximity of response and reinforcement. Whenever we present a state of affairs which is known to be reinforcing at a given drive, we must suppose that conditioning takes place, even though we have paid no attention to the behavior of the organism in making the presentation.”

– B.F. Skinner, "'Superstition' in the Pigeon" (p. 168)

In the 20th century, many of the images that came to mind when thinking about experimental psychology were tied to the work of Burrhus Frederic Skinner. The stereotype of a bespectacled experimenter in a white lab coat, engaged in shaping behavior through the operant conditioning of lab rats or pigeons in contraptions known as Skinner boxes, comes directly from Skinner's immeasurably influential research.

Although he originally intended to make a career as a writer, Skinner received his Ph.D. in psychology from Harvard in 1931, and stayed on as a researcher until 1936, when he departed to take academic posts at the University of Minnesota and Indiana University.  He returned to Harvard in 1948 as a professor, and was the Edgar Pierce Professor of Psychology from 1958 until he retired in 1974. 

Skinner was influenced by John B. Watson’s philosophy of psychology called behaviorism, which rejected not just the introspective method and the elaborate psychoanalytic theories of Freud and Jung, but any psychological explanation based on mental states or internal representations such as beliefs, desires, memories, and plans. The very idea of “mind” was dismissed as a pre-scientific superstition, not amenable to empirical investigation. Skinner argued that the goal of a science of psychology was to predict and control an organism’s behavior from its current stimulus situation and its history of reinforcement. In a utopian novel called Walden Two and a 1971 bestseller called Beyond Freedom and Dignity, he argued that human behavior was always controlled by its environment. According to Skinner, the future of humanity depended on abandoning the concepts of individual freedom and dignity and engineering the human environment so that behavior was controlled systematically and to desirable ends rather than haphazardly.

 In the laboratory, Skinner refined the concept of operant conditioning and the Law of Effect. Among his contributions were a systematic exploration of intermittent schedules of reinforcement, the shaping of novel behavior through successive approximations, the chaining of complex behavioral sequences via secondary (learned) reinforcers, and “superstitious” (accidentally reinforced) behavior.

Skinner was also an inveterate inventor. Among his gadgets were the “Skinner box” for shaping and counting lever-pressing in rats and key-pecking in pigeons; the cumulative recorder, a mechanism for recording rates of behavior as a pen tracing; a World-War II-era missile guidance system (never deployed) in which a trained pigeon in the missile’s transparent nose cone continually pecked at the target; and “teaching machines” for “programmed learning,” in which students were presented a sentence at a time and then filled in the blank in a similar sentence, shown in a small window. He achieved notoriety for a mid-1950s Life magazine article showcasing his “air crib,” a temperature-controlled glass box in which his infant daughter would play. This led to the urban legend, occasionally heard to this day, that Skinner “experimented on his daughter” or “raised her in a box” and that she grew up embittered and maladjusted, all of which are false.

B.F. Skinner was ranked by the American Psychological Association as the 20th century’s most eminent psychologist.

B. F. Skinner. (1998).  Public Broadcasting Service.  Retrieved December 12, 2007, from:  http://www.pbs.org/wgbh/aso/databank/entries/bhskin.html

Eminent psychologists of the 20th century.  (July/August, 2002). Monitor on Psychology, 33(7), p.29.

Skinner, B. F. (1948). 'Superstition' in the pigeon. Journal of Experimental Psychology, 38, 168-172.

Skinner, B. F. (1959) Cumulative record. New York: Appleton Century Crofts.

Bjork, D. W. (1991). Burrhus Frederick Skinner: The contingencies of a life.  In: Kimble, G. A. & Wertheimer, M. [Eds.]  Portraits of Pioneers in Psychology. 


B. F. Skinner: Biography and Theories


B. F. Skinner was born on March 20, 1904. He went on to become an influential psychologist who first described the learning process known as operant conditioning . Skinner played a pivotal role in behaviorism , a school of thought that suggested that all behavior was learned through conditioning processes. 

Skinner referred to himself as a radical behaviorist because he believed that psychology should focus only on the study of observable, overt behavior . 

In a survey of psychologists, B. F. Skinner was ranked the most influential psychologist of the 20th century.

This article takes a closer look at his life, his work, and the powerful impact he had on psychology and our understanding of how people learn.


B. F. Skinner’s Life

Born Burrhus Frederic Skinner, he grew up in a small, rural Pennsylvania town as one of two children. He enjoyed building things as a boy and started to develop a strong interest in science when he was in high school.

Despite this interest, he went on to earn a degree in English literature, graduating from Hamilton College in 1926. He originally set out to become a novelist, but soon grew disillusioned with his prospects as a writer.

After discovering the works of Ivan Pavlov and John B. Watson, two thinkers important in the discovery and advancement of behaviorism, Skinner enrolled at Harvard University to study psychology. He worked at the university after graduating with his Ph.D. in 1931. It was during this period that he began working to develop his theory of operant conditioning. 

B. F. Skinner’s Research

In order to study human behavior in a systematic, scientific manner, Skinner put his building and inventing skills to work to create tools to conduct his research. His creations included:

  • The Skinner box: This was a chamber in which an animal subject could press a key in order to receive some type of reward.
  • The cumulative recorder: This tool recorded the animal’s response rate on a piece of paper as a sloping line. 

As Skinner used these devices in his research, he observed something interesting. Where classical conditioning insisted that a response depended on the stimulus that came before it, Skinner realized that behavior could also hinge on the events that come after. 

In other words, the consequences of the behavior affect how well and how quickly that behavior is learned.

After leaving Harvard to take a position at the University of Minnesota, Skinner began working on a project to support the war effort. His goal was to teach pigeons to guide missiles. The development of missile radar meant that Skinner's project fell by the wayside, but the work contributed to his continued investigation into operant conditioning.

During this time, Skinner applied his skills to developing a crib/playpen of sorts that he dubbed “the baby tender.” Meant to serve as a safer alternative to the available cribs of the time, it became the subject of psychology lore when an urban legend suggested it was used in experiments involving Skinner’s own children. 

Skinner’s Operant Conditioning

Skinner’s theory of operant conditioning was based on Edward Thorndike’s law of effect .

The law of effect states that behaviors followed by desirable or enjoyable consequences are more likely to occur again, while behaviors followed by negative consequences are less likely to occur again. 

Based on his experiments with animals, Skinner described two important concepts that can influence learning and behavior:

Reinforcement

Reinforcement is anything that increases or strengthens a behavior. Reinforcements can either involve the addition of something (known as a positive reinforcer) or the removal of something (known as a negative reinforcer).

For example, giving a child a cookie for cleaning their room is an example of positive reinforcement . Canceling a test if a student does all of their homework on time is an example of negative reinforcement .

Punishment

Punishment is anything that decreases or weakens a behavior. Punishments can involve adding something (positive punishment) or removing something (negative punishment).

For example, assigning a child chores for not doing their homework is an example of positive punishment. Taking away their iPad for hitting their sibling is an example of negative punishment.
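The examples above fill out a 2x2 grid: a consequence either adds or removes a stimulus, and either increases or decreases the target behavior. A small lookup (the function name is hypothetical, purely for illustration) makes the grid explicit:

```python
def classify_consequence(stimulus_change, behavior_effect):
    """Classify an operant consequence.

    stimulus_change: "added" or "removed" (what happens to the stimulus)
    behavior_effect: "increases" or "decreases" (effect on the behavior)
    """
    table = {
        # Cookie for cleaning the room:
        ("added", "increases"): "positive reinforcement",
        # Test canceled when homework is done on time:
        ("removed", "increases"): "negative reinforcement",
        # Chores assigned for not doing homework:
        ("added", "decreases"): "positive punishment",
        # iPad taken away for hitting a sibling:
        ("removed", "decreases"): "negative punishment",
    }
    return table[(stimulus_change, behavior_effect)]
```

The key distinction the grid encodes: "positive"/"negative" refer to adding or removing a stimulus, not to whether the consequence is pleasant, while reinforcement versus punishment depends solely on whether the behavior becomes more or less likely.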

Reinforcement Schedules

Skinner also discovered that the timing and frequency of reinforcements played a role in how behaviors are elicited. For example, some schedules of reinforcement involve delivering the reinforcement every time a behavior occurs. In other cases, it involves delivering the reinforcement after a set period of time or after a number of responses occur.

What Impact Did B. F. Skinner Have?

Skinner’s work made him one of the most influential figures in the field of psychology. His ideas had a powerful impact within psychology as well as in other fields including therapy and education. 

He was diagnosed with leukemia in 1989. In 1990, the American Psychological Association (APA) presented him with its lifetime achievement award. He passed away on August 18, 1990, eight days after accepting the award.

During his storied career in psychology, he published more than 20 books and close to 200 articles. For much of the 20th century, he was one of the figures most associated with the field of psychology, and his work continues to have an impact today.

Behaviorism is no longer the force it once was, but mental health professionals, therapists, parents, and many others continue to use the principles of operant conditioning to teach and change behaviors in order to help people learn, adapt, and grow. 

Summary 

B. F. Skinner was an advocate for behaviorism and believed that psychology should be the science of observable behavior . His work contributed to our understanding of operant conditioning and how reinforcement and punishment can be used to teach and modify behaviors.





CogniFit Blog: Brain Health News


Operant Conditioning – 4 Interesting Experiments by B.F. Skinner


Operant conditioning might sound like something out of a dystopian novel. But it’s not. It’s a very real thing that was forged by a brilliant, yet quirky, psychologist. Today, we will take a quick look at his work, as well as a few odd experiments that went with it…

There are few names in psychology more well-known than B. F. Skinner. First-year psychology students scribble endless lecture notes on him. Doctoral candidates cite his work in their dissertations as they test whether a rat’s behavior can be used to predict behavior in humans.

Skinner is one of the most well-known psychologists of our time, famous for his experiments on operant conditioning. But how did he become such a central figure in these Intro to Psych courses? And how did he develop the theories and methodologies cited by those sleep-deprived Ph.D. students?

THE FATHER OF OPERANT CONDITIONING

Skinner spent his life studying the way we behave and act. But, more importantly, how this behavior can be modified.

He viewed Ivan Pavlov’s classical model of behavioral conditioning as being “too simplistic a solution” to fully explain the complexities of human (and animal) behavior and learning. Because of this, Skinner started to look for a better way to explain why we do things.

His early work was based on Edward Thorndike’s 1898 Law of Effect . Skinner went on to expand on the idea that most of our behavior is directly related to the consequences of said behavior. His expanded model of behavioral learning would be called operant conditioning. This centered around two things…

  • The concepts of behaviors – the actions an organism or test subject exhibits
  • The operants – the environmental response/consequences directly following the behavior

But, it’s important to note that the term “consequences” can be misleading. This is because there doesn’t need to be a causal relationship between the behavior and the operant. Skinner broke these responses down into three parts.

1. REINFORCERS – These present the organism with a desirable stimulus (or remove an undesirable one) and serve to increase the frequency of the behavior.

2. PUNISHERS – These are environmental responses that present an undesirable stimulus and serve to reduce the frequency of the behavior.

3. NEUTRAL OPERANTS – As the name suggests, these present stimuli that neither increase nor decrease the tested behavior.
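
The three response types boil down to their effect on how often the behavior recurs. Here is a minimal sketch under a probability-nudging update rule of my own invention; the function name and the 0.1 step size are illustrative assumptions, not a quantitative model from Skinner's work.

```python
# Hypothetical update rule: each operant nudges the probability that the
# behavior is repeated. The step size of 0.1 is an arbitrary illustration.

def apply_operant(p_behavior, operant, step=0.1):
    """Return the updated probability that the behavior recurs."""
    if operant == "reinforcer":    # desirable stimulus -> more frequent
        return min(1.0, p_behavior + step)
    if operant == "punisher":      # undesirable stimulus -> less frequent
        return max(0.0, p_behavior - step)
    if operant == "neutral":       # no effect on frequency
        return p_behavior
    raise ValueError(f"unknown operant: {operant!r}")

p = 0.5
p = apply_operant(p, "reinforcer")  # behavior becomes more likely
p = apply_operant(p, "neutral")     # unchanged
```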

Throughout his long and storied career, Skinner performed a number of strange experiments trying to test the limits of how punishment and reinforcement affect behavior.

4 INTERESTING OPERANT EXPERIMENTS

Though Skinner was a professional through and through, he was also quite a quirky person. And, his unique ways of thinking are very clear in the strange and interesting experiments he performed while researching the properties of operant conditioning.

Experiment #1: The Operant Conditioning Chamber

The Operant Conditioning Chamber, better known as the Skinner Box , is a device that B.F. Skinner used in many of his experiments. At its most basic, the Skinner Box is a chamber where a test subject, such as a rat or a pigeon, must ‘learn’ the desired behavior through trial and error.

B.F. Skinner used this device for several different experiments. One such experiment involves placing a hungry rat into a chamber with a lever and a slot where food is dispensed when the lever is pressed. Another variation involves placing a rat into an enclosure that is wired with a slight electric current on the floor. When the current is turned on, the rat must turn a wheel in order to turn off the current.  

Though this is the most basic experiment in operant conditioning research, there is an infinite number of variations that can be created based on this simple idea.

Experiment #2: A Pigeon That Can Read

Building on the basic ideas from his work with the Operant Conditioning Chamber, B. F. Skinner eventually began designing more and more complex experiments.

One of these experiments involved teaching a pigeon to read words presented to it in order to receive food. Skinner began by teaching the pigeon a simple task, namely, pecking a colored disk, in order to receive a reward. He then began adding additional environmental cues (in this case, they were words), which were paired with a specific behavior that was required in order to receive the reward.

Through this evolving process, Skinner was able to teach the pigeon to ‘read’ and respond to several unique commands.

Though the pigeon can’t actually read English, Skinner was able to teach a bird multiple behaviors, each one linked to a specific stimulus, using operant conditioning. This shows that this form of behavioral learning can be a powerful tool for teaching both animals and humans complex behaviors based on environmental cues.

Experiment #3: Pigeon Ping-Pong

But Skinner wasn’t only concerned with teaching pigeons how to read. It seems he also made sure they had time to play games as well. In one of his more whimsical experiments , B. F. Skinner taught a pair of common pigeons how to play a simplified version of table tennis.

The pigeons in this experiment were placed on either side of a box and were taught to peck the ball to the other bird’s side. If a pigeon was able to peck the ball across the table and past their opponent, they were rewarded with a small amount of food. This reward served to reinforce the behavior of pecking the ball past their opponent.

Though this may seem like a silly task to teach a bird, the ping-pong experiment shows that operant conditioning can be used not only for a specific, robot-like action but also to teach dynamic, goal-based behaviors.

Experiment #4: Pigeon-Guided Missiles

Thought pigeons playing ping-pong was as strange as things could get? Skinner pushed the envelope even further with his work on pigeon-guided missiles.

While this may sound like the crazy experiment of a deluded mad scientist, B. F. Skinner did actually work on training pigeons to control the flight paths of missiles for the U.S. military during World War II.

Skinner began by training the pigeons to peck at shapes on a screen. Once the pigeons reliably tracked these shapes, Skinner was able to use sensors to track whether the pigeon’s beak was in the center of the screen, to one side or the other, or towards the top or bottom of the screen. Based on the relative location of the pigeon’s beak, the tracking system could direct the missile towards the target location.
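
The guidance loop described above amounts to reading the beak position as an error signal and steering to re-center the target. Here is a minimal sketch under that reading; the function name, coordinate convention, and gain are illustrative assumptions (the historical device was analog hardware, not software).

```python
# Hypothetical proportional-control sketch: the offset of the pigeon's
# beak from the screen center becomes a (yaw, pitch) steering correction.

def steering_correction(beak_x, beak_y, gain=0.5):
    """Map beak offset from screen center (0, 0) to steering corrections."""
    return (gain * beak_x, gain * beak_y)

# Beak pecking right of and above center -> steer right and up.
yaw, pitch = steering_correction(0.4, 0.2)
```

A beak held exactly at the center produces no correction, so the missile holds its course; any drift of the target image makes the pigeon peck off-center, generating a correction proportional to the drift.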

Though the system was never used in the field due in part to advances in other scientific areas, it highlights the unique applications that can be created using operant training for animal behaviors.

THE CONTINUED IMPACT OF OPERANT CONDITIONING

B. F. Skinner is one of the most recognizable names in modern psychology, and with good reason. Though many of his experiments seem outlandish, the science behind them continues to impact us in ways we rarely think about.

The most prominent example is in the way we train animals for tasks such as search and rescue, companion services for the blind and disabled, and even how we train our furry friends at home—but the benefits of his research go far beyond teaching Fido how to roll over.

Operant conditioning research has found its way into the way schools motivate and discipline students, how prisons rehabilitate inmates, and even in how governments handle geopolitical relationships .

B.F. Skinner


(1904-1990)

Who Was B.F. Skinner?

Psychologist B.F. Skinner began working on ideas of human behavior after earning his doctorate from Harvard. Skinner's works include The Behavior of Organisms (1938) and a novel based on his theories, Walden Two (1948). He explored behaviorism in relation to society in later books, including Beyond Freedom and Dignity (1971).

Burrhus Frederic Skinner was born on March 20, 1904, in the small town of Susquehanna, Pennsylvania, where he also grew up. His father was a lawyer and his mother stayed home to care for Skinner and his younger brother. At an early age, Skinner showed an interest in building different gadgets and contraptions.

As a student at Hamilton College, B.F. Skinner developed a passion for writing. He tried to become a professional writer after graduating in 1926, but with little success. Two years later, Skinner decided to pursue a new direction for his life. He enrolled at Harvard University to study psychology.

The Skinner Box

At Harvard, B.F. Skinner looked for a more objective and measured way to study behavior. He developed what he called an operant conditioning apparatus to do this, which became better known as the Skinner box. With this device, Skinner could study an animal interacting with its environment. He first studied rats in his experiments, seeing how the rodents discovered and learned to use a lever in the box, which dispensed food at varying intervals.

Later, Skinner examined what behavior patterns developed in pigeons using the box. The pigeons pecked at a disc to gain access to food. From these studies, Skinner came to the conclusion that some form of reinforcement was crucial in learning new behaviors.

After finishing his doctorate and working as a researcher at Harvard, Skinner published the results of his operant conditioning experiments in The Behavior of Organisms (1938). His work drew comparisons to Ivan Pavlov, but Skinner's work involved learned responses to an environment rather than involuntary responses to stimuli.

While teaching at the University of Minnesota, Skinner tried to train pigeons to serve as guides for bombing runs during World War II. This project was canceled, but he was able to teach them how to play ping pong. Skinner turned to a more domestic endeavor during the war. In 1943, he built a new type of crib for his second daughter Deborah at his wife's request. The couple already had a daughter named Julie. This clear box, called the "baby tender," was heated so that the baby didn't need blankets. There were no slats in the sides either, which also prevented possible injury.

In 1945, Skinner became the chair of the psychology department at Indiana University. But he left two years later to return to Harvard as a lecturer. Skinner received a professorship there in 1948 and remained at Harvard for the rest of his career. As his children grew, he became interested in education. Skinner developed a teaching machine to study learning in children. He later wrote The Technology of Teaching (1968).

Skinner presented a fictional interpretation of some of his views in the 1948 novel Walden Two , which proposed a type of utopian society. The people in the society were led to be good citizens through behavior modification—a system of rewards and punishments. The novel seemed to undermine Skinner's credibility with some of his academic colleagues. Others questioned his focus on scientific approaches to the exclusion of less tangible aspects of human existence.

In the late 1960s and early '70s, Skinner wrote several works applying his behavioral theories to society, including Beyond Freedom and Dignity (1971). He drew fire for seemingly implying that humans had no free will or individual consciousness. Noam Chomsky was among Skinner's critics. In 1974, Skinner tried to set the record straight regarding any misinterpretations of his work with About Behaviorism .

Final Years and Death

In his later years, Skinner took to chronicling his life and research in a series of autobiographies. He also continued to be active in the field of behavioral psychology — a field he helped popularize. In 1989, Skinner was diagnosed with leukemia. He succumbed to the disease the following year, dying at his home in Cambridge, Massachusetts, on August 18, 1990.

Skinner's identification of the importance of reinforcement remains a critical discovery. He believed that positive reinforcement was a great tool for shaping behavior, an idea still valued in numerous settings including schools today. Skinner's beliefs are still being promoted by the B.F. Skinner Foundation, which is headed by his daughter, Julie S. Vargas.

QUICK FACTS

  • Name: B. F. Skinner
  • Birth Year: 1904
  • Birth date: March 20, 1904
  • Birth State: Pennsylvania
  • Birth City: Susquehanna
  • Birth Country: United States
  • Gender: Male
  • Best Known For: American psychologist B.F. Skinner is best known for developing the theory of behaviorism, and for his utopian novel 'Walden Two.'
  • Education and Academia
  • Science and Medicine
  • Writing and Publishing
  • Astrological Sign: Pisces
  • Hamilton College
  • Harvard University
  • Death Year: 1990
  • Death date: August 18, 1990
  • Death State: Massachusetts
  • Death City: Cambridge
  • Death Country: United States



Encyclopedia Britannica

B.F. Skinner

B.F. Skinner (born March 20, 1904, Susquehanna, Pennsylvania, U.S.—died August 18, 1990, Cambridge, Massachusetts) was an American psychologist and an influential exponent of behaviourism, which views human behaviour in terms of responses to environmental stimuli and favours the controlled, scientific study of responses as the most direct means of elucidating human nature.

Skinner was attracted to psychology through the work of the Russian physiologist Ivan Petrovich Pavlov on conditioned reflexes, articles on behaviourism by Bertrand Russell, and the ideas of John B. Watson, the founder of behaviourism. After receiving his Ph.D. from Harvard University (1931), he remained there as a researcher until 1936, when he joined the faculty of the University of Minnesota, Minneapolis, where he wrote The Behavior of Organisms (1938).

As professor of psychology at Indiana University, Bloomington (1945–48), Skinner gained some measure of public attention through his invention of the Air Crib baby tender—a large, soundproof, germ-free, mechanical, air-conditioned box designed to provide an optimal environment for child growth during the first two years of life. In 1948 he published one of his most controversial works, Walden Two, a novel on life in a utopian community modeled on his own principles of social engineering.


As a professor of psychology at Harvard University from 1948 (emeritus 1974), Skinner influenced a generation of psychologists. Using various kinds of experimental equipment that he devised, he trained laboratory animals to perform complex and sometimes quite exceptional actions. A striking example was his pigeons that learned to play table tennis. One of his best-known inventions, the Skinner box, has been adopted in pharmaceutical research for observing how drugs may modify animal behaviour.

His experiences in the step-by-step training of research animals led Skinner to formulate the principles of programmed learning, which he envisioned to be accomplished through the use of so-called teaching machines. Central to his approach is the concept of reinforcement, or reward. The student, learning by use of the machine at his own pace, is rewarded for responding correctly to questions about the material he is trying to master. Learning is thereby presumably reinforced and facilitated.

In addition to his widely read Science and Human Behavior (1953), Skinner wrote many other books, including Verbal Behavior (1957), The Analysis of Behavior (with J.G. Holland, 1961), and Technology of Teaching (1968). Another work that generated considerable controversy, Beyond Freedom and Dignity (1971), argued that concepts of freedom and dignity may lead to self-destruction and advanced the cause of a technology of behaviour comparable to that of the physical and biological sciences. Skinner published an autobiography in three parts: Particulars of My Life (1976), The Shaping of a Behaviorist (1979), and A Matter of Consequences (1983). The year before his death, Recent Issues in the Analysis of Behavior (1989) was published.

Famous Scientists

B. F. Skinner


B.F. Skinner was the 20th century’s most influential psychologist, pioneering the science of behaviorism. Inventor of the Skinner Box, he discovered the power of positive reinforcement in learning, and he designed the first psychological experiments to give quantitatively repeatable and predictable results.

Skinner wrote a bestselling novel for the general public exploring the effects of utilizing his behaviorism principles in a community of people.

He courted controversy by denying the existence of free will, claiming that humans act according to rules programmed by a combination of our genes and the circumstances of our environment.

Burrhus Frederic Skinner was born in the small railroad town of Susquehanna, Pennsylvania, USA on March 20, 1904. He was always known to his family as Fred. His mother nearly died in childbirth.

Fred’s father, William Arthur Skinner, was an attorney. His mother, Grace Madge Burrhus, was a typist who gave up her job when she married; she was also an accomplished pianist and singer. Both parents graduated second in their classes from high school. Fred had one younger brother, Edward.

Country Boys

Fred and Edward enjoyed a rural upbringing. The two brothers harvested grapes, apples, and plums from their home’s overgrown garden. In the forest they collected berries and nuts; they set traps for mice and chipmunks, which they released, and snakes, which they killed. They gathered cocoons so they could watch moths and butterflies emerge later in the season. They watched the blacksmith and carpenter doing their work and watched trains passing through town.

Burning in the Fires of Hell

Fred’s parents were not very religious – they often skipped their Presbyterian church on Sundays.


Later, his father confirmed to him that bad boys ended up in the fires of hell. For years Fred would lie awake at night sobbing over a lie he had once told, unable to reveal even to his mother what was upsetting him so much.

Fred was not baptized. His father believed children should only be baptized when they understood what was happening, and for Fred that didn’t happen.

Although this confirmation that hell was real unintentionally traumatized Fred, his father never physically punished his sons. This was unusual for the era. Later in life, Fred’s behaviorist psychology would focus on rewards rather than punishments.

Toys and Books

Fred grew up with toys that stimulated his mind. He bolted together devices from Meccano, and he experimented with a working miniature steam engine. When he was a little older, he carried out his own chemistry and electrical experiments. He designed but did not build a perpetual motion machine. He built a steam cannon.

Fred spent a lot of time working on mathematical and word problems in a big puzzle book. His favorite books were Daniel Defoe’s Robinson Crusoe and Jules Verne’s Mysterious Island – he liked the idea of self-sufficiency. He wrote poems and short stories on his father’s typewriter.

As part of their education, their father took Fred and his brother to see factories and find out how things were produced.

Beating Teacher

His 12 years of schooling all took place in one small building. In eighth grade, he had trouble with a teacher. Fred argued with her in class, telling her that what she was saying was factually incorrect. His teacher complained to the principal, and the principal asked her about the specific arguments she was having with Fred. The principal realized Fred was correct, so Fred was not punished.

Francis Bacon

In eighth grade, Fred became a devotee of the works of William Shakespeare and also bought into the conspiracy theory that the true author of Shakespeare’s works was Francis Bacon . He read further works by Bacon, including his famous Novum Organum describing the Baconian Method of doing science.

Graduating from High School

Fred enjoyed his time at Susquehanna High School, where mathematics was his favorite subject. He graduated second in his class, just like his parents at the very same school.

The school’s principal, ‘Professor’ Bowles, liked Fred but grew worried near the end of Fred’s schooling that Fred had become an agnostic or even an atheist. He tried to guide Fred by giving him a book entitled God or Gorilla , which attempted to refute the theory of evolution; Fred was not guided by the book.

Hamilton College & Tragedy


Fred Skinner. Hamilton College freshman.

In the Easter vacation, Fred’s younger brother Edward fell violently ill. Fred called the doctor, who told him to go fetch his parents from church. By the time they returned, Edward had died. The cause was determined to be ‘acute indigestion.’ Later in life, Fred reasoned that the true cause had been a massive cerebral hemorrhage. Edward’s death at age 16 left the family, including Fred, in shock for a long time.

At the end of his freshman year, Fred wrote an essay about the year. He had not enjoyed it. Too many of his fellow students looked down on scholarly students like Fred, who wanted to learn about literature rather than engage in athletics. This had drained his enthusiasm. His disdain for his fellow students was matched by theirs for him: he was thought of as a somewhat vain, standoffish, highbrow character.

Fred enjoyed his sophomore year and by junior year had decided to major in English Language and Literature. He graduated with a Bachelor of Arts in 1926, intent on becoming a novelist.

Failed Novelist Turns to Psychology

By the end of summer 1926, Fred realized he could not write a novel and began considering an alternative future. He realized novel-writing was attractive to him because it involved describing and analyzing human behavior. In November 1927, he decided to abandon literature and study psychology. He became a graduate student at Harvard University’s psychology department in the fall of 1928, age 24. Three years later he graduated with a Ph.D. in psychology.

B.F. Skinner’s Life in Science

Skinner carried out research at Harvard until 1936 then lectured at the University of Minnesota. He became chair of Indiana University’s psychology department in 1946. In 1948, he returned to Harvard as a professor and spent the rest of his career there.

The Skinner Box

The many hours Skinner spent as a boy building Meccano devices, steam cannons, and the like paid dividends in his graduate student work at Harvard. His ability to improvise and design innovative new gadgets allowed him to approach the study of animal behavior in unique ways.

His personal behavior was highly regulated. He rose every day at six, studied, ate breakfast, attended classes, worked in the laboratory, studied until nine, then went to bed. He never left more than 15 minutes of free time in his daily schedule.

He demanded regularity from his experiments too. He criticized existing published research for its lack of repeatability. Through trial, error, happy accident, and his invention of gadgets he produced the Skinner Box. For the first time ever, this apparatus produced measurable, repeatable, quantitatively predictable behavior from rats.


A basic Skinner Box. The rat operates a lever to dispense food. The operation of the lever and the amount of food dispensed are recorded.

Since its invention, the Skinner Box has allowed experiments to be carried out that have greatly increased our understanding of animal behavior.

On the day he celebrated his 26th birthday, he wrote his parents excitedly about the ‘Skinner Box’ breakthrough: “the greatest birthday present I got.”

Enormous Progress


Using his newly invented box, Skinner made probably his greatest discoveries. Until then, behavior scientists had studied rats using mazes. Skinner’s results were better controlled, producing repeatable, predictable results.

He showed that hungry rats, all equally deprived of food for 24 hours, pressed a lever that released pellets of food at the same rate. A recorder that Skinner designed and built himself showed that the rats all ingested food at the same rate too.

He could now quantifiably predict the behavior of rats that had not eaten for 24 hours. Nobody before had ever been able to quantitatively predict as clearly how an animal would behave.

The Greatest Discovery

Skinner’s greatest discovery was of “immediate reinforcement” or “instant conditioning.”

Rats experienced a positive consequence when they pushed the lever in the Skinner Box – they got food. This produced a new behavior pattern in rats – pressing a lever, which emerged accidentally as a result of the rats’ own qualities combined with positive reinforcement.

Skinner only liked to use reinforcement techniques. He did not advocate punishment for people or other animals, believing it produced avoidance behavior that could have worse consequences than the behavior being punished.

Today, reinforcement techniques are used in human education and animal training.

Clicker Training


Reinforcement and punishment are the basis of ‘operant conditioning’ – conditioning takes place when the strength of a particular behavior is modified by the consequences of that behavior. Skinner is regarded as the founder of scientific operant conditioning. In fact, practical aspects of operant conditioning are likely to have been in use for almost as long as humans have existed.

Skinner never used the term ‘Skinner Box’ himself. He disapproved of it, preferring ‘lever box’ or ‘operant conditioning chamber.’ He described his discoveries in this field in his 1938 book The Behavior of Organisms.

The Pigeon-Guided Missile

Skinner moved away from working with rats to pigeons, which are longer-lived.


Skinner demonstrated that trained pigeons housed in compartments within a missile could guide it on to a target.

Lenses projected an image of the target on a screen which the pigeons pecked, keeping the missile on track to hit the target. Although it might sound bizarre, Skinner’s invention actually worked. In the end, however, the military decided to utilize radar and other technologies to guide their missiles rather than Skinner’s kamikaze pigeons.

How to Educate People

Skinner criticized the education system for relying too heavily on punishments and not enough on positive reinforcement. He gave the example of a sixth grade teacher whose pupils misbehaved in class, didn’t work in class, and didn’t do their homework.

The teacher read about positive reinforcement and decided to show the class a prize one of them would get at the end of the week. The winning pupil’s name would be drawn in a lottery. Every time a pupil completed their homework or completed their class work satisfactorily, the pupil was allowed to write his or her name on a slip of paper to go into the name pool, increasing his or her chances of winning.

The children began listening to their teacher more attentively, because what she was teaching them helped them do their work. They also completed their classwork and homework. Life for that teacher became much easier. She could give more relaxed lessons and her pupils learned more. Skinner said that with more training in behaviorism, the teacher could have designed a more subtle system, but even the crude system showed how effective behaviorism could be in practice.

Some practitioners of positive reinforcement found that after children changed their behaviors and started learning in the classroom, they enjoyed it so much that learning new things became the reward – no other inducements were needed.

One Step at a Time

Skinner taught a rat to spend money in the form of glass marbles. The rat:

  • pulled a chain to get a marble
  • picked up the marble
  • carried the marble to a tube
  • dropped the marble in the tube
  • was rewarded with food

The rat learned the process in small steps, with rewards providing positive reinforcement when each step was completed correctly.

Skinner applied the same principles to human learning. He showed how people with a handicap could learn to walk by breaking the process into small steps with positive reinforcement. He produced a machine that taught students with learning difficulties to read using a process of successive approximation. Positive reinforcement when the students got steps right helped them make fast progress. Skinner’s mechanical teaching machines never became popular, although computer software used in teaching today makes use of Skinner’s behaviorism.

Child Cruelty and Other Myths

The behaviorist A. Charles Catania wrote in 1984:

Of all contemporary psychologists, B.F. Skinner is perhaps the most honored and the most maligned, the most widely recognized and the most misrepresented, the most cited and the most misunderstood.

Skinner’s wife Yvonne, with daughter Deborah.

The truth was that Skinner, who loved inventing gadgets, had designed a warm, enclosed, temperature-controlled bed/playpen for his daughter to sleep and play in without the need for clothes or blankets – it may not have been how everyone would raise a baby, but it wasn’t what the rumor-mongers suggested.

In the 1940s, Skinner revived his original plan to become a novelist, publishing Walden Two in 1948, the year he returned to Harvard. Eventually, over a million copies of Walden Two sold.

In Walden Two Skinner explores a fictional community, called Walden Two, in which his behavioral techniques are employed, creating (in Skinner’s opinion) a better society.

The characters who inhabit Walden Two are portrayed as happier than those outside the community. In the community there is no wealth, no poverty, and no crime. Consumption is minimized rather than encouraged. Lower consumption means members of the society need to work only four hours a day, spending the rest of their time enjoying a wide range of recreational activities.

The idea of free will is rejected as having no scientific basis. In place of democracy, the community is governed by a group of self-appointed planners who govern using scientific behavioral principles. Children are reared in accordance with these behavioral principles by the community rather than within their own families.

There are no punishments, there is re-education using positive reinforcement. Skinner believed Walden Two’s community was an admirable way to deal with the challenges faced by American society in the 1940s and he continued to believe this in later life. Critics accused him of showing totalitarian tendencies.

Beyond Freedom and Dignity

In an attempt to bring his ideas to the general public, Skinner wrote Beyond Freedom and Dignity, published in 1971. Again he denied the existence of free will, maintaining that our behavior is wholly controlled by our genes, our environment, and social interactions.

Skinner said humans could behave more altruistically if scientific methods were utilized to modify our behavior.

He claimed that threats to human existence, such as nuclear war, ecological destruction, and overpopulation could be dealt with effectively if we abandoned the misguided concept of free will and embraced social engineering based on his behaviorism. The book was a controversial best seller. Time Magazine featured Skinner on its front cover with the headline:

“B.F. Skinner says: We Can’t Afford Freedom.”

Critics again accused him of showing totalitarian tendencies.

The Human Mind and Individual Consciousness

Skinner provoked the ire of the psychiatric profession by saying it is futile to try to examine inner feelings. Everything, he said, could be explained through behaviors. Consciousness itself is just a social product – our very thoughts are programmed by our genetic history and our environment. What you might think of as your ‘self’ as an individual, unique human being is actually a menu of programmed behaviors appropriate to the situation you find yourself in.

He argued that changing the ‘inner self’ requires environmental and behavioral changes, rather than looking inwards to our minds.

The Greatest 20th Century Psychologist

In 2002, the academic journal Review of General Psychology assessed the impact of individual psychologists in the 20th century based on journal citations, introductory psychology textbook citations, and responses to a survey. Skinner ranked first, followed by Jean Piaget and Sigmund Freud.

Some Personal Details and the End

In 1936, just before beginning work at the University of Minnesota, Skinner married Yvonne Blue. They had two daughters: Julie and Deborah.

When not working, Skinner relaxed by playing piano, painting, and reading books – he particularly enjoyed mystery stories.

He officially retired from his professorship at Harvard in 1974, age 70, but continued working until his death.

B.F. Skinner died age 86 on August 18, 1990 in Cambridge, Massachusetts from complications of leukemia. He was survived by his wife and daughters. He was buried in Mount Auburn Cemetery, Cambridge, Massachusetts. His wife Yvonne was buried alongside him after her death in 1997.

Author of this page: The Doc. Images digitally enhanced and colorized by this website. © All rights reserved. Published by FamousScientists.org.

Further Reading

B.F. Skinner, The Behavior of Organisms, Appleton-Century-Crofts, 1938

B.F. Skinner, Walden Two, Macmillan Co., 1948

B.F. Skinner, Beyond Freedom and Dignity, Knopf, 1971

B.F. Skinner, Particulars of My Life, Knopf, 1976

Daniel W. Bjork, B.F. Skinner: A Life, BasicBooks, 1993


Learning Theories

Operant Conditioning

Operant conditioning, sometimes also known as Skinnerian conditioning or radical behaviorism, is a behaviorist learning approach similar to classical conditioning, influenced mainly by the early theoretical and experimental work of the American psychologist Burrhus Frederic Skinner, beginning in the 1930s. The main difference between the two theories is that classical conditioning modifies only reflex reactions, while operant conditioning shapes new behavior.

Skinner's box (image from WikiMedia.org).

The most famous apparatus for studying operant learning is the Skinner box, also known as the operant conditioning chamber. In one such experiment Skinner demonstrated the principles of operant conditioning and behavior shaping in a rat, using food as reinforcement. A starved rat was put in a box in which pressing a small lever would release some food. The rat soon learned that pressing the lever would get it some food.

In another experiment, two lights (red and green) were introduced into the box and the rat would only get the food if one of them was on. The rat soon learned to discriminate between the lights, and stopped or reduced pressing the lever when the “wrong” light was on.

Unlike Pavlovian conditioning, where an existing behavior (salivating for food) is shaped by associating it with a new stimulus (the sound of a bell), operant conditioning rewards an act that approaches a new desired behavior. It can also work in the opposite direction: punishing undesirable behavior. (Punishment should not be confused with negative reinforcement, which strengthens a behavior by removing an aversive stimulus.) 1)

After once accidentally running short of rat food, Skinner also began observing the effects of different schedules of reinforcement 2):

  • continuous - reinforcement follows every instance of the behavior,
  • fixed ratio - reinforcement occurs after every X responses,
  • fixed interval - reinforcement follows the first response after a fixed time interval has elapsed, or
  • variable - the number of responses (or the length of the interval) required for reinforcement changes each time.
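
These schedules can be sketched as a simple decision rule. The following is a minimal illustration, not Skinner's apparatus; the particular parameter values (every 5th response, a 10-second interval, a one-in-five chance) are assumptions chosen for the example:

```python
import random

def reinforce(schedule, responses_since_last, seconds_since_last, rng=random):
    """Decide whether the current response earns reinforcement.

    Both counters start from zero at the last reinforcement. The ratio (5)
    and interval (10 s) are illustrative values, not experimental ones.
    """
    if schedule == "continuous":        # every response is reinforced
        return True
    if schedule == "fixed ratio":       # every 5th response
        return responses_since_last >= 5
    if schedule == "fixed interval":    # first response after 10 seconds
        return seconds_since_last >= 10
    if schedule == "variable":          # requirement varies; ~1-in-5 chance here
        return rng.random() < 0.2
    raise ValueError(f"unknown schedule: {schedule}")
```

Under the fixed schedules the outcome is fully predictable from the counters, which is exactly what lets the rats find a "rhythm"; under the variable schedule no response count or clock reading guarantees reinforcement.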

An interesting observation he made was that under a fixed-interval schedule, rats found a “rhythm” in displaying the behavior, which was never the case under variable schedules. Variable schedules also proved, surprisingly, to be very resistant to extinction. Gambling addiction offers another example of this: although reinforcement comes rarely, the gambler can never be sure whether it will come on the next try, and so keeps trying.

Operant conditioning can also be used to shape more complex behaviors, by starting with a behavior roughly similar to the intended one and, after it is learned, slowly shaping it until it becomes exactly what was desired. An example of this is how Skinner and his students managed to teach pigeons to bowl. 3)
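
The shaping loop just described can be sketched in a few lines. This is a toy model of successive approximation, not a reconstruction of the pigeon experiments; the numeric responses, the target, the starting tolerance, and the shrink factor are all hypothetical illustration values:

```python
def shape(responses, target, tolerance, shrink=0.8):
    """Reinforce responses close enough to the target, tightening the
    criterion after each success so only closer approximations keep
    earning reward (successive approximation)."""
    reinforced = []
    for r in responses:
        if abs(r - target) <= tolerance:
            reinforced.append(r)      # this attempt earns reinforcement
            tolerance *= shrink       # demand a closer approximation next time
    return reinforced, tolerance

# Example: responses drifting toward a target value of 10; the fourth
# attempt (2) misses the tightened criterion and goes unreinforced.
hits, final_tolerance = shape([5, 7, 9, 2, 10], target=10, tolerance=5)
```

The key design point is that the criterion moves only after a success, so early, crude attempts are rewarded and the demand ratchets up gradually.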

Skinner incorporated some of these ideas in his book Walden Two, about a utopian society based on behavioral control. He is also remembered for claiming that if his house were on fire, he would rather save his books than his children, since his writings could make greater contributions than his genes. 4)

There are many examples of operant conditioning in everyday use. Completing homework in order to get a reward from a teacher, or finishing projects to receive praise or a promotion from an employer, is a form of operant conditioning 5). In these examples, the increased probability of a behavior is the result of the possibility of reward.

Conversely, operant conditioning can also be used to decrease the probability of a behavior by use of punishment (an aversive stimulus). For example, children in a classroom may be told they will have to sit at the back of the classroom if they talk out of turn 6). The possibility of punishment may decrease the probability of unwanted behaviors.

Criticisms of operant conditioning are similar to criticisms of behaviorism in general. Operant conditioning

  • ignores cognitive processes,
  • assumes learning occurs only through reinforcement, which is not true,
  • and overlooks genetic predispositions and species-specific behavior patterns, which can interfere with conditioning.
  • Keywords: operant conditioning, Skinnerian conditioning, radical behaviorism, Skinner box, operant conditioning chamber, schedules of reinforcement
  • Names: Burrhus Skinner

Boeree, G. Personality theories: B. F. Skinner. Retrieved February 22, 2011.

Blackman, Derek E. Operant conditioning: an experimental analysis of behaviour. Routledge, 1974.

Skinner, Burrhus F. About behaviorism. Vintage Books, 1974.

Skinner, B. F. "Superstition" in the Pigeon. Journal of Experimental Psychology #38, p168-172. 1948.

Peterson, G. B. A day of great illumination: BF Skinner's discovery of shaping. Journal of the Experimental Analysis of Behavior 82, no. 3, p317-328. 2004.

Wolf, M., Risley, T., Johnston, M., Harris, F. and Allen, E. Application of operant conditioning procedures to the behavior problems of an autistic child: a follow-up and extension. Behaviour Research and Therapy 5, no. 2, p103-111. May 1967.

Levene, Howard I., Engel, Bernard T. and Pearson, John A. Differential Operant Conditioning of Heart Rate. Psychosom Med 30, no. 6, p837-845. November 1, 1968.


CC Attribution-Share Alike 4.0 International


B. F. Skinner

Burrhus Frederic Skinner was an American psychologist famous for his work in the area of behavioral psychology.


Burrhus Frederic Skinner was born on March 20th, 1904, in the town of Susquehanna, Pennsylvania. Skinner’s father was a lawyer, while his mother stayed at home to take care of the household. He had a younger brother who died of a cerebral aneurysm at the age of 16. From a young age, Skinner enjoyed working with his hands; in his spare time he would build anything from roller scooters to more complex devices.

B. F. Skinner attended elementary and high school in his home town of Susquehanna. Afterward he attended Hamilton College in New York with the intention of becoming a writer. While pursuing that ambition, he attended the Middlebury School of English in Vermont, where he met Robert Frost and wrote his first book. Soon after, Skinner decided that he didn’t have much to offer as a writer, so he turned his focus to psychology. After receiving his bachelor’s degree, Skinner attended Harvard, where he earned his Ph.D.

Achievements

While a researcher at Harvard, B. F. Skinner invented what he called “the operant conditioning chamber,” popularly known as the Skinner box. The contraption consisted of a chamber into which the subject (usually a rat or a pigeon) was placed, and a lever which would dispense food when activated. During his experimentation with rats, Skinner noted that a subject rat would usually sniff around and explore the chamber. While doing so, the rat would accidentally pull the lever and a drop of food would fall in. Skinner noticed that following the first drop of food, the subject would use the lever increasingly until it was no longer hungry. From this experiment, Skinner concluded that if the consequences of an action were bad, there was a high chance the action would not be repeated; if the consequences were good, the actions that led to them would be reinforced. He called this “the principle of reinforcement,” which is the foundation of behaviorism. Following this experiment, Skinner began to shape his own philosophy of science, called radical behaviorism, and founded his own school of experimental research psychology, the experimental analysis of behavior, which culminated in his work Verbal Behavior. For his contributions to behavioral psychology, B. F. Skinner is considered a pioneer of modern behaviorism and the most influential psychologist of the 20th century.

In 1936, Skinner married Yvonne Blue and had two daughters, Julie and Deborah. He died of leukemia on August 18th, 1990, and was buried in Mount Auburn Cemetery, Cambridge, Massachusetts.

B. F. Skinner quotes

“We must delegate control of the population as a whole to specialists—to police, priests, teachers, therapists, and so on, with their specialized reinforcers and their codified contingencies”

“Education is what survives when what has been learnt has been forgotten”

“It is a mistake to suppose that the whole issue is how to free man. The issue is to improve the way in which he is controlled”



The Behavioral Psychology Theory That Explains Learned Behavior

Aka the Skinner box


A Skinner box is an enclosed apparatus that contains a bar or key that an animal subject can manipulate in order to obtain reinforcement. Developed by B. F. Skinner and also known as an operant conditioning chamber, this box also has a device that records each response provided by the animal as well as the unique schedule of reinforcement that the animal was assigned. Common animal subjects include rats and pigeons.

Skinner was inspired to create his operant conditioning chamber as an extension of the puzzle boxes that Edward Thorndike famously used in his research on the law of effect . Skinner himself did not refer to this device as a Skinner box, instead preferring the term "lever box."

How a Skinner Box Works

The design of a Skinner box can vary depending upon the type of animal and the experimental variables . It must include at least one lever, bar, or key that the animal can manipulate.

When the lever is pressed, food, water, or some other type of reinforcement might be dispensed. Other stimuli can also be presented, including lights, sounds, and images. In some instances, the floor of the chamber may be electrified.

The Skinner box is usually enclosed, to keep the animal from experiencing other stimuli. Using the device, researchers can carefully study behavior in a very controlled environment. For example, researchers could use the Skinner box to determine which schedule of reinforcement led to the highest rate of response in the study subjects.

Today, psychology students may use a virtual version of a Skinner box to conduct experiments and learn about operant conditioning.

The Skinner Box in Research

Imagine that a researcher wants to determine which schedule of reinforcement will lead to the highest response rates. Pigeons are placed in chambers where they receive a food pellet for pecking at a response key. Some pigeons receive a pellet for every response (continuous reinforcement).

Partial Reinforcement Schedules

Other pigeons obtain a pellet only after a certain amount of time or number of responses have occurred (partial reinforcement). There are several types of partial reinforcement schedules.

  • Fixed-ratio schedule : Pigeons receive a pellet after they peck at the key a certain number of times; for example, they would receive a pellet after every five pecks.
  • Variable-ratio schedule : Subjects receive reinforcement after a random number of responses.
  • Fixed-interval schedule : Subjects are given a pellet after a designated period of time has elapsed; for example, every 10 minutes.
  • Variable-interval schedule : Subjects receive a pellet at random intervals of time.

Once the data has been obtained from the trials in the Skinner boxes, researchers can then look at the rate of responding. This will tell them which schedules led to the highest and most consistent level of responses.
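
A comparison of this kind can be mocked up in a few lines of code. The following simulation is purely illustrative: the schedule parameters (a ratio of 5, a 10-second interval, one press every 2 seconds) are assumptions chosen for the sketch, and no real pigeon data are involved:

```python
import random

def pellets_earned(schedule, n_presses=100, seconds_per_press=2.0, seed=0):
    """Count pellets earned over a session of key presses under one
    partial-reinforcement schedule (parameters are illustrative)."""
    rng = random.Random(seed)
    earned, presses, elapsed = 0, 0, 0.0
    for _ in range(n_presses):
        presses += 1
        elapsed += seconds_per_press
        if schedule == "fixed-ratio":          # every 5th response
            hit = presses == 5
        elif schedule == "variable-ratio":     # on average every 5th response
            hit = rng.random() < 0.2
        elif schedule == "fixed-interval":     # first response after 10 s
            hit = elapsed >= 10
        elif schedule == "variable-interval":  # interval drawn from 5-15 s
            hit = elapsed >= rng.uniform(5, 15)
        else:
            raise ValueError(schedule)
        if hit:                                # reinforcement resets the counters
            earned += 1
            presses, elapsed = 0, 0.0
    return earned

rates = {s: pellets_earned(s) for s in
         ("fixed-ratio", "variable-ratio", "fixed-interval", "variable-interval")}
```

Dividing each count by the session length gives a reinforcement rate per schedule, the same kind of summary a researcher would read off a cumulative recorder.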

Skinner Box Myths

The Skinner box should not be confused with one of Skinner's other inventions, the baby tender (also known as the air crib). At his wife's request, Skinner created a heated crib with a plexiglass window that was designed to be safer than other cribs available at the time. Confusion over the crib's purpose led some to believe that it was an experimental device, a variation of the Skinner box.

At one point, a rumor spread that Skinner had used the crib in experiments with his daughter, leading to her eventual suicide. The Skinner box and the baby tender crib were two different things entirely, and Skinner did not conduct experiments on his daughter or with the crib. Nor did his daughter take her own life.  

A Word From Verywell

The Skinner box is an important tool for studying learned behavior. It has contributed a great deal to our understanding of the effects of reinforcement and punishment.

Operant conditioning chamber . In: APA Dictionary of Psychology. American Psychological Association.

B.F. Skinner Foundation. Biographical information .

Schacter DL, Gilbert DT, Wegner DM. Psychology. 2nd edition. Worth, Inc., 2011.

Ray RD, Miraglia KM. A sample of CyberRat and other experiments: Their pedagogical functions in a learning course . J Behav Neurosci Res . 2011;9(2):44-61.

By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


These 1950s experiments showed us the trauma of parent-child separation. Now experts say they’re too unethical to repeat—even on monkeys.

By Eleanor Cummins

Posted on Jun 22, 2018 7:00 PM EDT


John Gluck’s excitement about studying parent-child separation quickly soured. He’d been thrilled to arrive at the University of Wisconsin at Madison in the late 1960s, his spot in the lab of renowned behavioral psychologist Harry Harlow secure. Harlow had cemented his legacy more than a decade earlier when his experiments showed the devastating effects of broken parent-child bonds in rhesus monkeys. As a graduate student researcher, Gluck would use Harlow’s monkey colony to study the impact of such disruption on intellectual ability.

Gluck found academic success, and stayed in touch with Harlow long after graduation. His mentor even sent Gluck monkeys to use in his own laboratory. But in the three years Gluck spent with Harlow—and the subsequent three decades he spent as a leading animal researcher in his own right—his concern for the well-being of his former test subjects overshadowed his enthusiasm for animal research.

Separating parent and child, he’d decided, produced effects too cruel to inflict on monkeys.

Since the 1990s, Gluck’s focus has been on bioethics; he’s written research papers and even a book about the ramifications of conducting research on primates. Along the way, he has argued that continued lab experiments testing the effects of separation on monkeys are unethical. Many of his peers, from biology to psychology, agree. And while the rationale for discontinuing such testing has many factors, one reason stands out. The fundamental questions we had about parent-child separation, Gluck says, were answered long ago.

The first insights into attachment theory began with studious observations on the part of clinicians.

Starting in the 1910s and peaking in the 1930s, doctors and psychologists actively advised parents against hugging, kissing, or cuddling children, on the assumption that such fawning attention would condition children to behave in a manner that was weak, codependent, and unbecoming. This theory of “behaviorism” was derived from research like Ivan Pavlov’s classical conditioning work on dogs and the work of Harvard psychologist B.F. Skinner, who believed free will to be an illusion. Applied in the context of the family unit, this research seemed to suggest that forceful detachment on the part of ma and pa was an essential ingredient in creating a strong, independent future adult. Parents were simply there to provide structure and essentials like food.

But after the end of World War II, doctors began to push back. In 1946, Dr. Benjamin Spock (no relation to Star Trek’s Mr. Spock) authored Baby and Child Care, an international bestseller that sold 50 million copies in Spock’s lifetime. The book, which was based on his professional observation of parent-child relationships, advised against the behaviorist theories of the day. Instead, Spock implored parents to see their children as individuals in need of customized care—and plenty of physical affection.

At the same time, the British psychiatrist John Bowlby was commissioned to write the World Health Organization’s Maternal Care and Mental Health report. Bowlby had gained renown before the war for his systematic study of the effects of institutionalization on children, from long-term hospital stays to childhoods confined to orphanages.

Published in 1951, Bowlby’s lengthy two-part document focused on the mental health of homeless children. In it, he brought together anecdotal reports and descriptive statistics to paint a portrait of the disastrous effects of the separation of children from their caretakers and the consequences of “deprivation” on both the body and mind. “Partial deprivation brings in its train acute anxiety, excessive need for love, powerful feelings of revenge, and, arising from these last, guilt and depression,” Bowlby wrote. Like Spock’s, this research countered behaviorist theories that structure and sustenance were all a child needed. Orphans were certainly fed, but in most cases they lacked love. The consequences, Bowlby argued, were dire—and long-lasting.

The evidence of the near-sanctity of parent-child attachment was growing thanks to the careful observation of experts like Spock and Bowlby. Still, many experts felt one crucial piece of evidence was missing: experimental data. Since the Enlightenment, scientists have worked to refine their methodology in the hopes of producing the most robust observations about the natural world. In the late 1800s, randomized, controlled trials were developed, and in the 20th century they came to be seen as the “gold standard” for research—a conviction that more or less continues to this day.

While Bowlby had clinically derived data, he knew that to advance his ideas in the wider world, he would need data from a lab. But by 1947, the scientific establishment required informed consent for research participants (though notable cases like the Tuskegee syphilis study violated such rules into at least the 1970s). As a result, no one would condone forcibly separating parents and children for research purposes. Fortunately, Bowlby’s transatlantic correspondent, Harry Harlow, had another idea.

Over the course of his career, Harlow conducted countless studies of primate behavior and published more than 300 research papers and books. Unsurprisingly, in a 2002 ranking of the impact of 20th-century psychologists, the American Psychological Association named him the 26th most cited researcher of the era, below B.F. Skinner (1st) but above Noam Chomsky (38th). But the ethically fraught experiments that cemented his status in Psychology 101 textbooks began in earnest only in the 1950s.

Around the time Bowlby published the WHO report, Harlow began to push the psychological limits of monkeys in myriad ways—all in the name of science. He surgically altered their brains or beamed radiation through their skulls to cause lesions, and then watched the neurological effects, according to a 1997 paper by Gluck that spans history, biography, and ethics. He forced some animals to live in “deep, wedge-shaped, stainless steel chambers… graphically called the ‘pit of despair’” in order to study the effect of such solitary confinement on the mind, Gluck wrote. But Harlow’s most well-known study, begun in the 1950s and carefully documented in pictures and videos made available to the public, centered around milk.

To test the truth of the behaviorists’ claim that things like food mattered more than affection, Harlow set up an experiment that allowed baby monkeys, forcibly separated from their mothers at birth, to choose between two fake surrogates. One, known as the “iron maiden,” was made only of wire but had bottles full of milk protruding from its metal chest. The other was covered in a soft cloth but entirely devoid of food. If the behaviorists were right, babies should choose the surrogate who offered them food over the surrogate who offered them nothing but comfort.

As Spock or Bowlby may have predicted, this was far from the case.

“Results demonstrated that the monkeys overwhelmingly preferred to maintain physical contact with the soft mothers,” Gluck wrote. “It also was shown that the monkeys seemed to derive a form of emotional security by the very presence of the soft surrogate that lasted for years, and they ‘screamed their distress’ in ‘abject terror’ when the surrogate mothers were removed from them.” They visited the iron maiden when they were too hungry to avoid her metallic frame any longer.

As anyone in behavioral psychology will tell you, Harlow’s monkey studies are still considered foundational for the field of parent-child research. But his work is not without controversy. In fact, it never has been. Even when Harlow was conducting his research, some of his peers criticized the experiments, which they considered cruel to the animals and degrading to the scientists who executed them. The chorus of dissenting voices is not new; it’s merely grown.

Animal research today is more carefully regulated by individual institutions, professional organizations like the American Psychological Association, and legislation like the Federal Animal Welfare Act. Many activists and scholars argue that research on primates should end entirely and that experiments like Harlow’s should never be repeated. “Academics should be on the front lines of condemning such work as well, for they represent a betrayal of the basic notions of dignity and decency we should all be upholding in our research, especially in the case of vulnerable populations in our samples—such as helpless animals or young children,” psychologist Azadeh Aalai wrote in Psychology Today.

Animal studies have not disappeared. Research on attachment in monkeys continues at the University of Wisconsin at Madison. But animal studies have declined. New methods—or, depending on how you look at it, old methods—have filled the void. Natural experiments and epidemiological studies, similar to the kind Bowlby employed, have added new insight into the importance of “tender age” attachment.

Romanian orphanages, whose dire conditions came to light after the fall of the country’s communist regime in 1989, have served as such a study site. The facilities, which have been described as “slaughterhouses of the soul,” have historically had great disparities between the number of children and the number of caregivers (25 or more kids to one adult), meaning few if any children received the physical or emotional care they needed. Many of the children who were raised in these environments have exhibited mental health and behavioral disorders as a result. The deprivation has even had a physical effect, with neurological research showing a dramatic reduction in the literal size of their brains and low levels of brain activity as measured by electroencephalography, or EEG, machines.

Similarly, epidemiological research has tracked the trajectories of children in the foster care system in the United States and parts of Europe to see how they differ, on average, from youths in a more traditional home environment. These studies have shown that the risks of mental disorders, suicidal ideation and attempts, and obesity are elevated among these children. Many of these health outcomes appear to be even worse among children in an institutional setting, like a Romanian orphanage, than among children placed in foster care, which typically offers kids more individualized attention.

Scientists rarely say no to more data. After all, the more observations and perspectives we have, the better we understand a given topic. But alternatives to animal models are under development, and epidemiological methodologies are only growing stronger. As a result, we may be able to set some kinds of data—the data collected at the expense of humans or animals—aside.

When it comes to lab experiments on parent-child attachment, we may know everything we need to know—and have for more than 60 years. Gluck believes that testing attachment theory at the expense of primates should have ended with Harry Harlow. And he continues to hope people will come to see the irony inherent in harming animals to prove, scientifically, that human children deserve compassion.

“Whether it is called mother-infant separation, social deprivation, or the more pleasant sounding ‘nursery rearing,'” Gluck wrote in a New York Times op-ed in 2016, “these manipulations cause such drastic damage across many behavioral and physiological systems that the work should not be repeated.”


B. F. Skinner: Lasting Influences in Education and Behaviorism

  • Reference work entry
  • First Online: 22 August 2024
  • pp 1023–1038

Jeff L. Cranmore

B. F. Skinner (1904–1990) was an American psychologist, thinker, and innovator. He has been ranked among the most influential thinkers of the twentieth century. In his life, he wrote 21 books and over 180 articles, ranging from human behavior to verbal behavior to the state of our world today. This does not include the numerous studies, innovations, branches of science, education, and therapies that can be traced back to his work. Even today, his work is referenced and utilized in multiple settings. While not all of Skinner’s ideas were accepted by his peers and contemporaries, there is no doubt he played a leading role in shaping the fields of education and psychology. This chapter cannot capture all of those ideas, but rather gives a broad overview of some of his major accomplishments, the impacts still felt today, and some of the criticism he faced. An in-depth look at his role in behaviorism and education is provided.




Cite this entry:

Cranmore, J.L. (2024). B. F. Skinner: Lasting Influences in Education and Behaviorism. In: Geier, B.A. (eds) The Palgrave Handbook of Educational Thinkers. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-031-25134-4_110

James Coplan, MD

Behaviorism

021. Behaviorism, Part 2. B.F. Skinner: Is All Human Behavior the Result of Prior Conditioning?

Posted September 6, 2010


The views of Thorndike and Watson were refined by their most famous adherent, B.F. Skinner (1904-1990). Skinner built on Thorndike's "law of effect": the tendency of whatever behavior immediately preceded an animal's escape from the cage to increase in frequency. In Thorndike's work, the target behavior and the reward for that behavior were one and the same: escape from the cage. Skinner's genius lay in separating this process into two elements. There was no way to escape from one of Skinner's devices (which have become so popular with animal psychologists that they now go by the nickname "Skinner boxes"). Instead, Skinner provided the opportunity for the animal to obtain other rewards, such as a drink of water or a food pellet. His approach involved specifying the desired behavior beforehand, and then presenting a drink or a food pellet whenever that behavior happened to occur by chance. By Thorndike's law of effect, whatever behavior immediately precedes the reward goes up in frequency.

In one famous experiment, Skinner pushed a button, causing a food pellet to drop into a pigeon's cage, whenever the bird inadvertently raised its head for a second or two. Getting a food pellet was a pleasurable experience, which tended to increase the likelihood that the immediately preceding behavior (in this case, head-raising) would recur. Before long, the pigeon was keeping its head raised continuously. In Skinner's terms, the pigeon's spontaneous, initially random behavior (head raising) operated on the environment - with a little behind-the-scenes help from Skinner - resulting in the release of food. Skinner therefore described his procedure as operant conditioning .
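The reinforcement loop described above (a spontaneous behavior is emitted; the target behavior is rewarded whenever it happens to occur; its frequency climbs) can be sketched in a few lines of Python. This is a toy model with assumed numbers (four candidate behaviors, a fixed propensity bump per reward), not Skinner's actual apparatus or data:

```python
import random

def shape_behavior(target="raise_head", trials=500, bump=0.5, seed=42):
    """Toy operant-conditioning loop: reinforcing a behavior whenever
    it occurs by chance makes it progressively more frequent."""
    rng = random.Random(seed)
    behaviors = ["raise_head", "peck_floor", "turn_left", "flap"]
    # Equal initial propensity for every spontaneous behavior.
    weights = {b: 1.0 for b in behaviors}
    for _ in range(trials):
        # The pigeon emits one behavior, in proportion to its propensity.
        emitted = rng.choices(behaviors,
                              weights=[weights[b] for b in behaviors])[0]
        # "Pushing the button": the pellet strengthens whatever just occurred.
        if emitted == target:
            weights[target] += bump
    return weights

weights = shape_behavior()
print(weights)  # the target's propensity now dwarfs the others
```

Because each reward raises the odds of the target being emitted again, the loop is self-accelerating, which is why shaping works even though the first head-raise is pure chance.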

Using the technique of operant conditioning, Skinner could shape the naturally occurring behavior of his subjects in all sorts of ways. It wasn't necessary to believe that the subject "understood" what was happening. And in the case of pigeons, the subject probably doesn't: it's unlikely that the pigeon is saying to itself "Hmmm... it seems that I get a reward every time I raise my head." But behaviorists - at least "radical behaviorists," such as Skinner - deny the existence of "understanding," as we commonly use the word, for humans as well as pigeons. A behaviorist would say that our subjective perception of the "Aha!" phenomenon, when we suddenly "get the idea" or "figure out what's going on," is just a fiction, a product of successive trials that have reinforced a stimulus-response pattern. (Skinner even attempted to reduce all of language to what he termed "Verbal Behavior." More on this in a later post.) Fortunately, we do not need to embrace radical behaviorism in order to reap the considerable benefits of operant conditioning as an instructional technique for children with ASD.

Go here for an interesting short documentary segment on Skinner: http://www.youtube.com/watch?v=mm5FGrQEyBY

Go here for a modern demonstration of the Skinner box: http://brembs.net/operant/skinnerbox.html

--------------------------------------------------------------------------------------------------

Follow Dr. Coplan on the web: http://www.DrCoplan.com and on Facebook (James Coplan, MD - Developmental Pediatrician / Autistic Spectrum Disorders)


Dr. James Coplan is a developmental pediatrician, with four decades of experience caring for children with special needs and their families.


AT THE SMITHSONIAN

B.F. Skinner’s Pigeon-Guided Rocket

On this date 21 years ago, noted psychologist and inventor B.F. Skinner died; the American History Museum is home to one of his more unusual inventions

Joseph Stromberg


Nose Cone from B.F. Skinner's Pigeon-Guided Missile, on display in "Science in American Life."

It’s 1943, and America desperately needs a way to reliably bomb targets in Nazi Germany. What do we do? For B.F. Skinner, noted psychologist and inventor, the answer was obvious: pigeons.

“During World War II, there was a grave concern about aiming missiles,” says Peggy Kidwell, a curator of Medicine and Science at the American History Museum. “Military officials really wanted to figure out how to aim them accurately.” Skinner approached the National Defense Research Committee with his plan, code-named “Project Pigeon.” Members of the committee were doubtful, but granted Skinner $25,000 to get started.

Skinner had already used pigeons in his psychological research, training them to press levers for food. An obsessive inventor, he had been pondering weapons targeting systems one day when he saw a flock of birds maneuvering in formation in the sky. “Suddenly I saw them as ‘devices’ with excellent vision and extraordinary maneuverability,” he said . “Could they not guide a missile? Was the answer to the problem waiting for me in my own back yard?”

Getting to work, Skinner decided on pigeons because of both their vision and unflappable behavior in chaotic conditions. He built a nose cone for a missile fitted with three small electronic screens and three tiny pigeon cockpits. Onto the screens was projected an image of the ground in front of the rocket.

“He would train street pigeons to recognize the pattern of the target, and to peck when they saw this target,” says Kidwell. “And then when all three of them pecked, it was thought you could actually aim the missile in that direction.” As the pigeons pecked, cables harnessed to each one’s head would mechanically steer the missile until it finally reached its mark. Alas, without an escape hatch, the birds would perish along with their target, making it a kamikaze mission.
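The three-pigeon arrangement Kidwell describes amounts to a redundant control signal: each bird reports the target's offset on its own screen, and the missile steers by their combined signal, so no single stray peck dominates. A minimal sketch, with an assumed averaging rule and steering gain (the historical record does not specify the actual control law):

```python
def steer(pecks, gain=0.5):
    """pecks: three (x, y) peck offsets from screen center, one per pigeon.
    Returns a damped steering correction toward the consensus position."""
    n = len(pecks)
    # Average the three channels so agreement among birds dominates
    # any single stray peck.
    dx = sum(x for x, _ in pecks) / n
    dy = sum(y for _, y in pecks) / n
    return (gain * dx, gain * dy)

# All three birds peck left of center: the correction steers left.
print(steer([(-2.0, 0.0), (-3.0, 0.0), (-1.0, 0.0)]))  # (-1.0, 0.0)
```

Averaging redundant noisy sensors is the same design idea later formalized in voting and sensor-fusion schemes; here it simply keeps one distracted pigeon from flying the missile.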

Despite a successful demonstration of the trained pigeons, officials remained skeptical and eventually decided to terminate the project. Skinner, of course, would go on to become one of the country’s most influential psychologists, popularizing behaviorism, a conception of psychology that views behavior as a reaction to one’s environment.

He also kept inventing. As part of his research, Skinner designed a number of devices that used feedback processes to encourage learning. “After the war, he became very interested in machines for teaching people to do things,” says Kidwell. “In 1954, he had this machine for teaching arithmetic to young people, and in 1957 he designed a machine for teaching Harvard students basic natural sciences.”

Although Skinner’s machines were purely mechanical, the ideas he developed have been incorporated into many educational software programs in recent years, including some used in distance learning settings. “Many of his ideas are now most frequently seen by people as they have been incorporated in electronic testing. That programmed learning, where you have a series of questions, and responses, and based on the response you gave you are directed to the next question, is very much in a Skinnerian framework,” Kidwell says.

Skinner’s missile prototype, along with other teaching machines, came to the Smithsonian at the end of his career. “Skinner was a teacher of Uta C. Merzbach, who was a curator in this museum,” says Kidwell. “They had a very good relationship, so when he was writing his autobiography, when he had finished writing about a particular machine, he would give it to the museum.” The American History Museum is home to several Skinner teaching machines , as well as the missile, which is on display in the “ Science in American Life ” exhibition.

As for the pigeons? Skinner held on to them, and just out of curiosity, occasionally tested them to see if their skills were still sharp enough for battle. One, two, four, and even six years later, the pigeons were still pecking strong.

Joseph Stromberg was previously a digital reporter for Smithsonian.


  18. Operant conditioning chamber

    Skinner box. An operant conditioning chamber (also known as a Skinner box) is a laboratory apparatus used to study animal behavior.The operant conditioning chamber was created by B. F. Skinner while he was a graduate student at Harvard University.The chamber can be used to study both operant conditioning and classical conditioning. [1] [2]Skinner created the operant conditioning chamber as a ...

  19. B. F. Skinner

    Burrhus Frederic Skinner was born on March 20th, 1904, in the town of Susquehanna, Pennsylvania. Skinner's father was a lawyer while his mother stayed at home to take care of the household. He had a younger brother who died from a cerebral aneurism at the age of 16. Since young age, Skinner enjoyed working with his hands.

  20. Understanding Behavioral Psychology: the Skinner Box

    A Skinner box is an enclosed apparatus that contains a bar or key that an animal subject can manipulate in order to obtain reinforcement. Developed by B. F. Skinner and also known as an operant conditioning chamber, this box also has a device that records each response provided by the animal as well as the unique schedule of reinforcement that ...

  21. These 1950s experiments showed us the trauma of parent-child separation

    Unsurprisingly, in a 2002 ranking the impact of 20th century psychologists, the American Psychological Association named him the 26th most cited researcher of the era, below B.F. Skinner (1), but ...

  22. B. F. Skinner: Lasting Influences in Education and Behaviorism

    B. F. Skinner (1904-1990) was an American psychologist, thinker, and innovator. He has been ranked as among the most influential thinkers of the twentieth century. ... Early experiments by Skinner continue to be replicated with similar results, ensuring that the theory of operant conditioning remains an important piece of modern psychology ...

  23. 021. Behaviorism, part 2. BF Skinner

    BF Skinner Is all human behavior is the result of prior conditioning? Posted September 6, 2010 Share. Tweet ... In one famous experiment, Skinner pushed a button, causing a food pellet to drop ...

  24. B.F. Skinner's Pigeon-Guided Rocket

    Members of the committee were doubtful, but granted Skinner $25,000 to get started. Skinner had already used pigeons in his psychological research, training them to press levers for food. An ...