Extinction of the Operant Behavior

B.F. Skinner was a psychologist born in Susquehanna, Pa. He studied at Harvard and later taught there (1931-36, 1947-74). A leading behaviorist, he was a proponent of operant conditioning and the inventor of the Skinner box for facilitating experimental observations. Skinner's entire system is based on operant conditioning. The organism is in the process of "operating" on the environment, which in ordinary terms means it is bouncing around the world, doing what it does.

During this "operating", the organism encounters a special kind of stimulus, called a reinforcing stimulus, or simply a reinforcer. This special stimulus has the effect of increasing the operant, which is the behavior occurring just before the reinforcer. This is operant conditioning: "the behavior is followed by a consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future". Say you have a dog who is just playing around with his toys, and when you throw a toy at him and he catches it, you give him a treat. All of a sudden the dog starts catching toys whenever you throw them in the air or at his mouth. The operant is the behavior just prior to the reinforcer, which is the treat.

Then what if you decide to stop giving the dog treats? Well, he'll stop the little trick that you, the owner, were enjoying. This is called extinction of the operant behavior. Now, if you were to start showing the dog treats again, he'll most likely want to start doing the tricks again, and he'll relearn them a little more quickly than he learned them at first. This is because the return of the reinforcer takes place in the context of a reinforcement history that goes all the way back to the very first time the dog was reinforced for performing the tricks.
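The acquisition-and-extinction story can be sketched as a toy simulation. The update rule and all the numbers below are my own invented assumptions, not anything Skinner proposed; the point is just that reinforced trials push the response probability up while unrewarded trials let it drift back down:

```python
# Toy model of operant conditioning and extinction. The learning and
# decay rates are illustrative assumptions, not Skinner's own math.

def update(p, reinforced, learn=0.3, decay=0.2):
    """Raise the response probability after a reinforced trial,
    lower it when the reinforcer is withheld (extinction)."""
    if reinforced:
        return p + learn * (1 - p)   # move toward 1
    return p * (1 - decay)           # drift back toward 0

p = 0.05                             # dog rarely catches toys at first
for _ in range(10):                  # ten treats: acquisition
    p = update(p, reinforced=True)
acquired = p                         # catching is now very likely
for _ in range(10):                  # ten unrewarded trials: extinction
    p = update(p, reinforced=False)  # the trick fades away
```

The fast relearning the essay mentions is not captured by this caricature, since real reinforcement history is more than a single probability, but the rise-and-fall shape of acquisition and extinction comes through.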

Continuous reinforcement is the original scenario: every time the dog performs the behavior (such as a trick), he gets a treat. The fixed ratio schedule was the first one Skinner discovered: if the dog does the trick three times, say, he gets a goodie. Or five times. Or twenty times. Or "x" times. There is a fixed ratio between behaviors and reinforcers: 3 to 1, 5 to 1, 20 to 1, etc. This is a little like "piece rate" in the clothing manufacturing industry: you get paid so much for so many shirts. Skinner also looked at variable schedules. Variable ratio means you change the "x" each time: first it takes 3 tricks to get a treat, then 10, then 1, then 7, and so on.

Variable interval means you keep changing the time period between available reinforcers -- first 20 seconds, then 5, then 35, then 10, and so on. In both cases, it keeps the dog on his little toes. With the variable interval schedule, the dog no longer "paces" himself, because he can no longer establish a "rhythm" between behavior and reward. Most importantly, these schedules are very resistant to extinction.
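The ratio schedules can be sketched as little decision rules. This Python sketch is illustrative only; the `fixed_ratio` and `variable_ratio` helpers and their parameter values are assumptions of mine, not anything taken from Skinner's apparatus:

```python
import random

def fixed_ratio(n):
    """Reinforce every n-th response: a 3-to-1, 5-to-1, ... schedule."""
    count = 0
    def schedule(response):
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True          # treat delivered
        return False
    return schedule

def variable_ratio(mean_n):
    """Reinforce after a varying number of responses averaging mean_n,
    so the dog can never establish a rhythm."""
    count, target = 0, random.randint(1, 2 * mean_n - 1)
    def schedule(response):
        nonlocal count, target
        count += 1
        if count >= target:
            count = 0
            target = random.randint(1, 2 * mean_n - 1)
            return True          # treat delivered, new hidden target drawn
        return False
    return schedule

fr3 = fixed_ratio(3)
treats = [fr3("trick") for _ in range(9)]   # every third trick pays off
```

Under `fixed_ratio(3)` the dog could learn to pace himself toward each third trick; under `variable_ratio` the next payoff is unpredictable, which is one intuition for why variable schedules resist extinction.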

It makes sense, if you think about it. If you haven't gotten a reinforcer for a while, well, it could just be that you are at a particularly "bad" ratio or interval. A question Skinner had to deal with was how we get to more complex sorts of behaviors. He responded with the idea of shaping, or "the method of successive approximations". Basically, it involves first reinforcing a behavior only vaguely similar to the one desired.

Once that is established, you look out for variations that come a little closer to what you want, and so on, until you have the animal performing a behavior that would never show up in ordinary life. Skinner and his students were quite successful in teaching simple animals to do some quite extraordinary things. Now that I think about it, I actually saw my mother use shaping on my brother once. When we were younger, my brother was afraid of the dark, so he liked to keep his bedroom door cracked open with the hall light shining through.

So my mother started closing the door just a little bit more every night, and after a couple of weeks he was sleeping like a baby with the door closed and the lights off. This is the same method used in the therapy called systematic desensitization, invented by another behaviorist named Joseph Wolpe. A person with a phobia -- say of spiders -- would be asked to come up with ten scenarios involving spiders and panic of one degree or another. The first scenario would be a very mild one -- say seeing a small spider at a great distance outdoors. The second would be a little scarier, and so on, until the tenth scenario would involve something totally terrifying -- say a tarantula climbing on your face while you're driving your car at a hundred miles an hour!

The therapist will then teach you how to relax your muscles -- which is incompatible with anxiety. After you practice that for a few days, you come back, and you and the therapist go through your scenarios one step at a time, making sure you stay relaxed, backing off if necessary, until you can finally imagine the tarantula while remaining perfectly tension-free. Beyond these fairly simple examples, shaping also accounts for the most complex of behaviors. You don't, for example, become a brain surgeon by stumbling into an operating theater, cutting open someone's head, successfully removing a tumor, and being rewarded with prestige and a hefty paycheck, along the lines of the dog and treats. Instead, you are gently shaped by your environment to enjoy certain things, do well in school, take a certain bio class, see a doctor movie perhaps, have a good hospital visit, enter med school, be encouraged to drift toward brain surgery as a specialty, and so on. This could be something your parents were carefully doing to you.
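The shaping idea -- reinforce only the variations that come closer to the target -- can be caricatured as a simple hill climb. The `shape` helper and the "door gap" numbers below are invented for the bedroom-door story, purely as a sketch:

```python
import random

# A caricature of shaping ("successive approximations"): natural
# variations occur, and only the ones closer to the target "stick",
# because only they get reinforced. All numbers are invented.

def shape(start, target, trials=300):
    """Reinforce only the variations that land closer to the target."""
    behavior = start
    for _ in range(trials):
        variation = behavior + random.uniform(-3, 3)  # natural variation
        if abs(variation - target) < abs(behavior - target):
            behavior = variation   # the closer approximation is reinforced
    return behavior

random.seed(1)
gap = shape(start=30.0, target=0.0)   # door gap in cm, night after night
```

Each accepted variation becomes the new baseline, just as each slightly-more-closed door became the new normal, until the behavior ends up somewhere it would never have gone in one jump.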

But much more likely, this is something that was more or less unintentional. An aversive stimulus is the opposite of a reinforcing stimulus: something we might find unpleasant or painful. (A behavior followed by an aversive stimulus results in a decreased probability of the behavior occurring in the future.) This both defines an aversive stimulus and describes the form of conditioning known as punishment. If you swat your kid for doing x, he'll do a lot less of x. If you spank Johnny for throwing his toys, he will throw his toys less and less (maybe). On the other hand, if you remove an already active aversive stimulus after Johnny performs a certain behavior, you are doing negative reinforcement.

If you turn off the electricity when the rat stands on his hind legs, he'll do a lot more standing. If you stop your perpetual nagging when I finally take out the garbage, I'll be more likely to take out the garbage (perhaps). You could say it "feels so good" when the aversive stimulus stops that this serves as a reinforcer! Notice how difficult it can be to distinguish some forms of negative reinforcement from positive reinforcement: if I starve you, is the food I give you when you do what I want a positive reinforcer? Or is it the removal of a negative, the aversive stimulus of hunger? Skinner (contrary to some stereotypes that have arisen about behaviorists) didn't "approve" of the use of aversive stimuli -- not because of ethics, but because they don't work well!

Notice that I said earlier that Johnny will maybe stop throwing his toys, and that I perhaps will take out the garbage? That's because whatever was reinforcing the bad behaviors hasn't been removed, as it would've been in the case of extinction. This hidden reinforcer has just been "covered up" with a conflicting aversive stimulus. So, sure, sometimes the child (or I) will behave -- but it still feels good to throw those toys. All Johnny needs to do is wait till you're out of the room, or find a way to blame it on his brother, or in some way escape the consequences, and he's back to his old ways. B.F. Skinner made numerous contributions to the science of behavior. He strongly influenced the area of learning that he named operant conditioning.

His Skinner box is now a standard apparatus for the experimental study of animal behavior. Much of his work involved the study of how reinforcement schedules influence learning and behavior.