Operant Conditioning

Page 1: Operant Conditioning

This post was republished to My College Class Notes at 2:04:46 AM 5/28/2008

Chapter 4: p149

Focus Questions

What is operant conditioning and how did Skinner study it? Operant behavior “operates” on the environment in accord with contingencies. Operant conditioning is based on contingencies that are arranged in the lab or occur in real life. The controlled environment of the Skinner box revolutionized the study of learning and conditioning.

Parallels between classical and operant conditioning occur in the areas of extinction and spontaneous recovery, as well as stimulus generalization and discrimination.

Shaping by successive approximations is an efficient procedure for training subjects to perform specific behaviors.

An operant is a class of behaviors—not a specific behavior.

What basic terms and procedures are involved in operant conditioning? The first half of Thorndike’s law of effect corresponds to positive reinforcement and negative reinforcement; the second half corresponds to positive punishment and negative punishment. In operant conditioning contingencies, positive means that a stimulus is presented or “added” and negative means that a stimulus is removed or subtracted. The effect on behavior is then determined by whether the stimulus is appetitive or aversive.
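The 2 x 2 logic above (stimulus added vs. removed, appetitive vs. aversive) can be kept straight with a small lookup. The sketch below is only an illustration of that logic; the function name and labels are hypothetical, not terms from the text.

```python
# Illustrative sketch (not from the textbook): the four operant
# contingencies as a 2x2 lookup over (stimulus change, stimulus type).

def classify_contingency(stimulus_change: str, stimulus_type: str) -> str:
    """stimulus_change: 'added' or 'removed'; stimulus_type: 'appetitive' or 'aversive'."""
    table = {
        ("added", "appetitive"): "positive reinforcement (behavior strengthened)",
        ("removed", "aversive"): "negative reinforcement (behavior strengthened)",
        ("added", "aversive"): "positive punishment (behavior weakened)",
        ("removed", "appetitive"): "negative punishment (behavior weakened)",
    }
    return table[(stimulus_change, stimulus_type)]

print(classify_contingency("removed", "aversive"))
# -> negative reinforcement (behavior strengthened)
```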

OPERANT CONDITIONING: DEFINITIONS

Contingency: The relationship between behavior and its consequences.

Operant conditioning: The imposition of contingencies, either deliberate or natural.

Page 2: Operant Conditioning

Shaping and successive approximations: A procedure for quickly establishing a contingency, such as bar pressing by rats or key pecking by pigeons, by rewarding successive approximations to the target behavior.
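One rough way to picture this procedure is as a loop that reinforces any response closer to the target than the current criterion and then tightens the criterion. The sketch below is a toy illustration with a made-up numeric "behavior"; the values and variable names are assumptions, not a procedure from the notes.

```python
# Toy illustration of shaping by successive approximations.
# The numeric "behavior" and all parameter values are hypothetical.

import random

target = 10.0      # hypothetical full target behavior (e.g. a complete bar press)
criterion = 0.0    # start by reinforcing any response above a low criterion
best = 0.0         # the subject's current typical response level

for trial in range(1, 501):
    response = best + random.uniform(-1.0, 1.5)  # responding varies from trial to trial
    if response > criterion:                     # a closer approximation to the target...
        best = response                          # ...is reinforced, so it becomes more likely
        criterion = min(target, best)            # tighten the criterion toward the target
    if best >= target:
        print(f"target behavior shaped after {trial} trials")
        break
```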

Positive reinforcement: An operant conditioning contingency in which behavior is strengthened because it results in presentation of an appetitive stimulus; also known as reward training.

Negative reinforcement: An operant conditioning contingency in which behavior is strengthened because it results in removal of an aversive stimulus; also known as escape or active avoidance training.

Positive punishment: An operant conditioning contingency in which behavior is weakened or suppressed because it results in presentation of an aversive stimulus; also known as passive avoidance training.

Negative punishment: An operant conditioning contingency in which behavior is weakened or suppressed because it results in removal of an appetitive stimulus; also known as omission training.

Page 3: Operant Conditioning

The Skinner box: American psychologist B. F. Skinner designed an apparatus, now called a Skinner box, that allowed him to formulate important principles of animal learning. An animal placed inside the box is rewarded with a small bit of food each time it makes the desired response, such as pressing a lever or pecking a key. A device outside the box records the animal’s responses.

The most forceful leader of behaviorism was B. F. Skinner, an American psychologist who began studying animal learning in the 1930s. Skinner coined the term reinforcement and invented a new research apparatus, called the Skinner box, for use in testing animals. Based on his experiments with rats and pigeons, Skinner identified a number of basic principles of learning. He claimed that these principles explained not only the behavior of laboratory animals but also how human beings learn new behaviors or change existing behaviors. He concluded that nearly all behavior is shaped by complex patterns of reinforcement in a person’s environment, a process that he called operant conditioning (also referred to as instrumental conditioning). Skinner’s views on the causes of human behavior made him one of the most famous and controversial psychologists of the 20th century.

Page 4: Operant Conditioning

Chapter 4: Operant Conditioning Phenomena and Applications: p154

Focus Questions

How are timing and consistency of operant conditioning contingencies important? Laboratory animals, young children, and others who lack cognitive and language skills cannot mediate delays in reinforcement or punishment, so delayed contingencies tend to be ineffective. Older children and adults can mediate such delays, so delayed contingencies remain effective for them.

Learning is faster with continuous reinforcement, but partial reinforcement produces behavior that is more resistant to extinction. Parents who partially reinforce tantrum behavior eventually find it very difficult to eliminate.

Ratio schedules require a number of responses before reinforcement occurs; interval schedules require that an amount of time passes before reinforcement occurs.

Fixed schedules require a specific number of responses or amount of time; variable schedules require a number of responses or amount of time that varies around an average value.

Schedules of Reinforcement

A Variable Ratio Schedule produces rewards irregularly. The criterion for reinforcement changes; it varies around an average number of responses. The amount of work required per reinforcement varies somewhat randomly within certain limits (Carpenter, 1974).

Examples of Variable Ratio Schedules

1. A slot machine yields returns on an irregular basis.

2. Pigeons will peck for hours at a rate of five times per second. Reinforcement might be given after three pecks, then after seven, then five, then four, and then one.

Implementing a Variable Ratio Schedule will eliminate the post-reinforcement pause. In a Variable Ratio Schedule, the average number of responses between reinforcements is predetermined by the trainer. A Variable Ratio-10 Schedule (VR-10) means that, on average, reinforcement follows every 10th desired behavior, but it might come after only one desired behavior or after the 20th desired behavior.
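To make the VR-10 idea concrete, the sketch below draws the number of responses required for each successive reinforcement from a range that averages 10. The function name and the exact range (1 to 19) are assumptions chosen for the demo, not values from the notes.

```python
# Illustrative sketch of a variable-ratio (VR-10) schedule: each reinforcement
# requires a randomly varying number of responses that averages 10.
# The 1-19 range is an assumption for the demo.

import random

def vr_requirements(mean_ratio: int = 10, n_reinforcers: int = 8) -> list[int]:
    """Responses required before each successive reinforcement."""
    return [random.randint(1, 2 * mean_ratio - 1) for _ in range(n_reinforcers)]

reqs = vr_requirements()
print("responses per reinforcement:", reqs)
print("average ratio:", sum(reqs) / len(reqs))  # approaches 10 over many reinforcers
```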

Slot machine gambling is typically under the control of Variable Ratio schedules in order to generate steady behavior, or a steady response rate. The behavior of dropping coins in slot machines is maintained at a high, steady level by the payoff, which is delivered only after an unknown, varying number of coins have been deposited. VR Schedules leave the gambler guessing when the reward will come. Therefore the gambler continues to gamble that the payoff (the reinforcement) will come after the next coin is deposited (Rachlin, 1990).

Page 6: Operant Conditioning

Under Interval Schedules, an amount of time must elapse before reinforcement becomes available; the next desired behavior emitted after that time is followed by positive reinforcement. On a Fixed Interval (FI) Schedule, positive reinforcement is delivered for the first response made after a fixed period of time. On an FI-10 Schedule, the individual must wait 10 seconds before the desired behavior will be reinforced. These schedules also generate post-reinforcement pauses.

To eliminate the post-reinforcement pauses produced by Fixed Interval Schedules, switch to Variable Interval (VI) Schedules. For Variable Interval Schedules, the average interval is predetermined. For example, on a VI-20 schedule, reinforcement becomes available on average once every 20 seconds. This type of schedule generates a fairly steady response rate.
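One way to see the FI/VI difference is to generate the successive intervals after which reinforcement becomes available under each schedule. The sketch below is a toy comparison; the 20-second value matches the examples above, but the uniform 5-35 second range for VI is an assumption for the demo.

```python
# Toy comparison of FI-20 and VI-20: the wait before reinforcement next
# becomes available after each reinforcement. The first response emitted
# after the interval elapses is the one reinforced.

import random

def fi_intervals(fixed: float = 20.0, n: int = 6) -> list[float]:
    return [fixed] * n  # always exactly 20 s on FI-20

def vi_intervals(mean: float = 20.0, n: int = 6) -> list[float]:
    # intervals vary but average 20 s on VI-20 (uniform range is an assumption)
    return [random.uniform(mean - 15.0, mean + 15.0) for _ in range(n)]

print("FI-20:", fi_intervals())
print("VI-20:", [round(t, 1) for t in vi_intervals()])
# The unpredictable VI intervals are what remove the post-reinforcement pause.
```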

Summary

Operant conditioning, as developed by B. F. Skinner, examines how consequences influence subsequent behavior. Positive reinforcement strengthens desirable behavior; punishment is used to eliminate undesirable behavior. Shaping is used to mold new behavior. A Continuous Reinforcement Schedule is used initially, when the desired behavior is new, but Intermittent Schedules are more practical once the new behavior has been shaped.

OPERANT CONDITIONING PHENOMENA AND APPLICATIONS: DEFINITIONS

Continuous reinforcement

Reinforcing every instance of a behavior.

Partial reinforcement

Reinforcing only some instances of a behavior.

Ratio schedule: A partial reinforcement schedule in which reinforcement occurs only after a number of responses.

Interval schedule: A partial reinforcement schedule in which reinforcement occurs only for the first response after an amount of time has elapsed.

Fixed schedule

A partial reinforcement schedule in which reinforcement occurs after a specific number of responses or for the first response after a specific amount of time.

Variable schedule

A partial reinforcement schedule in which reinforcement occurs after a varying number of responses or for the first response after a varying amount of time, in each case around some average value.

Page 7: Operant Conditioning

Superstitious behavior

Behavior that occurs and persists in the absence of any actual contingency.

Behavior modification

A technique for changing behavior based on operant conditioning principles; also called behavior therapy in clinical settings.

Token Economy

A behavior modification procedure in which adaptive behavior is reinforced with tokens that can later be exchanged for privileges and other rewards.
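As a concrete, hypothetical illustration of this procedure, the sketch below keeps a simple token ledger: tokens are delivered for named adaptive behaviors and later exchanged for privileges. The behavior names, token values, and class design are assumptions for the demo, not part of the notes.

```python
# Hypothetical token-economy ledger (behaviors, values, and prices are
# illustrative assumptions, not from the notes).

class TokenEconomy:
    def __init__(self, earn_values: dict[str, int], prices: dict[str, int]):
        self.earn_values = earn_values  # tokens earned per adaptive behavior
        self.prices = prices            # token cost of each privilege
        self.balance = 0

    def reinforce(self, behavior: str) -> None:
        """Deliver tokens immediately after an instance of adaptive behavior."""
        self.balance += self.earn_values[behavior]

    def exchange(self, privilege: str) -> bool:
        """Trade accumulated tokens for a backup reward, if affordable."""
        cost = self.prices[privilege]
        if self.balance >= cost:
            self.balance -= cost
            return True
        return False

economy = TokenEconomy(
    earn_values={"completed homework": 2, "made bed": 1},
    prices={"30 min screen time": 5},
)
economy.reinforce("completed homework")
economy.reinforce("completed homework")
economy.reinforce("made bed")
print(economy.exchange("30 min screen time"), economy.balance)  # True 0
```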

Instinctive Behavior

As defined by ethologists, a behavior that occurs in all normal members of a species, in response to specific releasing stimuli, and in essentially the same way every time.

Ethology: The study of instinctive behavior in the lab and in natural environments.
