B. F. Skinner's Molecular Interpretations

Jack Michael, WMU

TxABA, Houston, Saturday, 3/5/05



This distinction is currently introduced in learning textbooks (e.g., Catania, 1998) in terms of

a. the reinforcement for avoidance responding without a warning stimulus, and
b. why response frequency "matches" rfmt frequency (the matching law).

These issues arose around 1960-61.

Molecular interpretation: Behavior is due to consequences over a brief period following each response (and any long term effects are due to the accumulation of the immediate effects).

Molecular is usually contrasted with molar.

Molar interpretation: Behavior is due directly to consequences over the long term.

The Skinner interpretations that I present today occurred before the molecular-molar contrast became explicit (Are Theories of Learning Necessary?, 1950; Science and Human Behavior, 1953).

Still, Skinner was much concerned with the events immediately following reinforcement. Here is the statement in Schedules of Reinforcement (Ferster & Skinner, 1957, p. 3):

"Under a given schedule of rfmt, it can be shown that at the moment of reinforcement a given set of stimuli [including those resulting from the recent behavior of the organism] will usually prevail. . . . Reinforcement occurs in the presence of such stimuli, and the future behavior of the organism is in part controlled by them or by similar stimuli according to a well-established principle of operant discrimination."

Pigeon Operant Chamber

Rfmt unavailable: aperture light off, grain hopper down where grain cannot be accessed.

[Diagram: food aperture, aperture light, grain hopper (down), pecking key, key lights]

Pigeon Operant Chamber

Rfmt available: aperture light on, grain hopper up. After 3 sec, the light goes off and the hopper goes back down.

[Diagram: food aperture, aperture light, grain hopper (up), pecking key, key lights]


Fixed interval scallop: Low rate immediately after reinforcement, then increasing up to the time the next rfmt is due, then low after rfmt, and so on.

[Cumulative record: total responses vs. time, with a hash mark at each reinforcement]

Skinner's explanation in terms of stimulus control: The stimulus conditions immediately after rfmt--food dust on the beak, residual effects of rapid head movements, etc.--have become an S∆ (S delta) for pecking, because responses have never been reinforced in the presence of those stimulus conditions.

[Figure: fixed interval 10 min rfmt; time axis marked at 10', 20', 30']

Why the low rate after rfmt?

Because the bird knows that it will not get any grain for pecking after reinforcement? Cognitive explanation.
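The stimulus-control account of the scallop can be illustrated with a toy simulation. This is purely my own sketch (the function name, the linear-growth assumption, and all numbers are illustrative, not Skinner's data): if the per-second probability of a peck simply grows with time since the last reinforcement, the cumulative record comes out scalloped.

```python
# Toy model of the fixed-interval scallop (illustrative assumption,
# not Skinner's data): response probability is low just after
# reinforcement and grows as the next reinforcement comes due.
import random

def fi_scallop(interval=600, n_intervals=3, p_max=0.5, seed=0):
    """Return a cumulative record (total responses at each second).

    Assumed model: per-second response probability rises linearly
    from 0 just after reinforcement to p_max at `interval` seconds,
    when the next reinforcement is delivered.
    """
    rng = random.Random(seed)
    cumulative, total = [], 0
    for _ in range(n_intervals):
        for t in range(interval):  # seconds since last reinforcement
            if rng.random() < p_max * t / interval:
                total += 1
            cumulative.append(total)
    return cumulative

record = fi_scallop()
# The record climbs slowly early in each interval and steeply late,
# reproducing the low-then-increasing rate described above.
```

Plotting `record` against time gives the scalloped cumulative curve of the figure.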

[Figure: responses vs. minutes (0-30) for 30-min extinction sessions on Days 1, 2, and 3]

Response rate at the start of extinction session 2 is greater than at the end of ext. session 1. And greater at the start of extinction session 3 than at end of extinction session 2. And so on.

[Labels mark the end of extinction sessions 1 and 2 and the start of extinction sessions 2 and 3]

Spontaneous Recovery: What is it?

*For a thorough treatment of spontaneous recovery and its relation to the concept of inhibition see Catania, 1998.

[Figure repeated: responses vs. minutes (0-30) for 30-min extinction sessions on Days 1, 2, and 3]

Spontaneous Recovery (cont'd.): Why does it happen?

Some theories (Pavlov, Hull, and others) contended that responding without reinforcement generates a form of inhibition* (a hypothesized neurochemical substance, or a hypothetical entity of some sort), but that with the passage of time this entity dissipates.

Skinner offers another explanation based on operant stimulus control (Skinner, 1950, p. 85).

Stimulus Change Decrement: After an operant function-altering operation (reinforcement, extinction, punishment, recovery from punishment, and others), the changed function is seen at its maximum value when the stimulus conditions are exactly the same as during the function-altering operation.

Any change from those conditions results in a decrement in the changed function.

When the changed function is an increase in responding due to reinforcement, then a stimulus change results in less behavior than if the stimuli were the same as during reinforcement.

When the changed function is a decrease due to extinction or punishment, then a stimulus change results in more behavior than if the stimuli were the same as during extinction or punishment.
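Both directions of the principle can be captured in a one-line model. This is a sketch under my own assumptions (the function name, the linear interpolation, and the numbers are illustrative, not anything from the talk): responding under changed stimuli falls between its pre-operation level and the level produced under the exact training stimuli, in proportion to stimulus similarity.

```python
# Sketch of stimulus change decrement (the linear interpolation and
# all numbers are illustrative assumptions, not Skinner's).
def observed_responding(before, after_operation, similarity):
    """Response strength under test stimuli.

    before          -- strength prior to the function-altering operation
    after_operation -- strength under the exact training stimuli
    similarity      -- 1.0 for identical stimuli, < 1.0 after any change
    """
    return before + (after_operation - before) * similarity

# Reinforcement raised responding from 0 to 100: a stimulus change
# yields LESS behavior than under the training stimuli.
observed_responding(0, 100, 1.0)   # 100.0: stimuli unchanged
observed_responding(0, 100, 0.4)   # 40.0: decrement after a change

# Extinction lowered responding from 100 to 5: a stimulus change
# yields MORE behavior than under the extinction stimuli.
observed_responding(100, 5, 0.4)   # 62.0: responding partly recovers
```

Any change from the training conditions moves the observed responding back toward its pre-operation level, which is exactly the asymmetry the two bullets above describe.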

Demonstration of stimulus change decrement with respect to extinction

• Nine pigeons were given a history of variable ratio (VR) reinforcement for pecking a yellow triangle.

• In the session at right the triangle is yellow for the first 30 minutes, with more than 1100 responses per bird.*

• When extinction starts, the key color is changed to red.

• After 15 minutes the color was changed back to yellow.

*Group data: the curve is based on the responses of all 9 birds.

[Figure: cumulative responses (0-1800) vs. time in minutes (0-60); VR rfmt, then extinction]

Spontaneous recovery analogy: a hypothetical experiment that is more like the actual spontaneous recovery situation

• Phase 1: Pigeon is placed in the experimental chamber with a red ceiling light flashing, but the light fades to off in 2 minutes.

• Key pecking gets VI 30 sec rfmt in both the presence and the absence of the flashing red light.

[Figure: cumulative responses (0-300) vs. time in minutes (0-10) under VI 30" rfmt, with the intensity of the flashing red light shown below: bright at session start, fading to off in 2 min, then off]

[Figure: extinction session; the intensity of the red flashing light is high at the start, fades to off, and is turned on again late in the session]

• Phase 2: Extinction session, with the flashing red light on at the beginning of the session but fading rapidly to off in 2 min, as during reinforcement sessions.

• There is thus only 2 min of extinction in the flashing light before it goes off,

• then about 8 min of extinction with the light off.

• Then after 10 minutes the flashing light is turned on.

• Responding recovers, as in Skinner's procedure.

Spontaneous recovery analogy (cont'd.)

[Figure: intensity of stimuli from handling across an extinction session, high at the start and fading to off; the bird is removed from the chamber and put back the next day, producing spontaneous recovery]

• At the beginning of reinforcement sessions there are residual stimulus effects from being removed from the home cage, transported to the experimental chamber, etc.

• These rapidly fade to off just like the flashing red light in the hypothetical experiment.

Spontaneous recovery analogy (cont'd.)

• During an extinction session there is only a small extinction history in the presence of the residual handling stimuli.

• There is much more extinction in the absence of these stimuli.

• Thus more behavior occurs when they are again present at the beginning of the next ext. session than was occurring at the end of the previous ext. session in their absence.
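The handling-stimulus account can be sketched as a small simulation. Everything here is an assumption of mine for illustration (the exponential-decay model, the function name, and the numbers are not from the talk): extinction is tallied separately for "handling stimuli present" and "handling stimuli absent," and responding at any moment reflects how much extinction has occurred under the prevailing condition.

```python
# Sketch of Skinner's stimulus-control account of spontaneous recovery
# (the exponential-decay model and all numbers are illustrative
# assumptions, not data from the talk).
def extinction_sessions(n_sessions=3, minutes=30, fade_minutes=2, decay=0.05):
    """Return (start_rate, end_rate) pairs for successive sessions.

    Response strength is tracked separately for the two stimulus
    conditions: residual handling stimuli present (the first
    `fade_minutes` of each session) and absent (the rest).
    """
    strength = {"handling": 1.0, "alone": 1.0}
    records = []
    for _ in range(n_sessions):
        start_rate = strength["handling"]  # handling stimuli prevail at first
        for minute in range(minutes):
            condition = "handling" if minute < fade_minutes else "alone"
            strength[condition] *= 1 - decay  # unreinforced responding extinguishes
        records.append((start_rate, strength["alone"]))
    return records

records = extinction_sessions()
# Each session starts above where the last one ended (spontaneous
# recovery), because little extinction has accrued in the presence of
# the handling stimuli -- yet session starts still decline across days.
```

The model reproduces both features of the Day 1-3 figure: recovery at each session's start, and a gradual decline in the recovered level across sessions.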

Somewhat controversial. Some studies (e.g., Welker & McAuley, 1978) support Skinner's interpretation, but some (e.g., Thomas & Sherman, 1986) do not.

But the analysis illustrates Skinner's broad interpretation of the stimulus, and his concern for environment-behavior details.

How About "Late" Spontaneous Recovery?

Pigeons were given 1-hour reinforcement sessions, then very brief extinction sessions, until no responding occurred in the brief sessions. Then, when a session lasted the usual duration, responding occurred in the later part of the session. Extinction had occurred in the presence of the stimuli of having just been placed in the chamber, but there had been no extinction in the presence of stimuli consisting of having been in the chamber for a while. (Kendall, 1965)

Spontaneous Recovery (still cont'd.)

In the introduction to About Behaviorism (1974, pp. 4-5) Skinner lists 20 common objections to behaviorism or to the science of behavior, all of which he asserts, and later argues, are wrong.

Objections 10 and 11 pertain to the move from the animal laboratory to human behavior.

Objections

10. It works with animals but not with people, therefore its picture of human behavior is confined to those features which humans share with animals.

11. Its achievements under laboratory control cannot be duplicated in daily life.

These are cited when we refer to the animal research literature as a basis for solving a problem in human behavior.

"The Analysis of Complex Cases" (Ch. 14, SHB)

A popular way to resist a behavioral approach is to cite a common event that contradicts a behavioral principle (with, I think, the hope that the whole behavioral thing will go away).

Such examples often depend on failure to recognize the multiple control of behavior. One type of multiple control consists in an independent variable having more than one effect on behavior.

For example, a single occurrence of an aversive stimulus may:

Elicit unconditioned respondent behavior (painful S elicits heart rate changes, GSR, pupillary dilation, etc.)

Respondently condition the organism so that a neutral S will have effects similar to those of the aversive S (make the neutral S into a CS for autonomic, and perceptual Rs).

Evoke any behavior that has in the past terminated similar aversive stimuli (function as an MO/EO).

Decrease the future frequency of any behavior that immediately precedes the occurrence of the aversive S (function as a punisher).

With some forms of reinforcement, the strength of the motivating/establishing operation is decreased as a function of consumption or contact with the reinforcer, and behavior evoked by that MO/EO becomes less frequent.

Food ingestion results in a decrease in food-reinforced behavior.

The critic says, "But here is an example--giving a small child a piece of candy--where satiation doesn't work, so the principle must be invalid (and perhaps we can forget about all this behavioral stuff!)."

The Principle of Satiation

A person gives a small piece of candy to a child who is playing happily by himself (the person has given candy before).

Giving a Child a Piece of Candy

Much objectionable behavior emerges--asking for more candy, crying if it is not provided, perhaps a temper tantrum.

We appear to have increased the relevant establishing operation, although our definition of satiation implies that we should have decreased it.

But the sight & taste of candy is a stimulus condition with another effect besides satiation.

It also functions as an SD (discriminative stimulus) for further asking. More than one piece of candy has usually been available at a time.

Now assume that repeated candy-seeking is unsuccessful, a situation which evokes emotional behavior.

Discriminative (SD), satiating, and emotional effects can be separated by never giving more than one piece of candy at a time.

Then one piece will not be an SD for further asking, and the emotional behavior will have extinguished (or never been reinforced in the first place).

It should then be possible to demonstrate a small decrease in the evocative strength of the establishing operation.

Social Behavior (Ch. 19 of SHB)

Many social scientists believe that human social behavior requires its own special science.

From his behavioral perspective, Skinner argues that no social phenomena emerge that cannot be understood in terms of the way one person's behavior is affected by the behavior of another person, and vice versa.

One person's behavior can affect another's as a conditioned eliciting stimulus (CS), an operant discriminative stimulus (SD), a conditioned reinforcer (Sr), a conditioned punisher (Sp), and a conditioned establishing operation (CEO).

The relations may be based on very complex contingency histories, but the contingencies don't differ in principle from those of the nonsocial environment.

To counteract this view, social behavior is often described which seems beyond the scope of behavior analysis--for example, catching someone's eye.

The surprising power of an apparently trivial event is the common experience of catching someone's eye (in a flirtation, under amusing circumstances, at a moment of common guilt, and so on--Skinner's examples).

The change in behavior which follows may be considerable. This has led to the belief that some nonphysical 'understanding' passes between persons.

But the rfmt history offers an alternative explanation. This is a stimulus that is very important because of the contingencies in which it is involved.

Catching Someone's Eye (cont'd.)

• Our behavior may be very different in the presence or absence of a particular person.

• When we simply see such a person in a crowd, our available repertoire immediately changes.

• If in addition we catch his eye, we fall under the control of an even more restrictive stimulus--he is not only present, he is watching us.

• When we catch his eye, we also know that he knows that we are looking at him. A much narrower repertoire of behavior is under the control of this specific stimulus: if we behave in a way which he censures, it will be not only in opposition to his wishes, but brazen.

• It may also be important that "we know that he knows that we know that he is looking at us" and so on.

• But there is nothing other than the current environment and the organism's histories regarding similar environments.

A few more examples of Skinner's concern for details

Operant conditioning as a possible self-control technique? Science and Human Behavior, pp. 237-238.

The remarkable properties of language that are related to its indirect reinforcement. Verbal Behavior, pp. 204-206.

Conditioned perceptual responses. Science and Human Behavior, pp. 266-275.

Discrete and continuous repertoires. Science and Human Behavior, pp. 116-119.

Thanks for your attention.

Review

Spontaneous recovery was analyzed in terms of stimuli related to handling.

Giving a child a single piece of candy as a criticism of the principle of satiation, analyzed in terms of multiple control.

Catching someone's eye as an example of a social stimulus of surprising power, analyzed in terms of reinforcement contingencies.

All exemplify Skinner's interpretation of behavior in terms of the details of environment-behavior relations.

References

Catania, A. C. (1998). Learning (4th ed.). New Jersey: Prentice Hall.

Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts.

Kendall, S. F. (1965). Spontaneous recovery after extinction with periodic time-outs. Psychonomic Science, 2, 117-118.

Skinner, B. F. (1950). Are theories of learning necessary? Psychological Review, 57, 193-216. (Page references are to Cumulative record, Definitive Edition, 1999.)

Skinner, B. F. (1953). Science and human behavior. New York: Macmillan.

Skinner, B. F. (1957). Verbal behavior. New York: Appleton-Century-Crofts.

Skinner, B. F. (1974). About behaviorism. New York: Knopf.

Thomas, D. R., & Sherman, L. (1986). An assessment of the role of handling cues in "spontaneous recovery" after extinction. Journal of the Experimental Analysis of Behavior, 46, 305-314.

Welker, R. L., & McAuley, K. (1978). Reduction in resistance to extinction and spontaneous recovery as a function of changes in transportational and contextual stimuli. Animal Learning and Behavior, 6, 451-457.


If you would like copies of the PowerPoint slides, go to the TxABA web site www.unt.edu/behv/txaba, then to the link called Conference Handouts, and download.

email address: <[email protected]>

Or if it is easier, email me and I will send the PowerPoint presentation as an attachment to a reply to your email. I will also attempt to answer any questions you might have regarding today's presentation.