
Professional Applications of Learning Theory in Real-Life Situations

PSYCH/635 Version 2


Chapter 3 Behaviorism

It’s the end of the school day at Park Lane Elementary, and three teachers leave the building together: Leo Battaglia, Shayna Brown, and Emily Matsui. Their conversation as they walk to the parking lot is as follows:

Leo: Boy, they were wild today. I don’t know what got into them. Hardly anyone earned any points today.
Emily: What points, Leo?
Leo: I give points for good behavior, which they then can exchange for privileges, such as extra free time. I take away points when they misbehave.
Emily: And it works?
Leo: Sure does. Keeps them in line most days. But not today. Maybe there was something in the water.
Shayna: Or in their heads, most likely. What do you suppose they were thinking about? Maybe spring break next week?
Leo: Perhaps. But it’s not my job to see into their heads. Lots of things can trigger wild behavior. How am I supposed to know what does? That’s why I focus on the behavior.
Shayna: But sometimes we need to go beyond the behavior. For example, Sean’s been acting out lately. If I had just focused on his behavior, I would not have learned that his parents are getting divorced and he’s blaming himself for it.
Leo: Isn’t that why we have a counselor? Isn’t that her job?
Shayna: Yes, it is, but we have a role, too. I think you focus too much on what you see and not enough on what you don’t see.
Leo: Perhaps, but at least I keep them under control with my system of rewards and punishments. I don’t waste a lot of time on classroom management issues.
Emily: Or on personal issues, like their thoughts and emotions.

Against the background of structuralism and functionalism ( Chapter 1 ), behaviorism began its rise to become the leading psychological discipline of the first half of the 20th century. John B. Watson (1878–1958), generally considered to be the founder and champion of behaviorism (Heidbreder,  1933 ; Hunt,  1993 ), believed that theories and research methods that dealt with the mind were unscientific. If psychology were to become a science, it had to structure itself along the lines of the physical sciences, which examined observable and measurable phenomena. Behavior was the proper material for psychologists to study (Watson,  1924 ). Introspection ( Chapter 1 ) was unreliable; conscious experiences were not observable and people having such experiences could not be trusted to report them accurately (Murray, Kilgour, & Wasylkiw,  2000 ).

Watson ( 1916 ) thought that Pavlov’s conditioning model (discussed later in this chapter) was appropriate for building a science of human behavior. He was impressed with Pavlov’s precise measurement of observable behaviors. Watson believed that Pavlov’s model could account for diverse forms of learning and personality characteristics. For example, newborns are capable of displaying three emotions: love, fear, and rage (Watson,  1926a ). Through Pavlovian conditioning, these emotions could become attached to stimuli to produce a complex adult life. Watson expressed his belief in the power of conditioning in this famous pronouncement:

·  Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any type of specialist I might select—a doctor, lawyer, artist, merchant-chief and, yes, even into beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations and race of his ancestors. (Watson,  1926b , p. 10)

Although Watson’s research has little relevance for academic learning, he spoke and wrote with conviction, and his adamant views influenced psychology from around 1920 until the early 1960s (Hunt,  1993 ). His emphasis on the importance of the environment is readily seen in the ensuing work of Skinner (discussed later in this chapter; Horowitz,  1992 ).

This chapter covers behaviorism as expressed in conditioning theories of learning. The hallmark of conditioning theories is not that they deal with behavior (all theories do that), but rather that they explain learning in terms of environmental events. While not denying the existence of mental phenomena, these theories contend that such phenomena are not necessary to explain learning. In the opening scenario, Leo espouses a conditioning theory position.

The best-known conditioning theory is B. F. Skinner’s  operant conditioning . Before discussing this theory, historical work on conditioning is presented to set the backdrop for Skinner’s theory: Thorndike’s connectionism, Pavlov’s classical conditioning, and Guthrie’s contiguous conditioning.

· When you finish studying this chapter, you should be able to do the following:

· ■ Explain how behaviors are learned according to connectionism theory.

· ■ Discuss some of Thorndike’s contributions to educational practice.

· ■ Explain how responses become conditioned, extinguished, and generalized, according to classical conditioning theory.

· ■ Describe a process whereby an emotional response might become conditioned to an initially neutral object.

· ■ Explain, using contiguous conditioning principles, how movements are combined to become an act.

· ■ Describe Skinner’s three-term contingency model of operant conditioning, and provide examples.

· ■ Define and exemplify key operant conditioning concepts: positive and negative reinforcement, punishment, generalization, discrimination, shaping, and the Premack Principle.

· ■ Explain how operant principles are reflected in educational applications: behavioral objectives, learning time, mastery learning, programmed instruction, and contingency contracts.


Edward L. Thorndike (1874–1949) was a prominent U.S. psychologist whose  connectionism  theory of learning was dominant in the United States for a long time (Mayer,  2003 ). Unlike many early psychologists, he was interested in education and especially learning, transfer, individual differences, and intelligence (Hilgard,  1996 ; McKeachie,  1990 ). He applied an experimental approach when measuring students’ achievement outcomes. His impact on education is reflected in the Thorndike Award, the highest honor given by the Division of Educational Psychology of the American Psychological Association for distinguished contributions to educational psychology.

Trial-and-Error Learning

Thorndike’s major work is the three-volume series Educational Psychology (Thorndike, 1913a, 1913b, 1914). He postulated that the most fundamental type of learning involves the forming of associations ( connections ) between sensory experiences (perceptions of stimuli or events) and neural impulses (responses) that manifest themselves behaviorally. He believed that learning often occurs by  trial and error  (selecting and connecting).

Thorndike began studying learning with a series of experiments on animals (Thorndike,  1911 ). Animals in problem situations try to attain a goal (e.g., obtain food, reach a destination). From among the many responses they can perform, they select one, perform it, and experience the consequences. The more often they make a response to a stimulus, the more firmly that response becomes connected to that stimulus. For example, a cat in a cage can open an escape hatch by pushing a stick. After a series of random responses, the cat eventually escapes by pushing the stick. Over trials, the cat reaches the goal (escape) more quickly and makes fewer errors prior to responding correctly. A typical plot of results is shown in  Figure 3.1 .

Trial-and-error learning occurs gradually ( incrementally ). Connections are formed through repetition; conscious awareness is not necessary. Animals do not “catch on” or “have insight.” Thorndike understood that human learning is more complex because people engage in learning that involves connecting ideas, analyzing, and reasoning (Thorndike,  1913b ). Nonetheless, the similarity in research results from animal and human studies led Thorndike to explain complex learning with elementary learning principles. An educated adult possesses millions of  stimulus–response  connections.
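The incremental curve in Figure 3.1 can be reproduced with a small simulation. The sketch below is illustrative rather than a model Thorndike proposed; the function name, the number of available responses, and the reward increment are all assumed values.

```python
import random

def run_trials(n_trials=30, n_responses=10, reward=1.0, seed=0):
    """Sketch of trial-and-error learning (selecting and connecting).

    One response (index 0) opens the escape hatch. On each trial the
    "cat" keeps selecting responses in proportion to connection strength
    until it hits the correct one; per the Law of Effect, only the
    rewarded response's connection is strengthened. Returns the number
    of errors made on each trial.
    """
    rng = random.Random(seed)
    strengths = [1.0] * n_responses        # all connections start equal
    errors_per_trial = []
    for _ in range(n_trials):
        errors = 0
        while True:
            choice = rng.choices(range(n_responses), weights=strengths)[0]
            if choice == 0:                # correct response: escape
                strengths[0] += reward     # satisfier strengthens the bond
                break
            errors += 1                    # wrong response; keep trying
        errors_per_trial.append(errors)
    return errors_per_trial

errors = run_trials()
early = sum(errors[:10]) / 10              # mean errors, first 10 trials
late = sum(errors[-10:]) / 10              # mean errors, last 10 trials
```

Under these assumptions, the mean error count over the last ten trials falls well below that of the first ten, producing the gradually improving curve of Figure 3.1 without any "insight" step.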

Principles of Learning

Laws of Exercise and Effect.

Thorndike’s basic ideas about learning are embodied in the Laws of Exercise and Effect. The  Law of Exercise  has two parts: the  Law of Use —when a response is made to a stimulus, the connection between them is strengthened; the  Law of Disuse —when a response is not made to a stimulus, the connection’s strength is weakened (forgotten). The longer the time interval before a response is made, the greater is the decline in the connection’s strength.


Figure 3.1 Incremental performance over trials exemplifying Thorndike’s trial-and-error learning.

The  Law of Effect  emphasizes the consequences of behavior: Responses resulting in satisfying (rewarding) consequences are learned; responses producing annoying (punishing) consequences are not learned (Thorndike,  1913b ). This is a functional account of learning because satisfiers (responses that produce desirable outcomes) allow individuals to adapt to their environments.

The following study illustrates application of the Law of Effect (Thorndike,  1927 ). Participants were shown 50 strips of paper, ranging in length from 3 to 27 cm, one at a time. Next to each strip was a second strip that participants knew was 10 cm long. They initially estimated the length of each strip without feedback. Following this pretest, the 50 strips were presented again, one at a time. After each estimate, they were told “right” or “wrong” by the experimenter. After the 50 strips were presented repeatedly over several days, they again were presented without feedback about accuracy of length judgments. Following training, participants’ length estimates more closely approximated the actual lengths of the strips than had their prior estimates. Thorndike concluded that these results, which were similar to those from experiments in which animals were rewarded with food or freedom, support the idea that satisfying (correct) stimulus–response connections are strengthened and annoying (incorrect) ones are weakened.

Other Principles.

Thorndike’s ( 1913b ) theory included other principles relevant to education. The  Law of Readiness  states that when one is prepared (ready) to act, to do so is rewarding and not to do so is punishing. If one is hungry, responses that lead to food are in a state of readiness, whereas other responses not leading to food are not in a state of readiness. If one is fatigued, it is punishing to be forced to exercise. Applying this idea to learning, we might say that when students are ready to learn a particular action (in terms of developmental level or prior skill acquisition), then behaviors that foster this learning will be rewarding. When students are not ready to learn or do not possess prerequisite skills, then attempting to learn is punishing and a waste of time.

The principle of  associative shifting  refers to a situation in which responses made to a particular stimulus eventually are made to an entirely different stimulus if, on repeated trials, there are small changes in the nature of the stimulus. For example, to teach students to divide a two-digit number into a four-digit number, we first teach them to divide a one-digit number into a one-digit number and then gradually add more digits to the divisor and dividend.

The principle of  identical elements  affects  transfer  ( generalization ), or the extent that strengthening or weakening of one connection produces a similar change in another connection (Hilgard,  1996 ; Thorndike,  1913b ; see  Chapter 7 ). Transfer occurs when situations have identical (highly similar) elements and call for similar responses. Thorndike and Woodworth ( 1901 ) found that practicing a skill in a specific context did not improve one’s ability to execute that skill generally. Thus, training on estimating the area of rectangles does not advance learners’ ability to estimate the areas of triangles, circles, and irregular figures. Skills should be taught with different types of educational content for students to understand how to apply them ( Application 3.1 ).


APPLICATION 3.1 Facilitating Transfer

Thorndike suggested that drilling students on a specific skill does not help them master it nor does it teach them how to apply the skill in different contexts.

When teachers instruct students how to use map scales, they also must teach them to calculate miles from inches. Students become more proficient if they actually apply the skill on various maps and create maps of their own surroundings than if they are just given problems to solve.

Elementary teachers work with students on liquid and dry measurement. Having the students use a recipe to measure ingredients and create a food item is more meaningful than using pictures, charts, or filling cups with water or sand.

In teacher education courses, having students observe and become involved in actual classrooms is more meaningful than reading about and watching videos on teaching and learning.

Thorndike revised the Laws of Exercise and Effect after other research evidence did not support them (Thorndike,  1932 ). Thorndike discarded the Law of Exercise when he found that simple repetition of a situation does not necessarily “stamp in” responses. In one experiment, for example, participants closed their eyes and drew lines they thought were 2, 4, 6, and 8 inches long, hundreds of times over several days, without feedback on accuracy of the lengths (Thorndike,  1932 ). If the Law of Exercise were correct, then the response performed most often during the first 100 or so drawings ought to become more frequent afterward; but Thorndike found no support for this idea. Rather, mean lengths changed over time; people apparently experimented with different lengths because they were unsure of the correct length. In the absence of feedback, people are unlikely to keep performing the same behavior.

With respect to the Law of Effect, Thorndike originally thought that the effects of satisfiers (rewards) and annoyers (punishments) were opposite but comparable, but research showed this was not the case. Rather, rewards strengthened connections, but punishment did not necessarily weaken them (Thorndike,  1932 ). Instead, connections are weakened when alternative connections are strengthened. In one study (Thorndike,  1932 ), participants were presented with uncommon English words (e.g., edacious, eidolon). Each word was followed by five common English words, one of which was a correct synonym. On each trial, participants chose a synonym and underlined it, after which the experimenter said “right” (reward) or “wrong” (punishment). Reward improved learning, but punishment did not diminish the probability of that response occurring to that stimulus word.

Punishment suppresses responses, but they are not forgotten. Punishment is not an effective means of altering behavior because it does not teach students correct behaviors but rather informs them of what not to do. This also is true with cognitive skills. Brown and Burton ( 1978 ) found that students learn  buggy algorithms  (incorrect rules) for solving problems (e.g., subtract the smaller number from the larger, column by column, 4371 − 2748 = 2437). When students are informed that this method is incorrect and are given corrective feedback and practice in solving problems correctly, they learn the correct method but do not forget the old way.
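The smaller-from-larger bug lends itself to a direct sketch in code. The helper name and digit-by-digit approach below are ours, not Brown and Burton’s notation:

```python
def buggy_subtract(a, b):
    """Brown and Burton's 'smaller-from-larger' buggy algorithm:
    in each column, subtract the smaller digit from the larger,
    never borrowing."""
    da, db = str(a), str(b)
    width = max(len(da), len(db))
    da, db = da.zfill(width), db.zfill(width)    # pad with leading zeros
    digits = [str(abs(int(x) - int(y))) for x, y in zip(da, db)]
    return int("".join(digits))

wrong = buggy_subtract(4371, 2748)   # the buggy rule yields 2437
right = 4371 - 2748                  # correct subtraction yields 1623
```

The rule reproduces the textbook example exactly, which helps show why such bugs persist: the procedure is fully systematic, just wrong, so it cannot be corrected by repetition alone.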

Thorndike and Education

As a professor of education at Teachers College, Columbia University, Thorndike wrote books that addressed topics such as educational goals, learning processes, teaching methods, curricular sequences, and techniques for assessing educational outcomes (Hilgard, 1996; Mayer, 2003; Thorndike, 1906, 1912; Thorndike & Gates, 1929). Some of Thorndike’s many contributions to education are the following.

Principles of Teaching.

Teachers should help students form good habits. As Thorndike ( 1912 ) noted:

· ■ Form habits. Do not expect them to create themselves.

· ■ Beware of forming a habit which must be broken later.

· ■ Do not form two or more habits when one will do as well.

· ■ Other things being equal, have a habit formed in the way in which it is to be used. (pp. 173–174)

The last principle cautions against teaching content that is removed from its applications: “Since the forms of adjectives in German or Latin are always to be used with nouns, they should be learned with nouns” (p. 174). Students need to understand how to apply knowledge and skills they acquire. Uses should be learned in conjunction with the content.

Sequence of Curricula.

· A skill should be introduced (Thorndike & Gates,  1929 ):

· ■ At the time or just before the time when it can be used in some serviceable way

· ■ At the time when the learner is conscious of the need for it as a means of satisfying some useful purpose

· ■ When it is most suited in difficulty to the ability of the learner

· ■ When it will harmonize most fully with the level and type of emotions, tastes, instinctive and volitional dispositions most active at the time

· ■ When it is most fully facilitated by immediately preceding learnings and when it will most fully facilitate learnings which are to follow shortly (pp. 209–210)

These principles conflict with typical content placement in schools, where content is segregated by subject (e.g., social studies, mathematics, science). But Thorndike and Gates ( 1929 ) urged that knowledge and skills be taught across different subjects ( Application 3.2 ). For example, forms of government are appropriate topics not only in civics and history, but also in English (how governments are reflected in literature) and foreign language (government structure in other countries).

Mental Discipline.

APPLICATION 3.2 Sequence of Curricula

Thorndike’s views on the sequence of curricula suggest that learning should be integrated across subjects. Mrs. Woleska prepared a unit on pumpkins for her second-grade class in the fall. The students studied the significance of pumpkins to the American colonists (history), where pumpkins currently are grown (geography), and the varieties of pumpkins grown (agriculture). They measured and charted the various sizes of pumpkins (mathematics), carved the pumpkins (art), planted pumpkin seeds and studied their growth (science), and read and wrote stories about pumpkins (language arts). This approach provides a meaningful experience for children and “real life” learning of various skills.

In developing a history unit on the Civil War, Ms. Parks went beyond covering factual material and incorporated comparisons from other wars, attitudes and feelings of the populace during that time period, biographies and personalities of individuals involved in the war, and the impact the war had on the United States and implications for the future. In addition, she worked with other teachers in the middle school to expand the unit by examining the terrain of major battlefields (geography), weather conditions during major battles (science), and the emergence of literature (language arts) and creative works (art, music, drama) during that time period.

Mental discipline  is the view that learning certain subjects (e.g., the classics, mathematics) enhances general mental functioning better than learning other subjects. Mental discipline was a popular view during Thorndike’s time. He tested this idea with 8,500 students in grades 9 to 11 (Thorndike,  1924 ). Students were given intelligence tests a year apart, and their programs of study that year were compared to determine whether certain courses were associated with greater intellectual gains. The results provided no support for mental discipline. Students who had greater ability to begin with made the best progress regardless of what they studied.

·  If our inquiry had been carried out by a psychologist from Mars, who knew nothing of theories of mental discipline, and simply tried to answer the question, “What are the amounts of influence of sex, race, age, amounts of ability, and studies taken, upon the gain made during the year in power to think, or intellect, or whatever our stock intelligence tests measure,” he might even dismiss “studies taken” with the comment, “The differences are so small and the unreliabilities are relatively so large that this factor seems unimportant.” The one causal factor which he would be sure was at work would be the intellect already existent. Those who have the most to begin with gain the most during the year. (Thorndike,  1924 , p. 95)

So rather than assuming that some subject areas improve students’ mental abilities better than others, we should assess how different subject areas affect students’ ability to think, as well as other outcomes (e.g., interests, goals). Thorndike’s influential research led educators to redesign curricula away from the mental discipline idea.


We have seen that events in the United States in the early 20th century helped establish psychology as a science and learning as a legitimate field of study. At the same time, there were important developments in other countries. One of the most significant was the work of Ivan Pavlov (1849–1936), a Russian physiologist who won the Nobel Prize in 1904 for his work on digestion.

Pavlov’s legacy to learning theory was his work on  classical conditioning  (Cuny,  1965 ; Hunt,  1993 ; Pavlov, 1927, 1928; Windholz,  1997 ). While Pavlov was the director of the physiological laboratory at the Institute of Experimental Medicine in Petrograd, he noticed that dogs often would salivate at the sight of the attendant bringing them food or even at the sound of the attendant’s footsteps. Pavlov deduced that the attendant was not a natural stimulus for the reflex of salivating; rather, the attendant acquired this power by being associated with food.

Basic Processes

Classical conditioning is a multistep procedure that initially involves presenting an  unconditioned stimulus  ( UCS ), which elicits an  unconditioned response  ( UCR ). Pavlov presented a hungry dog with meat powder (UCS), which would cause the dog to salivate (UCR). To condition the animal requires repeatedly presenting an initially neutral stimulus immediately before presenting the UCS. Pavlov often used a ticking metronome as the neutral stimulus. In the early trials, the ticking of the metronome produced no salivation. Eventually, the dog salivated in response to the ticking metronome prior to the presentation of the meat powder. The metronome had become a  conditioned stimulus  ( CS ) that elicited a  conditioned response  ( CR ) similar to the original UCR ( Table 3.1 ). Repeated nonreinforced presentations of the CS (i.e., without the UCS) cause the CR to diminish in intensity and disappear, a phenomenon known as  extinction (Larrauri & Schmajuk,  2008 ; Pavlov,  1932b ).

Table 3.1 Classical conditioning procedure.

Phase | Stimulus                               | Response
1     | UCS (food powder)                      | UCR (salivation)
2     | CS (metronome), then UCS (food powder) | UCR (salivation)
3     | CS (metronome)                         | CR (salivation)

Spontaneous recovery  may occur after a time lapse in which the CS is not presented and the CR presumably extinguishes. If the CS then is presented and the CR returns, we say that the CR spontaneously recovered from extinction. Pairings of the CS with the UCS can restore the CR to full strength. The fact that CS–CR pairings can be reinstated without great difficulty suggests that extinction does not involve unlearning of the associations (Redish, Jensen, Johnson, & Kurth-Nelson,  2007 ).
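The acquisition and extinction pattern in Table 3.1 can be sketched with a simple linear-operator learning rule. This rule is a common textbook abstraction, not Pavlov’s own formulation, and the learning rates and asymptote are assumed values.

```python
def pair(v, rate=0.3, asymptote=1.0):
    """One CS-UCS pairing (Table 3.1, phase 2): conditioned strength
    rises a fraction of the remaining distance to the asymptote."""
    return v + rate * (asymptote - v)

def nonreinforced(v, rate=0.3):
    """One CS-alone presentation: the CR decays toward zero (extinction)."""
    return v - rate * v

v = 0.0
for _ in range(10):       # acquisition: metronome followed by meat powder
    v = pair(v)
acquired = v              # strong CR to the metronome alone
for _ in range(10):       # extinction: metronome without meat powder
    v = nonreinforced(v)
extinct = v               # CR has largely disappeared
```

After ten pairings the modeled CR strength approaches the asymptote; ten nonreinforced presentations then drive it back toward zero, mirroring the diminishing CR described above.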

Generalization  means that the CR occurs to stimuli similar to the CS ( Figure 3.2 ). Once a dog is conditioned to salivate in response to a metronome ticking at 70 beats per minute, it also may salivate in response to a metronome ticking faster or slower, as well as to ticking clocks or timers. The more dissimilar the new stimulus is to the CS or the fewer elements that they share, the less generalization occurs (Harris,  2006 ).


Figure 3.2 Generalization curve showing decreased magnitude of conditioned response as a function of increased dissimilarity with the conditioned stimulus.

Discrimination  is the complementary process that occurs when the dog learns to respond to the CS but not to other, similar stimuli. To train discrimination, an experimenter might pair the CS with the UCS and also present other, similar stimuli without the UCS. If the CS is a metronome ticking at 70 beats per minute, it is presented with the UCS, whereas other cadences (e.g., 50 and 90 beats per minute) are presented but not paired with the UCS.

Once a stimulus becomes conditioned, it can function as a UCS and  higher-order conditioning  can occur (Pavlov,  1927 ). If a dog has been conditioned to salivate at the sound of a metronome ticking at 70 beats per minute, the ticking metronome can function as a UCS for higher-order conditioning. A new neutral stimulus (such as a buzzer) can be sounded for a few seconds, followed by the ticking metronome. If, after a few trials, the dog begins to salivate at the sound of the buzzer, the buzzer has become a second-order CS. Conditioning of the third order involves the second-order CS serving as the UCS and a new neutral stimulus being paired with it. Pavlov ( 1927 ) reported that conditioning beyond the third order is difficult.

Higher-order conditioning is a complex process that is not well understood (Rescorla,  1972 ). The concept is theoretically interesting and might help to explain why some social phenomena (e.g., test failure) can cause conditioned emotional reactions, such as stress and anxiety. Early in life, failure may be a neutral event, but it usually becomes associated with disapproval from parents and teachers, which may be a UCS eliciting anxiety. Through conditioning, failure itself can come to elicit anxiety. Cues associated with the situation also can become conditioned stimuli. Students may feel anxious when they walk into a room where they will take a test or when a teacher passes out a test.

CSs capable of producing CRs are called  primary signals . People have a  second signal system —language—that greatly expands the potential for conditioning (Windholz,  1997 ). Words or thoughts are labels denoting events or objects and can become CSs. Thinking about a test or listening to a teacher discuss a forthcoming test may cause anxiety. Thus, it is not tests themselves that make students anxious but rather their linguistic representations or meanings.

Informational Variables

Pavlov believed that conditioning is an automatic process that occurs with repeated CS–UCS pairings and that nonpairings extinguish the CR. In humans, however, conditioning can occur rapidly, sometimes after only a single CS–UCS pairing. Repeated nonpairings of the CS and UCS may not extinguish the CR. Extinction seems highly context dependent (Bouton, Nelson, & Rosas,  1999 ). Responses stay extinguished in the same context, but when the setting is changed, CRs may recur. Further, conditioning cannot occur between just any stimulus and response. Within any species, responses can be conditioned to some stimuli but not to others. Conditioning depends on the compatibility of the stimulus and response with species-specific reactions (Hollis,  1997 ). These findings call into question Pavlov’s description of conditioning.

Research subsequent to Pavlov has shown that conditioning depends less on the CS–UCS pairing and more on the extent that the CS conveys information about the likelihood of the UCS occurring (Rescorla,  1972  1976 ). As an illustration, assume that one stimulus always is followed by a UCS and another stimulus sometimes is followed by it. The first stimulus should result in conditioning because it reliably predicts the onset of the UCS. It may not even be necessary to pair the CS and UCS; conditioning can occur by simply telling people that they are related (Brewer,  1974 ). Likewise, repeated CS–UCS nonpairings may not be necessary for extinction; telling people the contingency is no longer in effect can reduce or extinguish the CR.

An explanation for these results is that people form expectations concerning the probability of the UCS occurring (Rescorla,  1987 ). For a stimulus to become a CS, it must convey information to the individual about the time, place, quantity, and quality of the UCS. Even when a stimulus is predictive, it may not become conditioned if another stimulus is a better predictor. Rather than conditioning being automatic, it appears to be mediated by cognitive processes. If people do not realize there is a CS–UCS link, conditioning does not occur. When no CS–UCS link exists, conditioning can occur if people believe it does. Although this contingency view of conditioning may not be entirely accurate (Papini & Bitterman,  1990 ), it provides a different explanation for conditioning than Pavlov’s and highlights its complexity.
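The expectation account can be sketched with a prediction-error update in the spirit of the Rescorla–Wagner model: associative strength changes only to the extent that the UCS is surprising. The function name, parameter values, and comparison below are illustrative.

```python
def associative_strength(trials, alpha=0.3, lam=1.0):
    """Prediction-error sketch: on each trial, strength v moves a
    fraction alpha of the error between the outcome (lam if the UCS
    occurs, 0 if not) and the current prediction v.
    trials: sequence of booleans, True = the UCS follows the CS."""
    v = 0.0
    history = []
    for ucs_present in trials:
        target = lam if ucs_present else 0.0
        v += alpha * (target - v)        # update driven by surprise
        history.append(v)
    return history

reliable = associative_strength([True] * 20)         # CS always predicts UCS
partial = associative_strength([True, False] * 20)   # CS predicts UCS half the time
```

Both conditions contain twenty CS–UCS pairings, yet the reliably predictive CS ends with far greater associative strength than the partially predictive one: pairing alone is not what matters, predictive information is.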

Conditioned Emotional Reactions

Pavlov ( 1932a  1934 ) applied classical conditioning principles to abnormal behavior (e.g., neuroses). His views were speculative and unsubstantiated, but classical conditioning principles have been applied by others to condition emotional reactions.

Watson claimed to demonstrate the power of emotional conditioning in the well-known Little Albert experiment (Watson & Rayner,  1920 ). Albert was an infant who showed no fear of a white rat when he was tested between the ages of 8 and 11 months. The conditioning involved a hammer being struck against a steel bar behind Albert as he reached out for the rat. “The infant jumped violently and fell forward, burying his face in the mattress” (p. 4). This sequence was immediately repeated. One week later when the rat was presented, Albert began to reach out but then withdrew his hand. The previous week’s conditioning was apparent. Tests over the next few days showed that Albert reacted emotionally to the rat’s presence. There also was generalization of fear to a rabbit, dog, and fur coat. When Albert was retested a month later with the rat, he showed a mild emotional reaction.

This study is widely cited as showing how conditioning can produce emotional reactions, but there are questions about the study’s validity. Recent evidence suggests that Albert was neurologically impaired (Bartlett,  2012 ). With this deficit, his reactions to the white rat would not be typical of a healthy child. Albert died at the age of 6 years of hydrocephalus (Beck, Levinson, & Irons,  2009 ), a condition he apparently had from birth. He never learned to walk or talk and had vision problems. Drawing conclusions from this study and generalizing its results seem problematic.

Further, the influence of conditioning usually is not that powerful (Harris,  1979 ). Classical conditioning is a complex phenomenon; one cannot condition any response to any stimulus. Species have evolved mechanisms predisposing them to being conditioned in some ways and not in others (Hollis,  1997 ). Among humans, conditioning occurs when people are aware of the relation between the CS and the UCS, and information that the UCS may not follow the CS can produce extinction. Attempts to replicate Watson and Rayner’s findings were not uniformly successful (Valentine,  1930a ).

A more reliable means of altering conditioned emotional reactions is  systematic desensitization , which is often used with individuals who have debilitating fears (Wolpe,  1958 ; see  Application 3.3 ). Desensitization comprises three phases. In the first phase, a therapist and client jointly develop an anxiety hierarchy of several situations graded from least-to-most anxiety-producing for the client. For a test-anxious student, low-anxiety situations might be hearing a test announcement in class and gathering together materials to study. Situations of moderate anxiety might be studying the night before the test and walking into class on the day of the test. High-anxiety situations could include receiving the test and not knowing the answer to a question.

In the second phase, the client learns to relax by imagining pleasant scenes (e.g., lying on a beach) and cuing relaxation (saying “relax”). In the third phase, the client, while relaxed, imagines the lowest (least-anxious) scene on the hierarchy. This may be repeated several times, after which the client imagines the next scene. Treatment proceeds up the hierarchy until the client can imagine the most anxiety-producing scene without feeling anxious. If the client reports anxiety while imagining a scene, the client drops back down the hierarchy to a scene that does not produce anxiety. Treatment may require several sessions.
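The hierarchy-climbing procedure above can be sketched as a simple loop. This is a minimal illustrative model only: the scene list and the feels_anxious check are assumptions for demonstration, not a clinical protocol, and (as the chapter notes) actual desensitization requires a trained professional.

```python
# Toy walk up an anxiety hierarchy: advance while the client remains relaxed,
# drop back one scene whenever anxiety is reported while imagining a scene.

def desensitize(hierarchy, feels_anxious):
    """Walk up a hierarchy ordered least-to-most anxiety-producing.

    feels_anxious(scene) -> True if imagining the scene while relaxed
    still produces anxiety. Returns the scenes in the order mastered.
    """
    mastered = []
    i = 0
    while i < len(hierarchy):
        scene = hierarchy[i]
        if feels_anxious(scene):
            i = max(i - 1, 0)  # drop back to a scene that produces no anxiety
        else:
            if scene not in mastered:
                mastered.append(scene)
            i += 1
    return mastered
```

With the test-anxiety hierarchy described above, a client who reports anxiety at "receiving the test" drops back one scene, relaxes again, and then retries the more difficult scene.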

APPLICATION 3.3 Emotional Conditioning

Principles of classical conditioning seem relevant to some emotions. Children entering kindergarten or first grade may be fearful. At the beginning of the school year, primary teachers might develop procedures to help desensitize the fears. Visitation sessions allow students to meet their teachers and other students and see their classrooms. On the first few days of school, teachers might plan fun but relatively calm activities involving students getting to know their teachers, classmates, rooms, and school buildings. Students could tour the buildings, return to their rooms, and draw pictures. They might talk about what they saw. Students can be taken to offices to meet the principal, assistant principal, nurse, and counselor. They also could play name games in which they introduce themselves and then try to recall names of classmates.

These activities represent an informal desensitization procedure. For some children, cues associated with the school serve as stimuli eliciting anxiety. The fun activities elicit pleasurable feelings, which are incompatible with anxiety. Pairing fun activities with cues associated with school may cause the latter to become less anxiety producing.

Education students may be anxious about teaching an entire class. Anxieties should be lessened when students spend time in classrooms and gradually assume more responsibility for instruction. Pairing classroom and teaching experiences with formal study can desensitize fears related to being responsible for children’s learning.

Some drama students have stage fright. Drama teachers may work with students to lessen these anxieties by practicing more on the actual stage and by opening up rehearsals to allow others to watch. Practice in performing in front of others should help diminish their fears.

Desensitization involves counterconditioning. The relaxing scenes that one imagines (UCS) produce relaxation (UCR). Anxiety-producing cues (CS) are paired with the relaxing scenes. Relaxation is incompatible with anxiety. By initially pairing a weak anxiety cue with relaxation and by slowly working up the hierarchy, all of the anxiety-producing cues eventually should elicit relaxation (CR).

Desensitization is an effective procedure that can be accomplished in a therapist’s or counselor’s office. It does not require the client to perform the activities on the hierarchy. A disadvantage is that the client must be able to imagine scenes. People differ in their ability to form mental images. Desensitization also requires the skill of a professional therapist or counselor and should not be attempted by anyone unskilled in its application.


Acts and Movements

Edwin R. Guthrie (1886–1959) postulated behavioral learning principles based on associations (Guthrie,  1940 ). These principles reflect the idea of contiguity of stimuli and responses:

·  A combination of stimuli which has accomplished a movement will on its recurrence tend to be followed by that movement. (Guthrie,  1952 , p. 23)

Movements  are discrete behaviors, whereas  acts  are large-scale classes of movements that produce an outcome. Playing the piano and using a computer are acts that include many movements. A particular act may be accompanied by a variety of movements; the act may not specify the movements precisely. In basketball, shooting a basket (an act) can be accomplished with a variety of movements.

Contiguity learning implies that a behavior in a situation will be repeated when that situation recurs (Guthrie,  1959 ); however, contiguity learning is selective. At any given moment, a person is confronted with many stimuli, and associations cannot be made to all of them. Rather, only a few stimuli are selected, and associations are formed between them and responses. The contiguity principle also applies to memory. Verbal cues are associated with stimulus conditions or events at the time of learning (Guthrie,  1952 ).  Forgetting  involves new learning and is due to interference in which an alternative response is made to an old stimulus.

Guthrie’s theory contends that learning occurs through pairing of stimulus and response. Guthrie ( 1942 ) also discussed the strength of the pairing, or  associative strength :

·  A stimulus pattern gains its full associative strength on the occasion of its first pairing with a response. (p. 30)

This  all-or-none  principle of learning rejects the notion of frequency, as embodied in Thorndike’s original Law of Exercise (Guthrie,  1930 ). Although Guthrie did not suggest that people learn complex behaviors (e.g., solving equations, writing papers) by performing them once, he believed that initially one or more movements become associated. Repetition of a situation adds movements, combines movements into acts, and establishes the act under different environmental conditions.
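The all-or-none principle can be made concrete with a toy model. The code below is my own construction, not Guthrie's formalism: an association reaches full strength on its first pairing, and pairing the same stimulus pattern with a new response displaces the old one (interference as "forgetting," per the contiguity account above).

```python
# Toy model of all-or-none contiguity learning.

class ContiguityLearner:
    def __init__(self):
        self.associations = {}  # stimulus pattern -> response, at full strength

    def pair(self, stimulus_pattern, response):
        # First pairing establishes the association at full strength;
        # a later pairing with a different response displaces it
        # (new learning as forgetting).
        self.associations[stimulus_pattern] = response

    def respond(self, stimulus_pattern):
        return self.associations.get(stimulus_pattern)
```

Note that repetition in this model does not strengthen an existing pairing; consistent with the text, its role would be to add associations under new stimulus patterns and conditions.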

Practice links the various movements involved in the acts of solving equations and writing papers. The acts themselves may have many variations (types of equations and papers) and ideally should transfer—students should be able to solve equations and write papers in different contexts. Guthrie accepted Thorndike’s notion of identical elements. Behaviors should be practiced in the exact situations in which they will be called for (e.g., in class).

Guthrie believed that responses do not need to be rewarded to be learned. Rather, learning requires close pairing in time between stimulus and response ( contiguity ). Guthrie ( 1952 ) disputed Thorndike’s Law of Effect because satisfiers and annoyers are effects of actions; therefore, they cannot influence learning of previous connections but only subsequent ones. Rewards might help to prevent  unlearning  (forgetting) because they prevent new responses from being associated with stimulus cues.

Contiguity is a central feature of school learning. Flashcards help students learn arithmetic facts. Students learn to associate a stimulus (e.g., 4 × 4) with a response (16). Foreign-language words are associated with their English equivalents, and chemical symbols are associated with their element names.

Habit Formation and Change

Guthrie’s ideas are relevant to habit formation and change.  Habits  are learned dispositions to repeat past responses (Wood & Neal,  2007 ). Because habits are behaviors established to many cues, teachers who want students to behave well in school should link school rules with many cues. “Treat others with respect,” needs to be linked with the classroom, computer lab, halls, cafeteria, gymnasium, auditorium, and playground. By applying this rule in each of these settings, students’ respectful behaviors toward others become habitual. If students believe they have to practice respect only in the classroom, respecting others will not become a habit.

The key to changing behavior is to “find the cues that initiate the action and to practice another response to these cues” (Guthrie,  1952 , p. 115). Guthrie identified three methods for altering habits: threshold, fatigue, and incompatible response ( Table 3.2  and  Application 3.4 ).

Table 3.2 Guthrie’s methods for breaking habits.

Method Explanation Example
Threshold Introduce weak stimulus. Increase stimulus, but keep it below threshold value that will produce unwanted response. Introduce academic content in short blocks of time for children. Gradually increase session length, but not to a point where students become frustrated or bored.
Fatigue Force child to make unwanted response repeatedly in presence of stimulus. Give child who makes paper airplanes a stack of paper and have child make each sheet into a plane.
Incompatible response In presence of stimulus, have child make response incompatible with unwanted response. Pair cues associated with media center with reading rather than talking.

APPLICATION 3.4 Breaking Habits

Guthrie’s contiguity principle offers practical suggestions for how to break habits. One application of the threshold method involves the time young children spend on academic activities. Many young children have short attention spans, which limit how long they can sustain work on one activity. Most class activities are scheduled to last no longer than 30–40 minutes. However, at the start of the school year, attention spans quickly wane for many children. To apply Guthrie’s theory, a teacher might, at the start of the year, limit activities to 15–20 minutes. Over the next few weeks the teacher could gradually increase the time students spend working on a single activity.

The threshold method also can be applied to teaching printing. When children first learn to form letters, their movements are awkward and they lack fine motor coordination. The distances between lines on a page are purposely wide so children can fit the letters into the space. If paper with narrower lines were initially introduced, students’ letters would spill over the borders and students might become frustrated. Once students can form letters within the wider lines, they can use paper with narrower lines to help them refine their skills.

Teachers need to be judicious when using the fatigue method because of potential negative consequences. Jason likes to make paper airplanes and sail them across the room. His teacher might remove him from the classroom, give him a large stack of paper, and tell him to start making paper airplanes. After Jason has made several airplanes, the activity should lose its attraction and paper will no longer be a cue for him to make airplanes.

Some students like to race around the gym when they first enter their physical education class. To employ the fatigue method, the physical education teacher might just let these students keep running after the class has begun. Soon they will tire and quit running.

The incompatible response method can be used with students who talk and misbehave in the media center. Reading is incompatible with talking. The media center teacher might ask the students to find interesting books and read them while in the center. Assuming that the students find the books enjoyable, the media center will, over time, become a cue for selecting and reading books rather than for talking with other students.

A social studies teacher has some students who regularly do not pay attention in class. The teacher realized that using many slides while lecturing was boring. Soon the teacher began to incorporate other elements into each lesson, such as experiments, video clips, and debates, in an attempt to involve students and raise their interest in the course.

In the  threshold  method, the cue (stimulus) for the habit to be changed (the undesired response) is introduced at such a weak level that it does not elicit the response; it is below the threshold level of the response. Gradually the stimulus is introduced at greater intensity until it is presented at full strength. Were the stimulus introduced at its greatest intensity, the response would be the behavior that is to be changed (the habit). For example, some children react to the taste of spinach by refusing to eat it. To alter this habit, parents might introduce spinach in small bites or mixed with a food that the child enjoys. Over time, the amount of spinach the child eats can be increased.

In the  fatigue  method, the cue for engaging in the behavior is transformed into a cue for avoiding it. Here the stimulus is introduced at full strength and the individual performs the undesired response until he or she becomes exhausted. The stimulus becomes a cue for not performing the response. To alter a child’s behavior of repeatedly throwing toys, parents might make the child throw toys until it is no longer fun (some limits are needed!).

In the  incompatible response  method, the cue for the undesired behavior is paired with a response incompatible with the undesired response; that is, the two responses cannot be performed simultaneously. The response to be paired with the cue must be more attractive to the individual than the undesired response. The stimulus becomes a cue for performing the alternate response. To stop snacking while watching TV, people should keep their hands busy (e.g., sew, paint, work puzzles). Over time, watching TV becomes a cue for engaging in an activity other than snacking. Systematic desensitization (described earlier) also makes use of incompatible responses.
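The threshold method lends itself to a small numerical sketch. The parameters and the "adaptation" rule below are my own illustrative assumptions: the stimulus starts weak, must stay below the intensity that would elicit the unwanted response, and each tolerated exposure raises that threshold.

```python
# Toy model of Guthrie's threshold method for breaking a habit.

def threshold_method(start, full_strength, step, threshold, adaptation):
    """Return the schedule of stimulus intensities presented.

    start: initial (weak) stimulus intensity, below threshold.
    full_strength: target intensity to reach.
    step: increase in intensity per exposure.
    threshold: intensity at which the unwanted response would occur.
    adaptation: how much each tolerated exposure raises the threshold.
    """
    schedule = []
    intensity = start
    while intensity < full_strength:
        if intensity >= threshold:
            raise ValueError("stimulus crossed threshold; unwanted response elicited")
        schedule.append(intensity)
        threshold += adaptation  # a tolerated exposure raises the threshold
        intensity += step
    if full_strength >= threshold:
        raise ValueError("stimulus crossed threshold; unwanted response elicited")
    schedule.append(full_strength)
    return schedule
```

In the spinach example, `start` corresponds to a small bite mixed with a liked food, and the error case corresponds to serving a full portion before tolerance has built up.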

Punishment is ineffective in altering habits (Guthrie,  1952 ). Punishment following a response cannot affect the stimulus–response association. Punishment given while a behavior is being performed may disrupt or suppress the habit but not change it. Punishment does not establish an alternate response to the stimulus. It is better to alter negative habits by replacing them with desirable ones (i.e., incompatible responses).

Guthrie’s theory does not include cognitive processes. Although it is not a viable learning theory today, its emphasis on contiguity remains timely because current theories also stress contiguity. Cognitive theories predict that learning requires understanding the relationship between a stimulus (situation, event) and the appropriate response. Guthrie’s ideas about changing habits provide general guidance for developing better habits.


Operant Conditioning

A well-known behavior theory is  operant conditioning , formulated by B. F. (Burrhus Frederic) Skinner (1904–1990). Beginning in the 1930s, Skinner published a series of papers on laboratory studies with animals in which he identified the components of operant conditioning. He summarized this early work in his influential book, The Behavior of Organisms (Skinner,  1938 ).

Skinner applied his ideas to human functioning. Early in his career, he became interested in education and developed teaching machines and programmed instruction. The Technology of Teaching (Skinner,  1968 ) addresses instruction, motivation, discipline, and creativity. In 1948 he published Walden Two, which describes how behavioral principles can be applied to create a utopian society. Skinner ( 1971 ) addressed the problems of modern life and advocated applying a behavioral technology to the design of cultures in Beyond Freedom and Dignity. Skinner and others have applied operant conditioning principles to school learning and discipline, child development, language acquisition, social behaviors, mental illness, medical problems, substance abuse, and vocational training (DeGrandpre,  2000 ; Karoly & Harris,  1986 ; Morris,  2003 ).

As a young man, Skinner aspired to be a writer (Skinner,  1970 ):

·  I built a small study in the attic and set to work. The results were disastrous. I frittered away my time. I read aimlessly, built model ships, played the piano, listened to the newly-invented radio, contributed to the humorous column of a local paper but wrote almost nothing else, and thought about seeing a psychiatrist. (p. 6)

He became interested in psychology after reading Pavlov’s ( 1927 ) Conditioned Reflexes and Watson’s ( 1924 )  Behaviorism . His subsequent career had a profound impact on the psychology of learning.

Despite his admission that “I had failed as a writer because I had had nothing important to say” (Skinner,  1970 , p. 7), he was a prolific writer who channeled his literary aspirations into scientific writing that spanned six decades (Lattal,  1992 ). His dedication to his profession is evident in his giving an invited address at the American Psychological Association convention eight days before he died (Holland,  1992 ; Skinner,  1990 ). The association honored him with a special issue of its monthly journal, American Psychologist (American Psychological Association,  1992 ). Although his theory has been discredited by current learning theorists because it cannot adequately explain higher-order and complex forms of learning (Bargh & Ferguson,  2000 ), his influence continues as operant conditioning principles are commonly applied to enhance student learning and behavior (Morris,  2003 ). In the opening scenario, for example, Leo employs operant conditioning principles to address student misbehavior. Emily and Shayna, on the other hand, argue for the importance of cognitive factors.

Conceptual Framework

This section discusses the assumptions underlying operant conditioning, how it reflects a functional analysis of behavior, and the implications of the theory for the prediction and control of behavior. Operant conditioning theory is complex (Dragoi & Staddon,  1999 ); its principles most relevant to human learning are covered in this chapter.

Scientific Assumptions.

Pavlov viewed behavior as a manifestation of neurological functioning. Skinner ( 1938 ) did not deny this but believed a psychology of behavior can be understood without reference to neurological or other internal events.

He raised similar objections to the unobservable processes and entities proposed by cognitive views of learning (Overskeid,  2007 ).  Private events  are internal responses accessible only to the individual and can be studied through people’s verbal reports, which are forms of behavior (Skinner,  1953 ). Skinner did not deny the existence of attitudes, beliefs, opinions, desires, and other forms of self-knowledge (he, after all, had them), but rather qualified their role.

People do not experience consciousness or emotions but rather their bodies, and internal reactions are responses to internal stimuli (Skinner,  1987 ). A further problem with internal processes is that translating them into language is difficult, because language does not completely capture the dimensions of an internal experience (e.g., pain). Much of what is called “knowing” involves using language ( verbal behavior ). Thoughts are types of behavior that are brought about by other stimuli (environmental or private) and that give rise to responses (overt or covert). When private events are expressed as overt behaviors, their role in a functional analysis can be determined.

Functional Analysis of Behavior.

Skinner ( 1953 ) referred to his theory as a  functional analysis :

·  The external variables of which behavior is a function provide for what may be called a causal or functional analysis. We undertake to predict and control the behavior of the individual organism. This is our “dependent variable”—the effect for which we are to find the cause. Our “independent variables”—the causes of behavior—are the external conditions of which behavior is a function. Relations between the two—the “cause-and-effect relationships” in behavior—are the laws of a science. A synthesis of these laws expressed in quantitative terms yields a comprehensive picture of the organism as a behaving system. (p. 35)

Learning  is “the reassortment of responses in a complex situation”; conditioning refers to “the strengthening of behavior which results from reinforcement” (Skinner,  1953 , p. 65). There are two types of conditioning: Type S and Type R.  Type S  is Pavlovian conditioning, characterized by the pairing of the reinforcing (unconditioned) stimulus with another (conditioned) stimulus. The S calls attention to the importance of the stimulus in eliciting a response from the organism. The response made to the eliciting stimulus is known as  respondent behavior .

Although Type S conditioning may explain conditioned emotional reactions, most human behaviors are emitted in the presence of stimuli rather than automatically elicited by them. Responses are controlled by their consequences, not by antecedent stimuli. This type of behavior, which Skinner termed  Type R  to emphasize the response aspect, is  operant behavior  because it operates on the environment to produce an effect.

·  If the occurrence of an operant is followed by presentation of a reinforcing stimulus, the strength is increased…. If the occurrence of an operant already strengthened through conditioning is not followed by the reinforcing stimulus, the strength is decreased. (Skinner,  1938 , p. 21)

We might think of operant behavior as “learning by doing,” and in fact much learning occurs when we perform behaviors (Lesgold,  2001 ). Unlike respondent behavior, which prior to conditioning does not occur, the probability of occurrence of an operant is never zero because the response must be made for reinforcement to be provided. Reinforcement changes the likelihood or rate of occurrence of the response. Operant behaviors act upon their environments and become more or less likely to occur because of reinforcement.
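The quoted principle can be given a toy numerical reading. Skinner did not quantify strength this way; the step size and clamping below are my own illustrative assumptions. Reinforcement after an operant raises its response strength; nonreinforcement of a strengthened operant lowers it.

```python
# Toy model: response strength rises with reinforcement, falls without it.

def update_strength(strength, reinforced, step=0.1):
    """Nudge an operant's response strength up or down, clamped to [0, 1]."""
    strength += step if reinforced else -step
    return min(max(strength, 0.0), 1.0)

strength = 0.5
for reinforced in (True, True, True, False, True):
    strength = update_strength(strength, reinforced)
# three reinforced trials, one extinction trial, one reinforced trial
```

The clamp reflects the point in the text that an operant's probability of occurrence is never zero: strength varies with its consequences but stays within bounds.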

Basic Processes

This section examines the basic processes in operant conditioning: reinforcement, extinction, primary and secondary reinforcers, the Premack Principle, punishment, schedules of reinforcement, generalization, and discrimination.


Reinforcement.

Reinforcement  is responsible for response strengthening—increasing the rate of responding or making responses more likely to occur. A reinforcer (or  reinforcing stimulus ) is any stimulus or event following a response that leads to response strengthening. Reinforcers are defined based on their effects, which do not depend upon mental processes such as consciousness, intentions, or goals (Schultz,  2006 ). Because reinforcers are defined by their effects, they cannot be determined in advance.

·  The only way to tell whether or not a given event is reinforcing to a given organism under given conditions is to make a direct test. We observe the frequency of a selected response, then make an event contingent upon it and observe any change in frequency. If there is a change, we classify the event as reinforcing to the organism under the existing conditions. (Skinner,  1953 , pp. 72–73)

Reinforcers are situationally specific: They apply to individuals at given times under given conditions. What is reinforcing to Maria during reading now may not be during mathematics now or reading later. Despite this specificity, stimuli or events that reinforce behavior can be predicted (Skinner,  1953 ). Students typically find reinforcing such events as teacher praise, free time, privileges, stickers, and high grades. Nonetheless, one never can know for certain whether a consequence is reinforcing until it is presented after a response and we see whether behavior changes.
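Skinner's "direct test" can be written out in code form. The counts and the averaging are invented for illustration; the classification logic follows the quoted passage: observe baseline response frequency, make the event contingent on the response, and classify the event by whether frequency rose.

```python
# Direct test of whether an event functions as a reinforcer.

def is_reinforcer(baseline_counts, contingent_counts):
    """Classify an event as reinforcing (for this organism, under these
    conditions) only if mean response frequency rose once the event was
    made contingent on the response."""
    baseline = sum(baseline_counts) / len(baseline_counts)
    contingent = sum(contingent_counts) / len(contingent_counts)
    return contingent > baseline

# Volunteering per week, before and after praise was made contingent on it:
praise_reinforcing = is_reinforcer([2, 3, 2], [5, 6, 7])
```

Because reinforcers are situationally specific, the same test would have to be repeated for the same student in a different subject or at a different time.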

The basic operant model of conditioning is the  three-term contingency :

SD → R → SR

A  discriminative stimulus  (SD) sets the occasion for a response (R) to be emitted, which is followed by a  reinforcing stimulus  (SR, or  reinforcement ). The reinforcing stimulus is any stimulus (event, consequence) that increases the probability the response will be emitted in the future when the discriminative stimulus is present. In more familiar terms, we might label this the A-B-C model:

A (Antecedent) → B (Behavior) → C (Consequence)

Positive reinforcement involves presenting a stimulus, or adding something to a situation, following a response, which increases the future likelihood of that response occurring in that situation. A  positive reinforcer  is a stimulus that, when presented following a response, increases the future likelihood of the response occurring in that situation. In the opening scenario, Leo uses points as positive reinforcers for good behavior ( Table 3.3 ).

Negative reinforcement involves removing a stimulus, or taking something away from a situation following a response, which increases the future likelihood that the response will occur in that situation. A  negative reinforcer  is a stimulus that, when removed by a response, increases the future likelihood of the response occurring in that situation. Stimuli that often function as negative reinforcers include bright lights, loud noises, criticism, annoying people, and low grades, because behaviors that remove them tend to be strengthened. Positive and negative reinforcement have the same effect: They increase the likelihood that the response will be made in the future in the presence of the stimulus.

To illustrate these processes ( Table 3.3 ), assume that a teacher is holding a question-and-answer session with the class. The teacher asks a question (SD or A), calls on a student volunteer who gives the correct answer (R or B), and says to the student “That’s good” (SR or C). If volunteering by this student increases, saying “That’s good” is a positive reinforcer and this is an example of positive reinforcement because volunteering increased. Now assume that after a student gives the correct answer the teacher tells the student he or she does not need to do the homework. If volunteering by this student increases, homework is a negative reinforcer and this is an example of negative reinforcement because removing the homework increased volunteering.  Application 3.5  gives other examples of positive and negative reinforcement.
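The three-term contingency can also be encoded as a small data structure, using the teacher–student examples just described. The Contingency record and the classify() rule are my own illustrative encoding, not standard behavior-analytic software.

```python
# The A-B-C (three-term contingency) model as a record, plus a rule that
# distinguishes positive from negative reinforcement by whether the
# consequence presents or removes a stimulus.

from dataclasses import dataclass

@dataclass
class Contingency:
    antecedent: str           # discriminative stimulus (A / SD)
    behavior: str             # emitted response (B / R)
    consequence: str          # what follows the response (C / SR)
    stimulus_presented: bool  # True: stimulus added; False: stimulus removed
    behavior_increased: bool  # observed effect on future responding

def classify(c):
    # Reinforcement is defined by its effect: responding must increase.
    if not c.behavior_increased:
        return "not reinforcement"
    return "positive reinforcement" if c.stimulus_presented else "negative reinforcement"

praise = Contingency("T asks question", "S volunteers",
                     'T says "That\'s good"', True, True)
no_homework = Contingency("T asks question", "S volunteers",
                          "T excuses S from homework", False, True)
```

Note that classify() returns "not reinforcement" whenever responding does not increase, which mirrors the point that a consequence is identified as a reinforcer only after its effect on behavior is observed.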

Table 3.3 Reinforcement and punishment processes.


Process Discriminative Stimulus Response Reinforcing (Punishing) Stimulus
Positive Reinforcement (Present positive reinforcer) T asks question S volunteers* T says to S, “That’s good”
Negative Reinforcement (Remove negative reinforcer) T asks question S volunteers* T excuses S from homework
