CH 4: SELECTION IN THE EXPERIENCED LEARNER
1. In general, how do the issues dealt with in this chapter differ from those of earlier chapters? What is the potential importance of these issues?
2. Explain the following: "As the complexity of behavior increases, experimental analysis must be supplemented by scientific interpretation."
3. Describe an extinction procedure. What is the effect of this procedure?
4. What is intermittent reinforcement, and what is its effect on the acquisition and retention of environment-behavior relations? (Refer to Figure 4.1.)
Effects of Intermittent Reinforcement on Extinction
5. In general, how does intermittent reinforcement increase resistance to extinction (persistence) of behavior?
6. Identify some possible stimuli that are common between acquisition with intermittent reinforcement and extinction.
7. Under what circumstances is the following statement likely to be correct? "When the going gets tough, the tough get going."
8. Be able to "read" cumulative records such as those shown in Figure 4.2. What does this record tell you about the effects of intermittent reinforcement on extinction in an operant procedure? Can cumulative records ever decrease? Explain.
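To make the logic of a cumulative record concrete, it can be sketched numerically: each response adds one to a running total, so the curve can flatten when responding stops but can never decrease. A minimal sketch in Python, using made-up response times rather than the data of Figure 4.2:

```python
# Hypothetical response times (seconds); responding slows, as in extinction.
response_times = [1, 2, 3, 4, 10, 25, 60]

def cumulative_record(times, duration):
    """Return the cumulative response count at each second of the session."""
    counts = []
    total = 0
    it = iter(sorted(times))
    nxt = next(it, None)
    for t in range(duration + 1):
        # Count every response that has occurred by time t.
        while nxt is not None and nxt <= t:
            total += 1
            nxt = next(it, None)
        counts.append(total)
    return counts

record = cumulative_record(response_times, 60)
# Flat segments mean no responding, but the record never declines.
assert all(later >= earlier for earlier, later in zip(record, record[1:]))
```

The monotonicity assertion captures the answer to the last part of the question: a cumulative record can only stay level or rise.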
Reacquisition of Extinguished Relations
9. What are the experimental findings regarding reacquisition after extinction? How does the rate of reacquisition compare with the rate of acquisition?
10. Interpret the above findings using information from both the behavioral level and simulations of neural levels of analysis (see Figure 4.3).
Durability of Selected Relations
11. What is the experimental evidence concerning how long the effects of selection are retained?
12. If the effects of selection can be retained indefinitely, then why do we seem to forget? (Information in later chapters will enable you to answer this question in more detail.)
13. Why is it important for the selection of complex behavior that reinforcing stimuli not be restricted to naturally selected reinforcing stimuli?
14. What are the technical terms for the stimuli that function as elicitors as the result of behavioral selection (i.e., selection by reinforcement)?
Acquired Elicitors as Reinforcers
15. Be able to describe the procedures and results that indicate that acquired reinforcers can select environment-behavior relations in both the classical and operant procedures.
Acquired Reinforcers and the Reinforcement Principle
16. What methodological difficulty is encountered when attempts are made to study acquired reinforcers over prolonged periods of time?
17. How is this difficulty overcome (at least in part), and what are the behavioral findings regarding the conditions required for selection by acquired reinforcers? How do these conditions compare with those that must be met for reinforcement by naturally selected reinforcers? (Hint: S-reinforcer contiguity, R-reinforcer contiguity, and discrepancy)
18. Regarding the physiological findings: (a) Why can the VTA-frontal lobe system not account for acquired reinforcement by itself? (b) How is acquired reinforcement accounted for on the neural level? (See Figure 4.5.)
19. Describe two pieces of experimental evidence that are consistent with the account of acquired reinforcement on the neural level. Be able to explain how these findings support the account.
20. Comment on the following statement: Acquired reinforcers free the learner from the effects of the environment since the organism can now reinforce itself.
Effects of Acquired Reinforcers on Temporal Contiguity
21. After reading this section, be able to explain what is meant by environmental chaining and internal reinforcement, and the importance of each.
Serial Compound Conditioning
22. Describe the serial compound-conditioning procedure, major findings with the procedure, the interpretation of these findings, and their significance.
Effects of Acquired Reinforcers on Selection of Complex Behavior
23. Describe a study which indicates that human behavior can be selected by acquired reinforcers.
24. Carefully read the discussion about paychecks as reinforcers for human behavior. What is the major point of this example?
25. Comment on the following statement: Other species require immediate reinforcement for their behavior, but our behavior can be affected by more remote consequences.
26. Be able to describe the procedure and findings from research on self-control. Relate this research to the previous question.
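One common quantitative treatment in the self-control literature (not necessarily the one used in this chapter) is hyperbolic discounting, in which a reinforcer's value falls with delay as V = A / (1 + kD). A minimal sketch, with k and the amounts chosen only for illustration:

```python
def discounted_value(amount, delay, k=0.1):
    """Hyperbolic discounting: V = A / (1 + k*D).
    k and the amounts used below are illustrative assumptions,
    not values from the text."""
    return amount / (1 + k * delay)

# Smaller-sooner vs. larger-later: the small immediate reward wins...
small_now = discounted_value(2, 0)       # 2.0
large_later = discounted_value(10, 50)   # about 1.67
print(small_now > large_later)           # -> True

# ...but adding an equal delay to both alternatives reverses the
# preference, the kind of reversal seen in self-control research.
small_delayed = discounted_value(2, 50)        # about 0.33
large_more_delayed = discounted_value(10, 100) # about 0.91
print(small_delayed > large_more_delayed)      # -> False
```

The reversal illustrates why "immediate" versus "remote" consequences (question 25) is a matter of degree rather than a species difference in kind.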
27. What is behavioral chaining, and how does it differ from environmental chaining studied with the serial compound-conditioning procedure? What reinforces a response in a behavioral chain? Why does behavioral chaining often produce behavior that occurs in a particular sequential order?
Complex Effects of Reinforcement History
Behavioral Discrepancies in Experienced Learners
28. Review the effects of behavioral discrepancies on the selection and extinction of environment-behavior relations.
29. What is over-expectation? What is the procedure for producing it, and how can it be interpreted?
30. Comment on the statement that different learners are differently affected by the "same" reinforcer in the light of the over-expectation effect. What are the implications of this, and other findings, for the application of biobehavioral principles to human betterment? Can applied behavior analysis ever become an "exact science"? Is this situation different from that in other applied sciences such as engineering or medicine? Explain.
Discriminative Function of Reinforcer-Elicited Responses
31. Review why reinforcer-elicited responses are acquired before operant responses when behavioral selection occurs in an operant procedure.
32. Indicate why stimuli produced by conditioned responses are in a position to guide the operant. What is the role of these stimuli in relation to the guidance of behavior by environmental discriminative stimuli? Explain using the phrase compound discriminative stimulus in your answer.
33. Answer the following after reading the section on devaluation: (a) In general, describe the processes that permit the pairing of a reinforcer with poison to reduce the strength of a response when its discriminative stimulus is presented. (b) Indicate how the experiment described in the reading (see Figure 4.11) meets these conditions.
34. In trying to eliminate unreasonable fears of stimuli, i.e., phobias, is it enough to extinguish the operants that these stimuli evoke? Explain.
Complex Contingencies of Reinforcement
Schedule of Reinforcement
35. Technically, what is meant by the phrase, schedule of reinforcement?
36. After reading the section describing some of the "standard" schedules of reinforcement, select one of the schedules and give an interpretation of the processes that cause the schedule to produce the pattern of behavior shown in Figure 4.12. (Hint: Under what conditions is the rate of responding high, and under what conditions is it low? Do not use mentalistic notions such as the animal "anticipates" the food.)
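The contingency defining a standard schedule can be stated as a short rule. A minimal sketch of one of them, a fixed-interval (FI) schedule, in which the first response after a fixed time since the last reinforcer is reinforced (the class name and parameter values are illustrative, not from the text):

```python
class FixedInterval:
    """Fixed-interval schedule: the first response occurring at least
    `interval` seconds after the last reinforcer is reinforced."""
    def __init__(self, interval):
        self.interval = interval
        self.last_reinforcer = 0.0

    def respond(self, t):
        """Return True if a response at time t is reinforced."""
        if t - self.last_reinforcer >= self.interval:
            self.last_reinforcer = t
            return True
        return False

fi30 = FixedInterval(30)
# Responses early in the interval go unreinforced; only the first
# response after 30 s is followed by the reinforcer.
print([fi30.respond(t) for t in (5, 20, 31, 40, 61)])
# -> [False, False, True, False, True]
```

Note that the rule itself says nothing about the organism; the characteristic FI "scallop" must be interpreted from the discriminative conditions correlated with high and low reinforcement probability, not from anticipation.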
37. What is a concurrent schedule? Give an example of such a schedule encountered in daily living.
38. What is the matching principle? Indicate how this principle is illustrated by the findings shown in Figure 4.13.
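Because the matching principle is quantitative, it can be stated in a single line: the relative rate of responding on an alternative matches the relative rate of reinforcement obtained there. A minimal sketch, with reinforcement rates that are made-up numbers rather than those of Figure 4.13:

```python
def matching(r1, r2):
    """Matching principle: B1 / (B1 + B2) = r1 / (r1 + r2).
    Given reinforcement rates r1 and r2 on two concurrent
    alternatives, return the predicted proportion of
    responses allocated to alternative 1."""
    return r1 / (r1 + r2)

# Hypothetical concurrent schedules: 40 vs. 20 reinforcers per hour.
prop = matching(40, 20)
print(round(prop, 3))  # -> 0.667
```

Stating the principle this way also makes its molar character (question 39) explicit: it relates aggregate response allocation to aggregate reinforcement rates, not any single response to any single reinforcer.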
39. What is a molar principle, and in what way does the matching principle illustrate such a principle?
Molar and Molecular Accounts of Selection
40. Indicate how the principle of natural selection can be sensitive to molar variables but the principle of behavior selection (selection by reinforcement) cannot—at least directly.
41. Explain what it means to say that the cumulative effects of selection by reinforcement could be described by a molar principle but the effect of a single selection could not. Include in your answer comments indicating the relation between the principles of natural and behavioral selection.
42. How do the results from the experiment in which there was no history of selection for switching behavior prior to the "choice" test relate to the molar-molecular issue? In nontechnical terms, did the animal's "knowledge" that the different disk colors were associated with different reinforcement frequencies produce matching?
Contingencies with Aversive Stimuli
43. What is the difference between an appetitive and an aversive elicitor? Give examples to illustrate your answer.
44. Under what conditions will an eliciting stimulus function as a punisher? Your answer should make clear the definitions of reinforcers and punishers.
45. Under what conditions can an aversive elicitor function as a reinforcer?
46. What are the implications of this study for the view that reinforcers are stimuli that the learner "likes" and punishers are stimuli that the learner "dislikes"? Most generally, what are the implications of these studies for the view that reinforcement theory is simply a modern form of the older philosophical position of hedonism?
47. Describe the defining characteristics of a shaping procedure. Is shaping a procedure that is only important for training animals? Explain.
48. Describe the procedure and findings from the experimental study of shaping. Must the shaped environment-behavior relation be within the repertoire of the learner from the outset of training? Explain.
49. Does shaping always require a "shaper"? Explain.
50. Why is it said that reinforcers, whether used in a shaping procedure or not, select a class of responses that are guided by a class of stimuli? Use the term response topography in your answer.