
Title: Chance 7e TB Ch07 - Chapter 7 Test Bank
Course: Introduction to Learning and Behavioral Analysis
Institution: Queens College CUNY
Pages: 8




Chapter 7: Schedules of Reinforcement

Multiple Choice

1. A given reinforcement schedule tends to produce a distinctive pattern and rate of performance. These are called schedule _______.
a. patterns  b. profiles  c. effects  d. matrixes
Ans: C  Ref: 194

2. John spent his summer picking cantaloupes for a farmer. The farmer paid John a certain amount for every basket of cantaloupes picked. John worked on a _________.
a. fixed ratio schedule  b. variable ratio schedule  c. fixed interval schedule  d. variable interval schedule
Ans: A  Ref: 195

3. The schedule to use if you want to produce the most rapid learning of new behavior is _______.
a. CRF  b. FR 2  c. FI 3"  d. VI 3"
Ans: A  Ref: 195

4. The schedule that is not an intermittent schedule is _________.
a. FR 1  b. FR 5  c. VR 1  d. VI 1"
Ans: A  Ref: 196

5. A reduction in response rate following reinforcement is called a _________.
a. post-reinforcement pause  b. scallop  c. latency  d. rest stop
Ans: A  Ref: 196

6. CRF is synonymous with _________.
a. EXT  b. FR 1  c. CRT  d. FI 1
Ans: B  Ref: 196

TEST BANK CHAPTER 7


7. The rate at which a response occurs, once the subject begins performing it, is called the _________.
a. clock rate  b. walk rate  c. run rate  d. performance rate
Ans: C  Ref: 197

8. Post-reinforcement pauses are now often referred to as _________.
a. rest periods  b. pre-ratio pauses  c. anticipatory pauses  d. break points
Ans: B  Ref: 197

9. Bill spends his summer in the city panhandling. Every day he takes a position on a busy corner and accosts passersby saying, "Can you spare some change?" Most people ignore him, but every now and then someone gives him money. Bill's reinforcement schedule is best described as a _________.
a. fixed ratio schedule  b. variable ratio schedule  c. fixed interval schedule  d. variable interval schedule
Ans: B  Ref: 198

10. Your text reports the case of a man who apparently made hundreds of harassing phone calls. The man's behavior was most likely on a(n) _________.
a. FR schedule  b. VR schedule  c. FI schedule  d. VI schedule
Ans: B  Ref: 200

11. The schedule that is likely to produce a cumulative record with scallops is the _________.
a. FR schedule  b. VR schedule  c. FI schedule  d. VI schedule
Ans: C  Ref: 200

12. A pigeon is confronted with two disks, one green, the other red. The bird receives food on a VI 20" schedule when it pecks the green disk, and on a VI 10" schedule when it pecks the red one. You predict that the bird will peck _________.
a. one disk about as often as the other  b. the green disk almost exclusively  c. the green disk about twice as often as the red disk  d. the red disk about twice as often as the green disk
Ans: D  Ref: 203


13. Often the initial effect of an extinction procedure is an increase in the behavior called a(n) extinction ________.
a. rebound  b. resurgence  c. burst  d. flyer
Ans: C  Ref: 204

14. Resurgence may help account for _______.
a. PMS  b. rationalization  c. regression  d. reaction formation
Ans: C  Ref: 205

15. The reappearance of previously effective behavior during extinction is called ____________.
a. spontaneous recovery  b. recovery  c. resurgence  d. fulfillment
Ans: C  Ref: 205

16. Williams found that the greater the number of reinforcements before extinction, the _______.
a. greater the number of responses during extinction  b. faster the rate of extinction  c. stronger the response during extinction  d. greater the frustration during extinction
Ans: A  Ref: 207

17. In a _____ schedule, reinforcement is contingent on the continuous performance of a behavior for some period of time.
a. fixed duration  b. continuous reinforcement  c. fixed time  d. DRH
Ans: A  Ref: 208

18. When reinforcement is contingent on continuous performance of an activity, a _________ reinforcement schedule is in force.
a. duration  b. interval  c. time  d. ratio
Ans: A  Ref: 208

19. In schedules research, VD stands for ________.
a. video displayed  b. verbal dependent  c. variable dependency  d. variable duration
Ans: D  Ref: 208


20. Of the following, the schedule that most closely resembles noncontingent reinforcement is _________.
a. FD  b. FT  c. FI  d. DRL
Ans: B  Ref: 209

21. A schedule that does not require the performance of a particular behavior is the _________.
a. FT schedule  b. FD schedule  c. FI schedule  d. FR schedule
Ans: A  Ref: 209

22. Harry spent his summer in the city panhandling. Every day he would sit on the sidewalk, put a cardboard sign in front of him that said, "Please help," and place his hat on the sidewalk upside down. Then he would wait. Every now and then someone would put money into his hat. Harry's reinforcement schedule is best described as a _________.
a. fixed ratio schedule  b. variable ratio schedule  c. fixed interval schedule  d. variable time schedule
Ans: D  Ref: 209

23. FT and VT are both kinds of ______ reinforcement.
a. noncontingent  b. intermittent  c. duration-based  d. continuous
Ans: A  Ref: 209
Note: It might be argued that FT and VT are intermittent schedules, but since the reinforcers are not contingent on a behavior, that term seems inappropriate.

24. ___________ schedules differ from other schedules in that the rules describing the contingencies change systematically.
a. Adaptive  b. Evolutionary  c. Progressive  d. Idiosyncratic
Ans: C  Ref: 209

25. __________ refers to the point at which a behavior stops or its rate falls off sharply.
a. Block  b. Border time  c. Break point  d. Camel's back
Ans: C  Ref: 210


26. George trains a pigeon to peck a disk by reinforcing each disk peck. Once the response is learned, George begins to cut back on the reinforcers. At first he reinforces every other response, then every third response, every fifth response, every tenth response, and so on. George is using a procedure called _________.
a. ratio tuning  b. reinforcement decimation  c. intermittent reinforcement  d. stretching the ratio
Ans: D  Ref: 211

27. Things are going pretty well for George (see item 26) until he jumps from reinforcing every tenth response to reinforcing every 50th response. At this point, the pigeon responds erratically and nearly stops responding entirely. George's pigeon is suffering from _________.
a. ratio strain  b. ratiocination  c. satiation  d. reinforcer deprivation
Ans: A  Ref: 211

28. Gradually reducing the frequency of reinforcement is called _________.
a. extinction  b. stretching the ratio  c. intermittent reinforcement  d. progressional reinforcement
Ans: B  Ref: 211

29. Shirley trains a rat to press a lever and then reinforces lever presses on an FR 10 schedule when a red light is on, and an FI 10" schedule when a green light is on. In this case, lever pressing is on a _________.
a. multiple schedule  b. chain schedule  c. concurrent schedule  d. redundant schedule
Ans: A  Ref: 212

30. A schedule in which reinforcement is contingent on the behavior of more than one subject is a _________.
a. multiple schedule  b. mixed schedule  c. tandem schedule  d. cooperative schedule
Ans: D  Ref: 213

31. A chain schedule is most like a _________ schedule.
a. multiple  b. mixed  c. cooperative  d. tandem
Ans: D  Ref: 213
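Instructors who want to demonstrate the thinning procedure in items 26-28 can sketch it as a simple schedule generator. This is an illustrative sketch only; the function name and parameters are hypothetical, though the step sequence mirrors item 26:

```python
# Sketch of "stretching the ratio" (items 26-28): gradually raising the
# FR requirement. Jumping straight from FR 10 to FR 50 (item 27) would
# risk ratio strain; the gradual steps below are the point of the procedure.

def stretched_ratio_schedule(steps, trials_per_step):
    """Yield the number of responses required for each successive reinforcer."""
    for requirement in steps:
        for _ in range(trials_per_step):
            yield requirement

# Requirement per reinforcer as the ratio is stretched from FR 1 to FR 10:
schedule = list(stretched_ratio_schedule([1, 2, 3, 5, 10], trials_per_step=2))
print(schedule)  # [1, 1, 2, 2, 3, 3, 5, 5, 10, 10]
```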


32. Stanley wants to determine which of two reinforcement schedules is more attractive to rats. He trains a rat to press a lever for food, and then puts the rat into an experimental chamber containing two levers. Pressing one lever produces reinforcement on an FR 10 schedule; pressing the other lever produces reinforcement on an FI 10" schedule. Lever pressing is on a _________.
a. multiple schedule  b. chain schedule  c. concurrent schedule  d. redundant schedule
Ans: C  Ref: 214

33. Studies of choice involve _________.
a. multiple schedules  b. chain schedules  c. concurrent schedules  d. redundant schedules
Ans: C  Ref: 214

34. The explanation of the PRE that puts greatest emphasis on internal cues is the ________ hypothesis.
a. discrimination  b. frustration  c. sequential  d. response unit
Ans: B  Ref: 217

35. One explanation for the PRE implies that the effect is really an illusion. This is the _________.
a. discrimination hypothesis  b. frustration hypothesis  c. sequential hypothesis  d. response unit hypothesis
Ans: D  Ref: 218

36. _____________ led the way in the study of choice.
a. Richard Herrnstein  b. Clark Hull  c. B. F. Skinner  d. E. L. Thorndike
Ans: A  Ref: 221

37. In one form of the matching law, BA stands for the behavior under consideration and B0 represents _______.
a. reinforcement for BA  b. the baseline rate of BA  c. all behaviors other than BA  d. all behavior that is over expectation
Ans: C  Ref: 222
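For reference, one common statement of the form of the matching law assumed in item 37 is the following; the notation here follows Herrnstein's formulation and may differ slightly from the text:

```latex
\[
\frac{B_A}{B_A + B_0} = \frac{r_A}{r_A + r_0}
\]
% B_A : rate of the behavior under consideration
% B_0 : rate of all other behavior
% r_A, r_0 : reinforcement rates for B_A and for all other behavior
```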


38. The study of reinforcement schedules suggests that the behavior we call stick-to-itiveness is largely the product of _________.
a. genetics  b. prenatal influences  c. choice  d. reinforcement history
Ans: D  Ref: 225

39. A classic work on reinforcement schedules is by _________.
a. Darwin  b. Herrnstein  c. Ferster and Skinner  d. Abercrombie and Fitch
Ans: C  Ref: 226

True/False

40. Although important, the matching law is restricted to a narrow range of species, responses, reinforcers, and reinforcement schedules. F (224)
41. In VI schedules, the reinforcer occurs periodically regardless of what the organism does. F (203)
42. One everyday example of a VR schedule is the lottery. T (198, inferred)
43. In a multiple schedule, the organism is forced to choose between two or more reinforcement schedules. F (212)
44. When a response is placed on extinction, there is often an increase in emotional behavior. T (205)
45. When food is the reinforcer, it is possible to stretch the ratio to the point at which an animal expends more energy than it receives. T (211)
46. One difference between FT and FI schedules is that in FT schedules, reinforcement is not contingent on a behavior. T (209)
47. The thinner of two schedules, VR 5 and VR 10, is VR 10. T (198)
48. Harlan Lane and Paul Shinkman put a college student's behavior on extinction following VI reinforcement. The student performed the behavior 8,000 times without reinforcement. T (215)
49. The response unit hypothesis suggests that there really is no such thing as the partial reinforcement effect. T (219)
50. One effect of the extinction procedure is an increase in the variability of behavior. T (205)
51. The more effort a behavior requires, the fewer times the behavior will be performed during extinction. T (207)
52. Extinction often increases the frequency of emotional behavior. T (205)
53. Extinction often increases the variability of behavior. T (205)

Completion

54. The rule describing the delivery of reinforcement is called a ________ of reinforcement. Ans: schedule (194)
55. CRF stands for ________. Ans: continuous reinforcement (195)
56. The explanation of the PRE that puts greatest emphasis on internal cues is the ________ hypothesis. Ans: frustration (217)
57. In CRF, the ratio of reinforcers to responses is 1 to 1; in FR 1, the ratio is _______. Ans: 1 to 1 (196)
58. Choice involves ________ schedules. Ans: concurrent (214)
59. When behavior is on an FR schedule, animals often discontinue working briefly following reinforcement. These periods are called ________. Ans: post-reinforcement pauses/pre-ratio pauses/between-ratio pauses (196)
60. The term ________ refers to the pattern and rate of performance produced by a particular reinforcement schedule. Ans: schedule effects (194)


61. If you increase the requirements for reinforcement too quickly, you are likely to see evidence of ratio _______. Ans: strain (211)
62. According to the ________ hypothesis, the PRE occurs because it is difficult to distinguish between intermittent reinforcement and extinction. Ans: discrimination (216)
63. The immediate effect of extinction is often an abrupt increase in the rate of the behavior being extinguished. This is called an extinction ______. Ans: burst (204)

Short Essay

64. Explain why fatigue is not a good explanation for post-reinforcement pauses. (197)
Answers should note that more demanding (fatiguing) schedules do not necessarily produce longer pauses than less demanding schedules. Students might also argue that the fatigue explanation is circular.

65. A teacher has a student who gives up at the first sign of difficulty. How can the teacher increase the child's persistence? (210)
This is essentially the same question as review question 4. Answers should make use of stretching the ratio.

66. In a tandem schedule, behavior is performed during a series of schedules, but food reinforcement comes only at the end of the last schedule. What reinforces behavior during the earlier schedules when food is not provided? (213)
It could be argued that the food reinforcement reinforces all performances. However, students should mention that each change from one schedule to the next brings the subject closer to food reinforcement and may therefore be reinforcing.

67. A rat's lever pressing is on a concurrent VI 5" VI 15" schedule. Describe the rat's behavior. (221)
Students should indicate that for every lever press on the VI 15" schedule, there will be about three responses on the VI 5" schedule.

68. How might you use what you know about reinforcement schedules to study the effects of air temperature on behavior? (inferred, but see 225; figure 7-12)
Answers should indicate an understanding of the value of schedule-induced steady rates to study the effects of independent variables, such as air temperature.

69. Why is the study of reinforcement schedules important? (225)
This is review question 16, rephrased. Students might discuss the use of schedules in defining and studying personality characteristics such as laziness, the effects of drugs and other variables on behavior, and their use in studying extinction effects and other basic phenomena.
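The 3-to-1 allocation expected in item 67 follows from the matching law. A minimal sketch of that arithmetic (the function name and the simplification that a VI schedule's reinforcement rate is the reciprocal of its mean interval are my own, not from the text):

```python
# Matching-law prediction for a concurrent VI 5" VI 15" schedule (item 67).
# Simplifying assumption: the programmed reinforcement rate on each VI
# schedule is the reciprocal of its mean interval.

def matching_allocation(mean_intervals_sec):
    """Predicted proportion of responding devoted to each VI schedule."""
    rates = [60.0 / t for t in mean_intervals_sec]  # reinforcers per minute
    total = sum(rates)
    return [r / total for r in rates]

props = matching_allocation([5, 15])  # VI 5" and VI 15"
print(props)              # [0.75, 0.25]
print(props[0] / props[1])  # 3.0 -- about three presses on the VI 5" lever
                            # for every press on the VI 15" lever
```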


