Learning 2 - Psychology of Learning
Professor Ming Chen, The City College of New York

Stimulus Generalization (Transfer of Learning)
Occurs in classical conditioning. Stimuli highly similar to the CS will also elicit the CR in varying degrees. Albert learned to fear not only the white rat, but similar objects: rabbit, dog, fur coat, man's beard.

Stimulus Discrimination in Classical Conditioning
Opposite of stimulus generalization. The tendency not to make a conditioned response to a new stimulus that resembles, but is not identical to, the original conditioned stimulus. W. put his head down to see if Albert would play with his hair. Albert was completely negative. Two other observers did the same thing. He began immediately to play with their hair!

Stimulus Discrimination Training
Pavlov observed that when he conditioned the dog to salivate to a high-pitch tone, the dog generalized the CR (salivation) to a low-pitch tone (stimulus generalization). To teach the dog the difference between the two (stimulus discrimination), Pavlov repeatedly presented the high tone with meat powder and the low tone without meat powder. After several presentations, the dog learned to salivate only to the high tone.

Extinction
Extinguishing the conditioned response (CR) to the conditioned stimulus (CS): when a CS (bell, light) is presented over a sufficiently long period of time without the UCS (food), the CS will eventually lose its ability to elicit the conditioned response (salivation). Extinction doesn't always work because:
1. It occurs at an unpredictable speed - if the stimulus pairings (CS & UCS) during conditioning were inconsistent, extinction is apt to be slow.
2. The phenomenon of spontaneous recovery - even when a response has been extinguished, it may reappear. This is especially likely if extinction occurred in only one context.

An Alternative Procedure to Extinction: Counterconditioning
An existing conditioned response is replaced with a new, more productive one. Counterconditioning tends to be more effective than extinction in eliminating undesirable conditioned responses. It also decreases the chance that those responses will recur through "spontaneous recovery." The response made to a conditioned stimulus (e.g., a fear response learned for a CS such as white rats, rabbits, or snakes) is reversed or "countered" by pairing this stimulus with a UCS that promotes the opposite type of reaction.

Pearce & Dickinson, 1975
Rats were trained to press a lever for food reinforcement. Once trained, when a rat pressed the lever, a CS (tone or light) was presented and ended with shock, so the CS was paired with an aversive US (shock). The aversive properties of the shock were then reduced or reversed by pairing the shock with food.

These animals showed less fear [measured by the extent to which the tone/light elicits a freezing response (conditioned suppression - a reduction in the frequency of a conditioned response resulting from the presence of a stimulus previously associated with pain) that suppresses the animal's lever pressing] than control groups (who received shocks & food unpaired).

Mary Cover Jones 1924 & "Little Peter"
Peter was a 2-yr-old who had acquired a fear of rabbits. Jones placed him in a highchair and gave him some candy. As he ate, she brought a rabbit into the far side of the room. The pleasure Peter felt as he ate the candy was a stronger response than his anxiety and eventually overpowered it. She repeated the same procedure every day over a 2-month period, bringing the rabbit slightly closer each time, and Peter's anxiety over rabbits eventually disappeared.

Here is how the counterconditioning worked:
• A new response (happiness) that is incompatible with the existing conditioned response (fear) is chosen.
• A stimulus that already elicits the incompatible (happiness) response must be identified (candy), and it must be one that has a stronger effect than the stimulus (rabbit) eliciting the undesirable response.
• Then this stimulus (candy) is presented to the individual/organism and the CS (rabbit) that elicits the undesired CR (fear) is gradually introduced.

How would you cure this person's snake phobia using counterconditioning?
CS (Snake) -----> CR (Fear)
Counterconditioning: A new response (positive feeling) that is incompatible with the existing conditioned response (fear) is chosen. A stimulus that already elicits the incompatible (positive feeling) response is identified (strawberry ice cream).
Pair: CS (Snake) + UCS (Strawberry Ice Cream)
Eventually the fear of snakes is replaced by a positive feeling:
CS (Snake) ------> CR (Positive Feeling)

Counterconditioning lies behind many of the procedures used in "behavior therapy." It is typically used to eliminate phobias. Watson's plan for Albert: pair a pleasant UCS (positive feeling) with the CS (rat).

Instrumental Conditioning (Thorndike)
Learning is based on the effects/consequences of one's behavior in the environment, not on contiguity. Procedure: placed an animal in a "puzzle box" (e.g., a hungry cat with fish outside the box). Initially, the cat would try many random behaviors in an attempt to get out of the cage. Eventually, the cat would press the paddle by chance, the door would open, and the cat could escape and get the food. The cat would then be placed in the box again and would, on average, still take a long time to escape after exhibiting many different behaviors.
Measure of Learning = the decreasing amount of time it takes the animal to operate the latch and escape.

"Trial and Error" Learning
No evidence of "sudden insight" learning. Cats put back in the box do not immediately press the paddle; again they try different behaviors and eventually come upon pressing it. After many, many trials (e.g., 30), the cats would press the paddle almost as soon as they were placed in the cage. So they were learning by "trial and error."

"Law of Effect"
The consequences of a response determine whether the tendency to perform it is strengthened or weakened. If the response is followed by a satisfying event, it will be strengthened; if followed by an annoying (non-satisfying) event, it will be weakened.
_____________________________________________________________________________________

Skinner - Operant Conditioning
Consistent with and follows upon Thorndike's instrumental conditioning in understanding behavior in terms of its RESULTS or CONSEQUENCES IN THE ENVIRONMENT. So learning is not a matter of S-R associations. However, Skinner had a more behaviorist impulse than Thorndike: stick to the observable and non-mental in explaining behavior. He thought Thorndike was too subjective in the Law of Effect, appealing to concepts such as a "satisfying/non-satisfying" state of affairs.

"Black Box" Approach of Skinner
The learner is a "black box" and nothing is known about what goes on inside the learner. But it is not necessary to know what goes on inside the learner, because behavior is governed by its environmental antecedents & consequences.

3 Aspects of Operant Conditioning: ["S → R → S" or the "A, B, Cs"]
1. (S) Discriminative Stimuli (Antecedents) - antecedents to behavior; setting conditions for its occurrence; a stimulus that provides information about what to do. Any stimulus that is consistently present when a response has been reinforced.
   a. Sets the occasion.
   b. Probability of a response is maximized.
   Examples that exercise control over behavior (as a result of prior reinforcements in their presence):
   - Red and green traffic lights
   - Stop signs
   - Verbal commands: "pass the salt"
2. (R) Operant Responses (Behavior) - behaviors freely emitted by the organism (contrasted with the "respondent" behavior studied by Pavlov, which is behavior elicited involuntarily in reaction to a stimulus).
3. (S) Stimulus Consequence (Result) - a change in the environment brought about by the emitted response (reinforcement or punishment).

Skinner Methodology
Placed an animal (e.g., a rat) in a Skinner box. In the box, behavior is punished or rewarded after the animal makes a response.

Strengthening Operant Behaviors - Increasing their Frequency of Occurrence
1. Positive Reinforcement - presentation of a reinforcer contingent upon a response.
2. Negative Reinforcement - removal of an aversive stimulus contingent upon a response. This is "escape" or "avoidance" conditioning because you learn to make a response that will enable escape from/avoidance of the aversive stimulus.

Positive Reinforcement
Skinner worked by placing a hungry rat in his Skinner box. The box contained a lever on one side. As the rat moved about the box, it would accidentally knock the lever. Immediately, a food pellet would drop into a container next to the lever. The rats quickly learned to go straight to the lever after a few times of being put in the box. The consequence of getting food if they pressed the lever ensured they would repeat the action.

Negative Reinforcement
Place the rat in the Skinner box and subject it to an unpleasant electric current, which causes some discomfort. As the rat moved about the box, it would accidentally knock the lever. Immediately after it did so, the electric current would be switched off. The rats quickly learned to go straight to the lever after a few times of being put in the box. The consequence of escaping the electric current ensured that they would repeat the action.

Types of Reinforcers
Primary Reinforcer - reinforcement value is biologically determined. Ex. food, sleep.
Conditioned Reinforcer - acquires reinforcement value through association with a primary reinforcer. Ex. money (used to acquire primary reinforcers; paper bills paired with the acquisition of food, etc.).

Relativity of Reinforcers - "Premack Principle"
Behaviors which the learner already engages in to a high degree (high-frequency behaviors) may be used to reinforce low-frequency behaviors (Premack Principle). Ex. You can watch TV (high-frequency behavior) as soon as you finish your homework (low-frequency behavior).

Weakening Operant Behaviors - Reducing their Frequency of Occurrence
1. Punishment - presentation of an aversive consequence upon a response that reduces the rate of the response.
   - Spanking a child.
   - Drill sergeant ordering 20 more push-ups.
2. Reinforcement Removal - take away reinforcement when the behavior occurs.
   A. Extinction - when previously existing contingencies of reinforcement are taken away. Ex. The teacher stops paying attention to the student madly waving arms in the air.
   B. Time-Out - removal of the learner, for a limited time, from the circumstances reinforcing the undesired behavior.

Limitations of Punishment
1. Its effectiveness tends to be short-lived; punished behavior may cease but may not be forgotten.
2. Can lead, when severe, to undesirable emotional responses being conditioned. If fear is elicited, then avoidance/escape behavior may be negatively reinforced inadvertently. Ex. Truancy & running away.
3. Can serve as a model of aggression.
   - Bandura, Ross, & Ross '61; '63 - those who observed aggression were more likely to be aggressive.
   - Parents who are abusive were often themselves abused.
4. Punishment & "Learned Helplessness": A long history of punishment, especially where aversive stimuli cannot be avoided, can lead to "learned helplessness" - passive acceptance of events seemingly beyond one's control.

Seligman & Maier '67 - 2-Phase Study
1st Phase - unpredictable & painful shocks were administered to dogs. For some dogs, escape was possible through a panel in the cage. For others, escape was not possible no matter what they did.
2nd Phase - the dogs were placed in one of two compartments of a box, in which a tone sounded to warn of impending shock. Dogs could escape by jumping a barrier into the second compartment. Dogs who had previously been allowed to escape shock quickly learned the jumping behavior. The other dogs made little attempt to escape.

Teaching New Behaviors
Shaping - reinforcement of successive approximations to the goal behavior. The organism does not have the goal behavior already in its repertoire, so behavior is reinforced each time it only approximates the target behavior. Shaping is used to teach new behaviors that are relatively simple and continuous in nature. Ex. Teaching a rat to press a lever or a pigeon to turn in a circle.
Chaining - serves to establish complex behaviors made up of discrete, simpler behaviors already known to the learner. Ex. Learning a new dance. Each dance step may be learned through shaping, but the steps are strung together in sequence through forward chaining (start with the 1st step and progressively add the steps that follow until the entire dance can be done) or backward chaining (practice the last step and progressively add the steps that precede it). Ex. Memorizing long prose passages - sequences are added in succession until the entire passage can be repeated without error.

Operant Conditioning in the Classroom
Classroom Management - "token economy" - use of tokens or points that children can earn for engaging in desired behavior. They can exchange these tokens for desired prizes.
_____________________________________________________________________________________

Piaget
The Piagetian Enterprise - interested in general forms of thought, the mental processes we all share; not interested in intellectual differences nor in emotion.

3 Background Influences on Piaget's Theory
1. Biology
• He started out in biology. By age 15-18, he had written a paper on mollusks (shellfish). His work was so strong that he was offered a position as Curator of the Mollusk Collection at the Geneva Natural History Museum.
• Ph.D. in Natural Sciences - Biology. Dissertation on mollusks.
Development of cognition is similar to the development of biological systems. Piaget uses concepts that describe biological functioning & development to describe cognitive functioning & development: growth, stages, equilibrium, adaptation (assimilation, accommodation).
Do not confuse biological influences on the theory with the theory being a biological account of cognitive development:
• Piaget's is an INTERACTIONIST theory.
• COGNITIVE development results/proceeds from the interaction between organism & environment.
• Cognitive stages are NOT the result of biological maturation (as Freud's stages are).
2. Philosophy - Studied Later in Adolescence: LOGIC
Background in logic. He thought abstract logic was related to children's thinking.
• He noticed that children younger than about 11 years were unable to carry out certain elementary logical operations. He wanted to investigate such differences.
• Moreover, he felt that thought processes form an integrated structure, whose basic properties can be described in logical terms. So the GOAL was to discuss how closely thought approximates logic. A very distinctive conception of the psychology of intelligence: logic describes thought.
3. His work in BINET'S LAB in the 1920s on developing IQ tests in Paris (to develop a French version of English reasoning tasks). Upshots:
   1. He found mistakes to be common and distinctly different at different ages -> qualitative changes in thought.
   2. Development of a clinical method to follow a child's reasoning, because IQ tests prevented such.

Structural Aspects of Piaget's Theory: Stages of Cognitive Development
Sensorimotor: Birth - 2 Years
Preoperational: 2 - 7 Years
Concrete Operational: 7 - 11 Years
Formal Operational: 11 Years and Older

From Structure to the Processes of Intelligence: The Functional Invariants
Intelligence operates like a biological system in exhibiting 2 invariant functions: Organization & Adaptation.

Organization
Cognition, like digestion, is an organized affair. Organization refers to the fact that our psychological/cognitive structures are organized into coherent systems. It refers to the internal rearranging and linking of cognitive structures. All intellectual organizations can be conceived of as totalities, systems of relationships among elements. The specific characteristics of this organization differ markedly by stage. Organization promotes adaptation to our environment.

Adaptation
Cognition is a kind of biological adaptation. Like other forms of biological adaptation, cognition involves 2 simultaneous, complementary processes: Assimilation & Accommodation. Adaptation entails building action schemes and cognitive structures through direct interaction with the environment.

Assimilation & Accommodation
A biological example: the ingestion and digestion of food. Ex. A hawk's sharp beak and amazing eyes; humans' wisdom teeth. Organisms simultaneously accommodate to the particular structure of the food (chew hard or easy) and assimilate the food to their physical structures (transform its appearance; convert it into energy).

Assimilation
Using current schemes & cognitive structures to interpret the world. Interpreting or constructing external objects or events in terms of one's own present way of thinking about things. External physical input is transformed and changed - assimilated to fit within existing mental structures.

Cognitive structures do not change during assimilation. Rather, the experiences encountered are altered to fit existing psychological structures. Assimilation is used during equilibrium.

Examples of Assimilation
Generalized Action Schemes: A baby who sucks everything, even non-suckable things such as hard objects, is using an available action scheme (sucking) and assimilating all objects to it.
Vocabulary Overextension: Toddlers who overextend a familiar word for a known entity to refer to something new or different (e.g., use the word "dog" to refer to a cow) are using their current concept (of dog) to understand the world.
Symbolic Play: A young child who pretends that a chip of wood is a boat is "assimilating" wood chips to his boat concept.

Accommodation/Learning
Noticing and taking account of the various real properties & relationships among properties that external objects & events have. In accommodation, new cognitive structures are created or old ones are adjusted upon noticing aspects of the environment that current cognitive structures do not fit. So it is prompted by disequilibrium. Accommodation represents the adjustment of mental structures to fit the input/environment. So accommodation is LEARNING.

Example of Accommodation/Learning
Imitation: The child's adjusting actions and creating new ones in imitating his father's gestures would be a classic case of accommodation. The child is "accommodating" his mental framework (and those motor gestures) to the fine detail of his father's behavior.

"Functional" Aspects of Piaget's Theory: Assimilation & Accommodation
Thus in any cognitive encounter with the environment, assimilation & accommodation operate and are of equal importance. These processes are interdependent and simultaneously occurring. In the interaction of internal cognitive systems with the external environment (in the construction of knowledge), what you already know will greatly shape and constrain what environmental input you can detect and process (assimilation), just as what you can detect and process will provide essential grist for the generation of new knowledge (accommodation). With development, greater balance or equilibrium is achieved between assimilation and accommodation.

What are the MECHANISMS that provide the constructions from one stage to another?
1. Interaction/Experience with/Action in the Physical Environment helps propel development. Interaction with the environment leads to 3 different forms of knowledge, obtained through different forms of abstraction.

Physical Knowledge is Obtained through Empirical Abstraction
Physical experience leads to physical knowledge (knowledge of observables - the properties and characteristics of objects) through a process of empirical (formerly "simple") abstraction. Ex. The child lifts a block and notices it is heavy.

Logico-Mathematical Knowledge is Gained through Reflective Abstraction
Logico-mathematical experience involves knowledge acquired from reflecting on one's own actions, not from the objects themselves.
Reflective Abstraction / Logico-Mathematical Experience: The child gains physical knowledge: he observes 2 sets of objects and perceives that each element is a circle and that one set is arranged as a line, the other as a circle.

3rd Kind of Abstraction: Pseudoempirical Abstraction
This type of abstraction is a primitive form of reflective abstraction that occurs during the early part of the concrete operational stage. Hence it is found in the initial stages of the formation of logico-mathematical knowledge, when the child needs to use concrete objects as a support for such knowledge. Ex. The counting of pebbles. Here the knowledge is not abstracted from the pebbles and thus is not physical experience, but is attributed t...

