
Language and Cognition: ALL Weeks
University College London


Language & Cognition Week 1: Brief intro and “What is special about language” What is language? What is special about language? How is meaning represented? Why study language and cognition? ● Language as a window into the mind: relates to learning, memory, perception, action, social behaviour. Linguistic measures and instructions very frequently used in numerous areas of psychology. ● Ofen argued to be unique among human cognitive processes; - nothing else is processed quite the same. Levels of representation (words, sentences, phrases) they are processed in very different kind of ways. Language processing is a rich area. A lot of experimental psychology to do with language. ● Ofen (implicitly or explicitly) considered as modular, with clearly different levels of processing, and a very wide range of methods and theories at various levels of enquiry; ● Argued to be uniquely human: what makes us special? ● But despite our huge expertise in using language, it is a great challenge to understand how language actually works & how it relates to other aspects of cognition. 1. We use language nearly all the time; technology and our cultures would be impossible without it. 2. We usually think in language. 3. Some people have difficulty learning spoken or written language (developmental disorders), or have difficulty with language as a consequence of brain damage (acquired disorders). Primary function of language is communication, however, it might have acquired (or even originated from) other functions. Non-linguistic, cognitive processes. Extreme version of this idea is that the form of our language shapes our perception and cognitive view (Sapir-Whorf hypothesis). Some have argued that language evolved to allow us to think and communication turned out to be a useful side effect.

Language input is extremely complex
We are constantly getting all this information through language:
- all (?) the individual words a speaker or writer produced;
- a speaker’s intended meaning (words, phrases, sentences & more);
- extra details (depending on language) like time, gender, number;
- broader implications of what they say;
- impressions of a speaker’s mental state;
- links between their comments and our own experiences;
- speaker-specific information based on accent, word choice, etc.
We even get meta-linguistic information about others’ intentions (why they are saying what they’re saying).

But there is also something easy about doing it
We can be successful despite all these:
- a very large vocabulary from which words are chosen;
- complex syntactic structures;
- metaphors and other creative uses;
- noisy environments;
- dysfluencies (um, er);
- all sorts of potential ambiguity in the language;
- variation in speaker accent, dialect, and other individual differences;
- different perspectives, knowledge, and experiences.
We can also produce language without having to think too hard about all these details, and in conversations, while understanding others.

How does it all work?
Introspection reveals important aspects of language and is key to many approaches in Linguistics. E.g., which of these sentences is grammatical in English?
- The baker gave the bread to the psychologist.
- The baker ate the bread to the psychologist.
- The baker gave the bread.
- The baker ate the bread.
And in Psychology: e.g., what is the meaning of words like bread, psychologist, language? But we need other methods to understand the cognitive processes and neural systems involved.

General questions that will apply throughout
Language:
● What are the processes involved in producing and understanding language?
● Language is often described as manipulation of abstract symbols. What are they?
● Do processes in language operate independently or interact?
And cognition:
● Are language processes specific to language, or are they aspects of general cognitive processing?
● What general cognitive faculties are essential to language?
● How do language processes relate to other cognitive processes... and vice versa?
● What evidence can we use to inform us about language and cognition?
● Many studies use highly contrived methods that may not resemble language as we experience it naturally. How do these departures from ecological validity affect the conclusions we can draw from these studies?
See Harley (2014) pp. 22-26.

A recurring question is whether processes in language operate independently of one another or whether they interact: this is the issue of modularity. Another is how sensitive the results of our experiments are to the particular techniques employed; the experimental techniques themselves come under close scrutiny. In this respect, the distinction between data and theory can become very blurred.

How modular is the language system?
The concept of modularity is an important one in psycholinguistics. Most researchers agree that psychological processing can be best described in terms of a number of levels. Processing begins with an input that is acted on by one or more intervening levels of processing to produce an output. A module is a self-contained set of processes: it converts an input to an output, without any outside help for what goes on in between; we say that the processes inside a module are independent of processes outside the module. Another way of describing it is to say that processing is purely data-driven. Models in which processing occurs in this way are called autonomous.

The opposing view is that processing is interactive. Interaction involves the influence of one level of processing on the operation of another, but there are two intertwined notions involved. First, there is the question of overlap of processing between stages. In a discrete stage model, a level of processing can only begin its work when the previous one has finished its own work. In a cascade model, information is allowed to flow from one level to the following level before it has completed its processing (McClelland, 1979). If the stages overlap, then multiple candidates might become activated at the lower level of processing. The second aspect of interaction is whether there is a reverse flow of information, or feedback, when information from a lower level feeds back to the prior level. For example, does knowledge about what a word might be influence the recognition of its component sounds or letters? Does the context of the sentence help to make identifying the constituent words easier?

There is scope for confusion with the terms “bottom-up” and “top-down,” as they depend on the direction of processing. So a non-interactive model of word recognition would be one that is purely bottom-up (from the perceptual representation of the word to the mental representation), but a non-interactive model of word production would be one that is purely top-down (from the mental representation to the sound of the word).
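As a rough illustration of the discrete-stage versus cascade contrast, here is a toy sketch of my own (not a model from the literature; the two “levels”, the step counts, and the 0.2 gain are invented). In the discrete version, level 2 only sees level 1’s finished output; in the cascade version, partial activation flows forward on every time step.

```python
# Toy contrast between discrete-stage and cascade processing (illustrative only).

def discrete_stages(input_strength: float, steps: int = 5) -> float:
    """Level 2 starts only once level 1 has completely finished."""
    level1 = 0.0
    for _ in range(steps):          # level 1 runs to completion first
        level1 += 0.2 * input_strength
    level2 = 0.0
    for _ in range(steps):          # only then does level 2 begin
        level2 += 0.2 * level1
    return level2

def cascade(input_strength: float, steps: int = 5) -> float:
    """Level 2 receives level 1's partial output on every step (cf. McClelland, 1979)."""
    level1 = 0.0
    level2 = 0.0
    for _ in range(steps):
        level1 += 0.2 * input_strength
        level2 += 0.2 * level1      # unfinished, partial activation flows on
    return level2

print(discrete_stages(1.0))  # 1.0: level 2 waits, then works on level 1's finished output
print(cascade(1.0))          # 0.6: level 2 has been building up from partial output all along
```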

“Data-driven” is a better term than “bottom-up.” The important point is that models that permit feedback have both bottom-up and top-down information flow.

Fodor (1983) argued that many psychological processes are modular, and that we should start with the assumption that processes are modular or non-interactive unless there is a very good reason to think otherwise. There are two main reasons for this assumption. First, modular models are generally simpler; they involve fewer processes and connections between systems. Second, it is widely believed that evolution favors a modular system. However, it is always possible to come up with a saving or auxiliary hypothesis that can be used to modify and hence save the modularity hypothesis (Lakatos, 1970). In theories of word recognition researchers have introduced the idea of post-access processes; in syntax and parsing they have proposed parallel processing with deferred decision making; and in word production they have proposed an editor, or emphasized the role of working memory.

Both Fodor (1983, 1985) and Pinker (1994), leading exponents of the view that language is highly modular and has a significant innate basis, give a broader philosophical argument: modularity is inconsistent with relativism, the idea that everything is relative to everything else and that anything goes (particularly in the social sciences). Modules provide a fixed framework in which to study the mind.

A neuropsychological dissociation between two processes is often taken as evidence of the modularity of the processes involved. Here we can distinguish:
- physical modularity (are psychological processes localized in one part of the brain?) and
- processing modularity (in principle a set of processes might be distributed across the brain yet have a modular role in the processing model).
Farah (1994) criticized this “locality” assumption and argued that neuropsychological dissociations were explicable in terms of distributed, connectionist systems.

On the one hand, Chomsky (1975) argued that language is a special faculty that cannot be reduced to cognitive processes. On the other, Piaget (1923) argued that language is a cognitive process just like any other, and that linguistic development depends on general cognitive development. Language plays a central role in our working memory, the short-term repository of information. When evaluating a modular account, we need to ask whether the auxiliary hypothesis is more plausible than the non-modular alternative, and whether data converge from experimental and imaging sources.

Is any part of language innate?
The researchers most committed to the claim that language processes are highly modular also argue that a significant amount of our language abilities are innate. The reasoning is that clean-cut modules must be built into the brain, or hard-wired, and therefore innately programmed, whereas complex, messy systems reflect the effects of learning. A further question is the extent to which the innate components are only found in humans. Connectionist modeling suggests ways in which general properties of the learning system can serve the role of innate, language-specific knowledge, and shows how behavior emerges from the interaction of nature and nurture at all levels (Elman et al., 1996).

Does the language system make use of rules?
In traditional linguistics, much knowledge is encapsulated in the form of explicit rules. For example, we can describe the syntax of language in terms of rules such as “a sentence can comprise a noun phrase followed by a verb phrase.” Similarly, we can formulate a rule that the plural of a noun is formed by adding an “-s” to its end, except in a limited number of irregular forms, which we would need to store separately.
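The sketch below is my own illustration (not from Harley); it simplifies the spelling rule to a bare “-s” and uses a tiny, invented exception list, but it shows the division of labour the rule-based view assumes: one general rule for regular forms, plus a stored lexicon of irregulars.

```python
# Illustrative sketch only: a default rule plus a small stored list of exceptions.
IRREGULAR_PLURALS = {"child": "children", "mouse": "mice", "sheep": "sheep"}

def pluralize(noun: str) -> str:
    """Apply the default '-s' rule unless the noun is stored as an exception."""
    # Real English spelling also needs '-es', '-ies', etc.; ignored here.
    return IRREGULAR_PLURALS.get(noun, noun + "s")

print(pluralize("psychologist"))  # psychologists (regular rule)
print(pluralize("mouse"))         # mice (stored exception)
```

Connectionist models, discussed next, claim that this kind of behaviour can emerge without any explicitly stored rule.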

Connectionism has revolutionized psycholinguistics over the last 25 years. In connectionist models, processing takes place through the interaction of many simple, massively interconnected units. Connectionist models that can learn are particularly important. In these models, information is learned by repeated presentation; the connections between units change to encode regularities in the environment. The general idea underlying learning can be summarized in the aphorism, based on the work of Donald Hebb (1949), that “cells that fire together, wire together”: the simultaneous activation of cells (or units) leads to an increase in synaptic (or connection) strength. Computational models of the mind work in a similar way: they are scaled-down models of the mind, or parts of it, made from different materials, but which illustrate important principles of how the mind works. What is more, we can learn from them.
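As a concrete illustration of the Hebbian idea, here is a toy weight update (my own sketch; the unit activities, layer sizes, and learning rate are invented, and real connectionist models add much more, such as error-driven learning). Connection strength grows only where the units at both ends are active together.

```python
import numpy as np

# Toy Hebbian update: delta_w[i, j] = learning_rate * post[i] * pre[j]
pre = np.array([1, 0, 1, 0])    # activity of four "input" units
post = np.array([1, 1, 0])      # activity of three "output" units
weights = np.zeros((3, 4))      # connection strengths, initially zero

learning_rate = 0.1
weights += learning_rate * np.outer(post, pre)

print(weights)
# Only connections whose pre- and post-units were both active (value 1)
# have grown: "cells that fire together, wire together".
```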

Connectionist models are loosely based on a metaphor of the brain, which is a structure made up of many massively interconnected neurons, each of which is relatively simple. Connectionist modelers usually try to minimize the amount of information hardwired into the system, emphasizing instead what emerges from the model. Just like traditional AI, connectionism has the virtue that writing a computer program forces you to be explicit about your assumptions. This has had several consequences:

1) Connectionism has led to a focus on the processes that take place inside the boxes of our models. In some cases (e.g., the acquisition of the past tense), this new focus has led to a detailed re-examination of the evidence motivating the models.
2) Connectionism has forced us to consider in detail the representations used by the language system. In particular, connectionist approaches can be contrasted with rule-based approaches. In connectionist models rules are not explicitly encoded, but instead emerge as a consequence of statistical generalizations in the input data. This point is controversial, and we shall see throughout the book that the role of explicit rules is still a matter of substantial debate among psycholinguists.
3) The shift of emphasis from learning rules to learning through many repeated specific instances has led to an increase in probabilistic models of language acquisition and processing (Chater & Manning, 2006). Probabilistic models have proved particularly influential in language acquisition, where children are thought to learn language by statistical or distributional analysis of what they hear rather than learning explicit rules (see the sketch below).
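To make the idea of distributional analysis concrete, here is a small sketch (my own illustration; the syllable stream and its “words” are invented). Transitional probabilities between syllables tend to be higher inside words than across word boundaries, so a learner tracking these statistics could find word edges without being given any explicit segmentation rule.

```python
from collections import Counter

# Invented stream made of two three-syllable "words": go-la-bu and ti-ra-do.
stream = "go la bu go la bu ti ra do ti ra do go la bu".split()

pair_counts = Counter(zip(stream, stream[1:]))   # counts of adjacent syllable pairs
syllable_counts = Counter(stream[:-1])           # counts of syllables in first position

def transitional_probability(a: str, b: str) -> float:
    """P(b | a): how often syllable a is followed by syllable b."""
    return pair_counts[(a, b)] / syllable_counts[a]

print(transitional_probability("go", "la"))  # 1.0: within the "word" go-la-bu
print(transitional_probability("bu", "ti"))  # 0.5: across a word boundary
```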

Are language processes specific to language?
When we understand sentences, do we make use of a general-purpose working memory store, or do we have dedicated stores that can hold only information about language? Do children learn language using general-purpose learning rules, or do they make use of information restricted to the linguistic domain? The ideas of innateness, modularity, rules, and language-specific processing are related. There is a divide in psycholinguistics between those who argue for innate, language-specific modules that make extensive use of rules, and those who argue that much or all of language processing is the adaptation of more general cognitive processes.

Are we certain of anything in psycholinguistics?
Uncertainty is a fact of life when trying to understand the psychology of language. The discipline is still relatively young, and we have a lot to learn.

Are animal utterances language-like?

Harley (2014) ch. 3, and supplemental materials (videos and news articles) on Moodle for a variety of examples: Constant quest: identify aspects of language that only humans manage

But, what is language?
Defining “language” is not easy at all:
“Language is a system that consists of the development, acquisition, maintenance and use of complex systems of communication, particularly the human ability to do so; and a language is any specific example of such a system.”
https://en.wikipedia.org/wiki/Language (07-Jan-19 16:11)
“A language is a structured system of communication. Language, in a broader sense, is the method of communication that involves the use of – particularly human – languages...”
https://en.wikipedia.org/wiki/Language (13-Jan-20 11:45)
And the way we characterise or define language may already force answers to some of the questions we ask.

What is special about language?
Important characteristics extracted from most definitions:
● human capacity;
● sublexical and lexical units: phonemes, morphemes, words;
● combinatorial structure (grammar);
● abstraction: e.g., referring to things that aren’t there in the world, using arbitrary symbols/sounds.
These are part of a broader perspective that changed the nature of language research:
● productivity;
● displacement;
● arbitrariness of symbols.
Charles Hockett, 1960: “design features of language”.

Hockett (1960): shifting questions about language
Framework: humans trying to find what is unique about ourselves. Questions of language evolution had ground to a halt:
- evidence-free theories about the origins of language;
- colonial “discoveries” did not reveal primitive humans speaking a protolanguage;
- working backward from related languages just didn’t work;
- studies of language origins were considered “futile or crackpot” (p. 89).
Given the difficulties of defining what language is, Hockett shifted the focus: identify the characteristics that language has, then compare these characteristics to other communication systems. At the time, the aim was to identify which features are shared between evolutionarily related species and thus may have evolved more recently.

Hockett (1960) attempted to sidestep the thorny issue of defining language by listing 16 general properties or design features of spoken human language (see Box 3.1). The emphasis of his design features is very much on the physical characteristics of spoken languages. Clearly, these are not all necessary defining characteristics: human written language does not display “rapid fading,” yet clearly written language is a form of language. Nevertheless, design features provide a useful framework for thinking about how animal communication systems differ from human language.
1. Vocal-auditory channel (communication occurs by the producer speaking and the receiver hearing)
2. Broadcast transmission and directional reception (a signal travels out in all directions from the speaker but can be localized in space by the hearer)
3. Rapid fading (once spoken, the signal rapidly disappears and is no longer available for inspection)
4. Interchangeability (adults can be both receivers and transmitters)
5. Complete feedback (speakers can access everything about their productions)
6. Specialization (the amount of energy in the signal is unimportant; a word means the same whether it is whispered or shouted)
7. Semanticity (signals mean something: they relate to the features of the world)
8. Arbitrariness (these symbols are abstract; except for a few onomatopoeic exceptions, they do not resemble what they stand for)
9. Discreteness (the vocabulary is made of discrete units)
10. Displacement (the communication system can be used to refer to things remote in time and space)
11. Openness (the ability to invent new messages)
12. Tradition (the language can be taught and learned)
13. Duality of patterning (only combinations of otherwise meaningless units are meaningful; this can be seen as applying both at the level of sounds and words, and words and sentences)
14. Prevarication (language provides us with the ability to lie and deceive)
15. Reflectiveness (we can communicate about the communication system itself, just as this book is doing)
16. Learnability (the speaker of one language can learn another)

Hockett (1960) “design features” of language 1-3

1. Vocal-auditory channel. The focus was on speech (reading/writing coming later). Other communicative channels are available!
2. Broadcast transmission & directional reception. Speech is not (particularly) directional, but the comprehender can identify the direction it’s coming from.
3. Rapid fading. Must be perceived in real time (until writing systems, recording devices, etc.).
Most animal communication has these features too.

Hockett (1960) design features of language 4-6

4. Interchangeability. Anything I can hear, I can say... you can say... we can all say (compare to mating calls; individual birdsong).
5. Total feedback. We can hear & control what we are saying.
6. Specialization. The purpose is communication (compare to autonomic responses: if I am sweating, you know I am hot, but this is not communicative).
Again, many kinds of animal communication have these features too.

Hockett (1960) design features of language 7-9

7. Semanticity. Sounds are associated with particular meanings, the more specific the better (vs. generic “DANGER” calls). But note, many animals learn huge vocabularies (parrots, dolphins, dogs). The more meaning a system can understand and express, the more language-like it becomes.
8. Arbitrariness. Language forms are arbitrarily linked to their referents (although not always): “whale” is a small word for a big thing; “microorganism” is a big word for a very small thing.
9. Discreteness. Language can be dissected into smaller units which are categorical rather than continuous (compare to speech intensity expressing degrees of anger, or gestures depicting degrees of size).

Hockett (1960) design features of language 10-12

10. Displacement. We can talk about things that are not present or that do not exist; about past, present, and future; and about abstractions and ideas.
11. Productivity. We can produce and understand entirely new utterances (although this specific example is not particularly novel); we are combining things constantly in...

