Building semantic memory from embodied and distributional language experience

Abstract

Humans seamlessly make sense of a rapidly changing environment, using a seemingly limitless knowledge base to recognize and adapt to most situations we encounter. This knowledge base is called semantic memory. Embodied cognition theories suggest that we represent this knowledge through simulation: understanding the meaning of coffee entails reinstantiating the neural states involved in touching, smelling, seeing, and drinking coffee. Distributional semantic theories suggest that we are sensitive to statistical regularities in natural language, and that a cognitive mechanism picks up on these regularities and transforms them into usable semantic representations reflecting the contextual usage of language. These appear to present contrasting views on semantic memory, but do they? Recent years have seen a push toward combining these approaches under a common framework. These hybrid approaches augment our understanding of semantic memory in important ways, but current versions remain unsatisfactory in part because they treat sensory‐perceptual and distributional‐linguistic data as interacting but distinct types of data that must be combined. We synthesize several approaches which, taken together, suggest that linguistic and embodied experience should instead be considered as inseparably entangled: just as sensory and perceptual systems are reactivated to understand meaning, so are experience‐based representations endemic to linguistic processing; further, sensory‐perceptual experience is susceptible to the same distributional principles as language experience. This conclusion produces a characterization of semantic memory that accounts for the interdependencies between linguistic and embodied data that arise across multiple timescales, giving rise to concept representations that reflect our shared and unique experiences.

This article is categorized under:
Psychology > Language
Neuroscience > Cognition
Linguistics > Language in Mind and Brain
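To illustrate the distributional idea in concrete terms (this example is not from the article itself), the sketch below builds count-based co-occurrence vectors from a toy corpus and compares words with cosine similarity. The corpus, window size, and raw-count weighting are illustrative assumptions; published distributional models typically use much larger corpora and weighting or learning schemes.

```python
# Minimal sketch of a count-based distributional semantic model (assumptions:
# toy corpus, symmetric window of 2, raw co-occurrence counts). Words that
# occur in similar linguistic contexts end up with similar vectors.
from collections import Counter, defaultdict
from math import sqrt

corpus = [
    "i drink hot coffee every morning".split(),
    "i drink hot tea every evening".split(),
    "the dog chased the cat across the yard".split(),
]

window = 2  # how many neighbors on each side count as "context"
cooc = defaultdict(Counter)  # word -> Counter of co-occurring context words

for sentence in corpus:
    for i, word in enumerate(sentence):
        for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
            if j != i:
                cooc[word][sentence[j]] += 1

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (dicts/Counters)."""
    dot = sum(u[k] * v[k] for k in u if k in v)
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

# Words used in similar contexts ("drink hot ... every ...") come out as more
# similar to each other than to words from unrelated contexts.
print(cosine(cooc["coffee"], cooc["tea"]))   # relatively high
print(cosine(cooc["coffee"], cooc["dog"]))   # relatively low
```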
A cartoon brain depicting how distributed brain regions might contribute to conceptual knowledge under an embodied cognition framework. Pink areas roughly correspond to cortical regions, and gray areas roughly correspond to subcortical areas. Source: Figure adapted from Allport (1985) and Thompson‐Schill et al. (2006). This figure is licensed under a CC‐BY 4.0 International License.
A simple recurrent network. Each layer consists of one or more units, and information (e.g., words, semantic features) flows from input units to hidden units and then to output units. At each timestep, the context units, which hold a copy of the hidden layer's previous state, propagate back to the hidden layer, giving the network access to its "memory" of prior states.
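To make the pictured architecture concrete (this sketch is not from the article), below is a minimal forward pass of a simple recurrent, Elman-style network in Python/NumPy. The layer sizes, weight matrices, and activation functions are illustrative assumptions; the key point is that the context units carry the previous hidden state back into the hidden layer at every timestep.

```python
# Minimal sketch of a simple recurrent (Elman-style) network forward pass.
# Assumptions: arbitrary layer sizes, tanh/softmax activations, random weights.
import numpy as np

rng = np.random.default_rng(0)

n_input, n_hidden, n_output = 5, 8, 5  # e.g., one-hot "words" in and out
W_ih = rng.normal(scale=0.1, size=(n_hidden, n_input))   # input   -> hidden
W_ch = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
W_ho = rng.normal(scale=0.1, size=(n_output, n_hidden))  # hidden  -> output

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def run_sequence(inputs):
    """Process a sequence of input vectors one timestep at a time."""
    context = np.zeros(n_hidden)  # the network's "memory" of prior states
    outputs = []
    for x in inputs:
        # The hidden state depends on the current input AND the previous
        # hidden state, which was copied into the context units last step.
        hidden = np.tanh(W_ih @ x + W_ch @ context)
        outputs.append(softmax(W_ho @ hidden))
        context = hidden.copy()  # context units store this state for the next step
    return outputs

# Example: a sequence of three one-hot "word" vectors.
sequence = [np.eye(n_input)[i] for i in (0, 2, 4)]
predictions = run_sequence(sequence)
print(predictions[-1])  # distribution over possible next "words"
```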
