REVIEW ARTICLE

When does abstraction occur in semantic memory: insights from distributional models

Pages 1338-1346 | Received 02 Dec 2016, Accepted 17 Jan 2018, Published online: 25 Jan 2018

ABSTRACT

Abstraction is a core principle of Distributional Semantic Models (DSMs) that learn semantic representations for words by applying dimensional reduction to statistical redundancies in language. Although the posited learning mechanisms vary widely, virtually all DSMs are prototype models in that they create a single abstract representation of a word’s meaning. This stands in stark contrast to accounts of categorisation, which have very much converged on the superiority of exemplar models. However, a small but growing group of accounts in psychology, linguistics, and information retrieval take an exemplar-based approach to semantic modelling. These models borrow many of the ideas that have led to the prominence of exemplar models in fields such as categorisation. Exemplar-based DSMs posit only an episodic store, not a semantic one. Rather than applying abstraction mechanisms at learning, these DSMs posit that semantic abstraction is an emergent artefact of retrieval from episodic memory.

Acknowledgements

This work was supported by NSF BCS-1056744 and IES R305A150546. I would like to thank Randy Jamieson, Melody Dye, and Brendan Johns for helpful input and discussions.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1 LSA and word2vec are formally equivalent: Levy and Goldberg (2014) demonstrated analytically that the SGNS architecture of word2vec implicitly factorizes a word-by-context matrix whose cell values are shifted PMI values.
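
In symbols (a standard rendering of Levy and Goldberg’s result, not notation taken from the present paper): with k negative samples, SGNS implicitly factorizes a matrix M whose cells are pointwise mutual information values shifted by log k.

```latex
% Optimum of the SGNS objective (Levy & Goldberg, 2014): the implicit
% matrix M = W C^\top being factorised has cells
M_{ij} \;=\; \mathrm{PMI}(w_i, c_j) - \log k
       \;=\; \log\frac{P(w_i, c_j)}{P(w_i)\,P(c_j)} - \log k
% where k is the number of negative samples per positive example.
```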

2 The model’s direction can also be inverted, using the word to predict the context (SGNS) rather than using the context to predict the word (CBOW).
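
As a concrete illustration, both directions are exposed as a single switch in the gensim implementation of word2vec. A minimal sketch, assuming gensim ≥ 4 (gensim and this toy corpus are illustrative additions, not part of the original note):

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenised sentences.
sentences = [["the", "cat", "sat", "on", "the", "mat"],
             ["the", "dog", "lay", "on", "the", "rug"]]

# CBOW (sg=0, the default): the surrounding context predicts the word.
cbow = Word2Vec(sentences, sg=0, vector_size=50, window=2, min_count=1)

# Skip-gram (sg=1): the word predicts its context; with negative
# sampling (negative > 0) this is the SGNS architecture.
sgns = Word2Vec(sentences, sg=1, negative=5, vector_size=50,
                window=2, min_count=1)
```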

3 The bulk of the evidence used by Tulving to argue for distinct semantic and episodic memory systems was from neuropsychological patients.

4 The model is simply referred to as the “semantics model” in Kwantes’ (2005) original paper, but “Constructed Semantics Model” has become its popular name among semantic modellers because, in the model, semantic representations are constructed on the fly from episodic memory.
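
A minimal sketch of that construction, assuming a MINERVA-2-style retrieval rule over an episodic word-by-context store (the dimensions, similarity measure, and weighting exponent below are illustrative assumptions, not Kwantes’ exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Episodic store only: each row is one word's pattern of occurrence
# across contexts. No semantic representations are stored anywhere.
memory = rng.random((1000, 300))      # 1000 word traces x 300 contexts

def construct_semantics(probe, memory, power=3):
    """Build a semantic vector on the fly from episodic memory:
    every trace is activated in proportion to its (nonlinearly
    weighted) similarity to the probe, and the returned 'echo' is
    the activation-weighted sum of traces, an emergent abstraction."""
    sims = memory @ probe / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(probe) + 1e-12
    )
    activations = np.sign(sims) * np.abs(sims) ** power
    return activations @ memory

# Retrieval for one word: its own episodic trace serves as the probe.
echo = construct_semantics(memory[0], memory)
```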

5 An exception here is the topic model, which uses conditional probabilities and so is not subject to the metric restrictions of spatial models (e.g., Griffiths, Steyvers, & Tenenbaum, 2007).
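
Concretely, the topic model scores word association with a conditional probability rather than a distance, marginalising over latent topics z (standard notation for Griffiths et al.’s account, summarised here for convenience):

```latex
P(w_2 \mid w_1) \;=\; \sum_{z} P(w_2 \mid z)\, P(z \mid w_1)
```

Because P(w2 | w1) ≠ P(w1 | w2) in general, the model need not respect the symmetry or triangle-inequality axioms that constrain spatial models.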

6 Essentially the same architecture has been used by Goldinger (1998) to explain “abstract” qualities of spoken word representation arising from episodic memory retrieval.

Additional information

Funding

This work was supported by the National Science Foundation [grant number BCS-1056744] and the Institute of Education Sciences [grant number R305A150546].
