Distributional Word Vectors as Semantic Maps Framework

Authors

  • Amir Bakarov, National Research University Higher School of Economics

DOI:

https://doi.org/10.13053/cys-26-3-4356

Keywords:

Word embeddings, distributional word vectors, semantic maps

Abstract

Distributional semantic models are among the most ubiquitous tools in Natural Language Processing. However, it is still unclear how to optimise such models for specific tasks and how to evaluate them in a general setting, that is, with the ability to be successfully applied to any language task in mind. We argue that the benefits of intrinsic evaluation of distributional semantic models can be questioned, since the notion of their “general quality” possibly does not exist; distributional semantic models can, however, be considered as part of the Semantic Maps framework, which formalises the notion of linguistic representativeness at the lexical level.
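
The intrinsic evaluation the abstract calls into question is usually a word-similarity benchmark: cosine similarities produced by a model are correlated with human similarity judgements. The sketch below is an illustration of that standard setup only, not material from the paper; the vectors and ratings are made-up placeholders.

# Minimal sketch of intrinsic (word-similarity) evaluation of word vectors.
# All vectors and ratings are illustrative placeholders, not data from the paper.
import numpy as np
from scipy.stats import spearmanr

# Toy distributional vectors (in practice: loaded from a trained model).
vectors = {
    "cat":   np.array([0.9, 0.1, 0.3]),
    "dog":   np.array([0.8, 0.2, 0.4]),
    "car":   np.array([0.1, 0.9, 0.2]),
    "truck": np.array([0.2, 0.8, 0.1]),
}

# Toy human similarity judgements for word pairs (0 = unrelated, 10 = identical).
human_ratings = {
    ("cat", "dog"): 8.5,
    ("car", "truck"): 8.0,
    ("cat", "car"): 1.5,
    ("dog", "truck"): 1.0,
}

def cosine(u, v):
    # Cosine similarity between two vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

model_scores = [cosine(vectors[a], vectors[b]) for a, b in human_ratings]
gold_scores = list(human_ratings.values())

# Spearman's rho is the figure conventionally reported for such benchmarks.
rho, _ = spearmanr(model_scores, gold_scores)
print(f"Spearman correlation with human judgements: {rho:.3f}")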

Published

2022-08-31