In this paper, we present a novel context-dependent approach to modelling word meaning, and apply it to the modelling of
metaphor. In distributional semantic approaches, words are represented as points in a high-dimensional space generated from co-occurrence statistics; the distances between points may then be used to quantify semantic relationships. In contrast to other approaches, which use static, global representations, our approach discovers contextualised representations by dynamically projecting low-dimensional subspaces; in these ad hoc spaces, words can be re-represented in an open-ended assortment of geometrical and conceptual configurations as appropriate for particular contexts. We hypothesise that this context-specific re-representation enables a more effective model of the semantics of metaphor than standard static approaches. We test this hypothesis on a dataset of English word dyads rated for degrees of metaphoricity, meaningfulness, and familiarity by human participants. We demonstrate that our model captures these ratings more effectively than a state-of-the-art static model, and does so via the amount of contextualising work inherent in the re-representational process.
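To make the contrast between static and context-specific comparison concrete, the following is a minimal sketch, not the authors' model: it assumes cosine similarity over co-occurrence vectors as the static baseline, and illustrates an "ad hoc" subspace by restricting a word pair to the co-occurrence dimensions most salient for that pair (the salience heuristic and the function names are hypothetical choices for illustration only).

```python
import numpy as np

def cosine(u, v):
    # Static comparison: cosine similarity in the full co-occurrence space.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def ad_hoc_similarity(word_a, word_b, vectors, k=50):
    # Context-specific comparison: re-represent the pair in a low-dimensional
    # subspace spanned by the k dimensions most salient for this particular pair,
    # then compare the truncated vectors there. (Illustrative heuristic only.)
    u, v = vectors[word_a], vectors[word_b]
    salience = u + v                       # joint salience of each dimension
    top = np.argsort(salience)[-k:]        # indices of the k most salient dimensions
    return cosine(u[top], v[top])

# Toy usage with random stand-ins for real co-occurrence statistics.
rng = np.random.default_rng(0)
vectors = {w: rng.random(1000) for w in ("fire", "argument", "water")}
print(cosine(vectors["fire"], vectors["argument"]))    # global, static similarity
print(ad_hoc_similarity("fire", "argument", vectors))  # pair-specific similarity
```

The point of the sketch is only the distinction it draws: the first call compares the words in one fixed global geometry, while the second builds a small subspace around the pair before comparing them, which is the kind of re-representation the abstract describes.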
Original language: English
Article number: 413117
Number of pages: 30
Journal: Frontiers in Psychology
Publication status: Accepted/In press - 19 Mar 2019
