Last night I read this passage from Maggie Nelson’s “The Argonauts”:

A day or two after my love pronouncement, now feral with vulnerability, I sent you the passage from Roland Barthes by Roland Barthes in which Barthes describes how the subject who utters the phrase “I love you” is like “the Argonaut renewing his ship during its voyage without changing its name.” Just as the Argo’s parts may be replaced over time but the boat is still called the Argo, whenever the lover utters the phrase “I love you,” its meaning must be renewed by each use, as “the very task of love and of language is to give to one and the same phrase inflections which will be forever new.”



I’ve been thinking about that a lot. The extreme generalization, “words mean what we mean when we say them,” is true to an extent. With enough context the phonemes don’t matter; you know what the person is saying. Say you’re talking to someone with aphasia who subs in the wrong word, or you’re new to a town and don’t know what the locals call soda. The interesting thing is that each such use pushes the meaning of the said word toward the intended meaning in the listener’s language model.



Has anyone looked at the problem of algorithmically guessing the meaning of a word (as a word2vec vector, say) purely from context? Or with phonological/orthographic/etymological clues? The word “quorl” appears only once in Verse’s corpus of public domain poems. I looked it up, but I wonder whether I could have correctly guessed its meaning.
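The simplest baseline for this, roughly the CBOW intuition behind word2vec, is to estimate the unknown word’s vector as the average of its context words’ vectors, then look for the nearest known word. A minimal sketch, with entirely made-up toy vectors and words standing in for real embeddings (not Verse’s corpus or “quorl” itself):

```python
import math

# Toy 3-d embeddings, hand-made purely for illustration.
vectors = {
    "whirl":  [0.9, 0.1, 0.0],
    "spiral": [0.8, 0.2, 0.1],
    "stone":  [0.0, 0.9, 0.1],
    "river":  [0.1, 0.8, 0.3],
    "wind":   [0.7, 0.3, 0.2],
}

def average(vecs):
    # Component-wise mean of a list of equal-length vectors.
    n = len(vecs)
    return [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Words that appeared around the unknown word in its one occurrence.
context = ["whirl", "wind", "spiral"]
guess_vec = average([vectors[w] for w in context])

# The nearest known word is our best guess at the unknown word's meaning.
best = max(vectors, key=lambda w: cosine(vectors[w], guess_vec))
print(best)  # → spiral
```

With only one occurrence the estimate is as good as that single context window, which is exactly what makes the one-shot version of the problem interesting; phonological and orthographic clues would have to be folded in as separate features.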

