A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2021; you can also visit the original URL.
The file type is application/pdf.
Towards Incremental Learning of Word Embeddings Using Context Informativeness
2019
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
In this paper, we investigate the task of learning word embeddings from very sparse data in an incremental, cognitively-plausible way. We focus on the notion of informativeness, that is, the idea that some content is more valuable to the learning process than others. We further highlight the challenges of online learning and argue that previous systems fall short of implementing incrementality. Concretely, we incorporate informativeness in a previously proposed model of nonce learning, using it …
doi:10.18653/v1/p19-2022
dblp:conf/acl/KabbachGH19
fatcat:n62vpsecangdla3iuq4nrepzkm