A Vector Model for Type-Theoretical Semantics

Konstantin Sokolov
Proceedings of the 1st Workshop on Representation Learning for NLP, 2016
Vector models of distributional semantics can be viewed as a geometric interpretation of a fragment of dependent type theory. By extending this interpretation to a larger fragment that includes the dependent product, we achieve a significant increase in the expressive power of vector models, which allows for an implementation of contextual adaptation of word meanings in a compositional setting.
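As a toy illustration of the kind of contextual adaptation the abstract refers to (this sketch is not the paper's type-theoretical construction; the vectors, the matrix, and all numbers are invented), compositional distributional models often represent a noun as a vector and a modifying context as a linear map that reshapes that vector:

```python
# Toy sketch of compositional distributional semantics:
# a noun is a meaning vector, a modifier acts on it as a linear map
# (matrix), adapting the word's meaning in context.
# All dimensions and values below are invented for illustration.

def mat_vec(m, v):
    """Multiply matrix m (a list of rows) by vector v."""
    return [sum(row[i] * v[i] for i in range(len(v))) for row in m]

# Hypothetical 3-dimensional meaning space:
# dimensions: (finance, water, location)
bank = [0.9, 0.1, 0.4]  # ambiguous noun vector

# "river" as a contextual adaptor: a map that suppresses
# the financial dimension of whatever noun it modifies.
river_adaptor = [
    [0.1, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
]

river_bank = mat_vec(river_adaptor, bank)
print(river_bank)  # financial component shrinks; the rest is unchanged
```

The point of the sketch is only that composition here is a linear action on a vector space, which is the geometric setting the paper generalizes with dependent types.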
doi:10.18653/v1/w16-1627 dblp:conf/rep4nlp/Sokolov16