Nearly-Unsupervised Hashcode Representations for Biomedical Relation Extraction

Sahil Garg, Aram Galstyan, Greg Ver Steeg, Guillermo Cecchi
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
Recently, kernelized locality-sensitive hashcodes have been successfully employed as representations of natural language text, showing particularly high relevance to biomedical relation extraction tasks. In this paper, we propose to optimize the hashcode representations in a nearly unsupervised manner, in which we use only data points, not their class labels, for learning. The optimized hashcode representations are then fed to a supervised classifier, following the prior work. This nearly unsupervised approach allows fine-grained optimization of each hash function, which is particularly suitable for building hashcode representations that generalize from a training set to a test set. We empirically evaluate the proposed approach on biomedical relation extraction tasks, obtaining significant accuracy improvements over state-of-the-art supervised and semi-supervised approaches.
doi:10.18653/v1/d19-1414 dblp:conf/emnlp/GargGSC19