Pretrained Transformers for Simple Question Answering over Knowledge Graphs

D. Lukovnikov, A. Fischer, J. Lehmann
2020, arXiv pre-print
Answering simple questions over knowledge graphs is a well-studied problem in question answering. Previous approaches for this task built on recurrent and convolutional neural network based architectures that use pretrained word embeddings. It was recently shown that finetuning pretrained transformer networks (e.g., BERT) can outperform previous approaches on various natural language processing tasks. In this work, we investigate how well BERT performs on SimpleQuestions and provide an evaluation of both BERT and BiLSTM-based models in data-sparse scenarios.
arXiv:2001.11985v1
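
The abstract describes the general recipe of finetuning a pretrained BERT model on SimpleQuestions. Below is a minimal sketch of what such finetuning looks like with the HuggingFace transformers API, framed as relation classification over the question text, which is one common way to decompose simple question answering. This is an illustration, not the authors' implementation: the relation vocabulary size, the toy questions, and the label ids are all placeholders.

```python
# Hedged sketch: finetuning BERT as a classifier over question text.
# All data and the label space here are hypothetical placeholders.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

NUM_RELATIONS = 1837  # hypothetical size of the KG relation vocabulary

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=NUM_RELATIONS
)

# Toy batch: questions paired with gold relation ids (placeholders).
questions = ["who wrote the book the hobbit", "where was obama born"]
labels = torch.tensor([42, 7])

inputs = tokenizer(questions, padding=True, truncation=True,
                   return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()

# One finetuning step: cross-entropy loss over relation logits.
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```

In a data-sparse setting of the kind the abstract mentions, the same loop would simply be run on a reduced subset of the training pairs, which is what makes the comparison against BiLSTM baselines informative.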