A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2021; you can also visit the original URL.
The file type is application/pdf.
Language Models as Knowledge Bases: On Entity Representations, Storage Capacity, and Paraphrased Queries
[article], 2021, arXiv pre-print
Pretrained language models have been suggested as a possible alternative or complement to structured knowledge bases. However, this emerging LM-as-KB paradigm has so far only been considered in a very limited setting, which only allows handling 21k entities whose single-token name is found in common LM vocabularies. Furthermore, the main benefit of this paradigm, namely querying the KB using a variety of natural language paraphrases, is underexplored so far. Here, we formulate two basic …
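The abstract refers to querying a language model as if it were a knowledge base, in a setting restricted to entities whose names are single tokens in the model's vocabulary. The sketch below, which is not taken from the paper, illustrates what such a cloze-style query looks like with a pretrained masked LM; it assumes the Hugging Face transformers library, and the model name and prompt are illustrative choices only.

```python
# Minimal sketch of cloze-style LM-as-KB querying (illustrative, not from the paper).
from transformers import pipeline

# Any pretrained masked LM works here; bert-base-cased is an assumed example.
fill_mask = pipeline("fill-mask", model="bert-base-cased")

# The predicted answer must be a single token in the model's vocabulary,
# which is the restriction to ~21k single-token entity names the abstract mentions.
for prediction in fill_mask("Dante was born in [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

A paraphrased query such as "The birthplace of Dante is [MASK]." probes the same fact with different surface wording, which is the querying flexibility the abstract highlights as underexplored.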
arXiv:2008.09036v2
fatcat:dd7vhzkr3besnatvmgyo62o4he