1,460,278 Hits in 7.9 sec

Language Models as Knowledge Bases?

Fabio Petroni, Tim Rocktäschel, Sebastian Riedel, Patrick Lewis, Anton Bakhtin, Yuxiang Wu, Alexander Miller
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
Language models have many advantages over structured knowledge bases: they require no schema engineering, allow practitioners to query about an open class of relations, are easy to extend to more data,  ...  Whilst learning linguistic knowledge, these models may also be storing relational knowledge present in the training data, and may be able to answer queries structured as "fill-in-the-blank" cloze statements  ...  Given the above qualities of language models as potential representations of relational knowledge, we are interested in the relational knowledge already present in pretrained off-the-shelf language models  ... 
doi:10.18653/v1/d19-1250 dblp:conf/emnlp/PetroniRRLBWM19 fatcat:aalqzrmjf5gmjg6l2imffkbgky
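The cloze-style probing described in the abstract above can be illustrated with a minimal sketch. This is a toy stand-in, not the paper's method: the tiny `corpus` and the substring-count "model" are assumptions standing in for a pretrained masked LM (e.g. BERT) scoring candidate fillers for a masked slot.

```python
# Toy sketch of "LM as KB" cloze probing in the spirit of the entry above.
# The unigram substring count below is a crude stand-in for an LM's
# token probability at the [MASK] position.

from collections import Counter

# Tiny corpus standing in for pretraining data (assumption: real probes
# use Wikipedia-scale corpora and a neural masked LM).
corpus = [
    "Dante was born in Florence",
    "Dante wrote the Divine Comedy",
    "Florence is a city in Italy",
]

def answer_cloze(template: str, candidates: list) -> str:
    """Fill the [MASK] slot with the candidate whose completed statement
    appears most often in the corpus."""
    counts = Counter()
    for cand in candidates:
        statement = template.replace("[MASK]", cand)
        counts[cand] = sum(statement in line for line in corpus)
    return counts.most_common(1)[0][0]

print(answer_cloze("Dante was born in [MASK]", ["Florence", "Rome", "Paris"]))
# → Florence
```

The point of the sketch is only the interface: a relational fact ("Dante, born-in, Florence") is retrieved by filling a natural-language template, with no schema or explicit triple store.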

Language Models as Knowledge Bases? [article]

Fabio Petroni, Tim Rocktäschel, Patrick Lewis, Anton Bakhtin, Yuxiang Wu, Alexander H. Miller, Sebastian Riedel
2019 arXiv   pre-print
Language models have many advantages over structured knowledge bases: they require no schema engineering, allow practitioners to query about an open class of relations, are easy to extend to more data,  ...  Whilst learning linguistic knowledge, these models may also be storing relational knowledge present in the training data, and may be able to answer queries structured as "fill-in-the-blank" cloze statements  ...  Given the above qualities of language models as potential representations of relational knowledge, we are interested in the relational knowledge already present in pretrained off-the-shelf language models  ... 
arXiv:1909.01066v2 fatcat:vxq4o7qx5fgt3jha4njhpjrwqu

Language Models As or For Knowledge Bases [article]

Simon Razniewski, Andrew Yates, Nora Kassner, Gerhard Weikum
2021 arXiv   pre-print
Pre-trained language models (LMs) have recently gained attention for their potential as an alternative to (or proxy for) explicit knowledge bases (KBs).  ...  In particular, we offer qualitative arguments that latent LMs are not suitable as a substitute for explicit KBs, but could play a major role for augmenting and curating KBs.  ...  Introduction The ability of pre-trained contextual language models (LMs) to capture and retrieve factual knowledge has recently stirred discussion as to what extent LMs could be an alternative to, or at  ... 
arXiv:2110.04888v1 fatcat:x6et52j2zvhfdn7ecebz46hgbq

Knowledgeable or Educated Guess? Revisiting Language Models as Knowledge Bases [article]

Boxi Cao, Hongyu Lin, Xianpei Han, Le Sun, Lingyong Yan, Meng Liao, Tong Xue, Jin Xu
2021 arXiv   pre-print
Previous literature shows that pre-trained masked language models (MLMs) such as BERT can achieve competitive factual knowledge extraction performance on some datasets, indicating that MLMs can potentially  ...  Our findings shed light on the underlying predicting mechanisms of MLMs, and strongly question the previous conclusion that current MLMs can potentially serve as reliable factual knowledge bases.  ...  Related Work The great success of Pre-trained Language Models (PLMs) raises the question of whether PLMs can be directly used as reliable knowledge bases.  ... 
arXiv:2106.09231v1 fatcat:kmfjzrpq65hf3lc5h23x2rwsge

Time-Aware Language Models as Temporal Knowledge Bases [article]

Bhuwan Dhingra, Jeremy R. Cole, Julian Martin Eisenschlos, Daniel Gillick, Jacob Eisenstein, William W. Cohen
2021 arXiv   pre-print
But language models (LMs) are trained on snapshots of data collected at a specific moment in time, and this can limit their utility, especially in the closed-book setting where the pretraining corpus must  ...  We also show that models trained with temporal context can be efficiently "refreshed" as new data arrives, without the need for retraining from scratch.  ...  Introduction Language models (LMs) have recently been suggested as repositories of real-world knowledge (Petroni et al., 2019) and there is much interest in using them for tasks such as closed-book question  ... 
arXiv:2106.15110v1 fatcat:ga35w3cvrzcprckalhzpv5x25i
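The temporal conditioning and "refresh" idea in the entry above can be sketched in miniature. Everything here is an illustrative assumption: the `facts` dict and prompt shape stand in for an LM trained on timestamp-prefixed text; the real work conditions a neural model, not a lookup table.

```python
# Minimal sketch of year-conditioned queries, in the spirit of the
# time-aware LM entry above. A dict of (year, relation) pairs stands
# in for knowledge an LM absorbs from timestamped pretraining text.

facts = {
    (2019, "UK prime minister"): "Theresa May",
    (2021, "UK prime minister"): "Boris Johnson",
}

def query(year: int, relation: str) -> str:
    """Answer a cloze-style query conditioned on a year, mimicking
    prompts such as 'In 2021, the UK prime minister is [MASK]'."""
    return facts.get((year, relation), "unknown")

print(query(2019, "UK prime minister"))  # → Theresa May
print(query(2021, "UK prime minister"))  # → Boris Johnson

# "Refreshing" as new data arrives amounts to adding facts rather
# than retraining from scratch:
facts[(2023, "UK prime minister")] = "Rishi Sunak"
```

The same query now yields different answers depending on the year it is conditioned on, which is the failure mode snapshot-trained LMs cannot express without a temporal context signal.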

A Review on Language Models as Knowledge Bases [article]

Badr AlKhamissi, Millicent Li, Asli Celikyilmaz, Mona Diab, Marjan Ghazvininejad
2022 arXiv   pre-print
Recently, there has been a surge of interest in the NLP community on the use of pretrained Language Models (LMs) as Knowledge Bases (KBs).  ...  The resulting LM can be probed for different kinds of knowledge and thus acting as a KB. This has a major advantage over traditional KBs in that this method requires no human supervision.  ...  A.1.1 External Knowledge Graphs Prior work on LMs-as-KBs involves the explicit creation of external knowledge graphs (KG) that Maillard et al. (2021)  ... 
arXiv:2204.06031v1 fatcat:nrixk5zcrffkdmhrlwifnga6iu

Pre-trained Language Models as Prior Knowledge for Playing Text-based Games [article]

Ishika Singh, Gargi Singh, Ashutosh Modi
2021 arXiv   pre-print
These text-based games are challenging for artificial agents, as they require understanding of, and interaction through, natural language in a partially observable environment.  ...  In this paper, we improve the semantic understanding of the agent by proposing a simple RL with LM framework where we use transformer-based language models with Deep RL models.  ...  Given the requirements for IF games, we propose a pre-trained transformer-based [20] language model (LM) as a candidate for equipping the RL agent with both language understanding capabilities and real-world  ... 
arXiv:2107.08408v2 fatcat:nhju6wsz3rdb3j4impslmiqfda

Language Models as Knowledge Bases: On Entity Representations, Storage Capacity, and Paraphrased Queries [article]

Benjamin Heinzerling, Kentaro Inui
2021 arXiv   pre-print
...  that language models can indeed serve as knowledge bases.  ...  Pretrained language models have been suggested as a possible alternative or complement to structured knowledge bases.  ...  We gave a positive answer to Petroni et al. (2019)'s question of whether language models can serve as knowledge bases.  ... 
arXiv:2008.09036v2 fatcat:dd7vhzkr3besnatvmgyo62o4he

Can Generative Pre-trained Language Models Serve as Knowledge Bases for Closed-book QA? [article]

Cunxiang Wang, Pai Liu, Yue Zhang
2021 arXiv   pre-print
Recent work has investigated the interesting question of using pre-trained language models (PLMs) as knowledge bases for answering open questions.  ...  Some promising directions are found, including decoupling the knowledge memorization process from the QA fine-tuning process, and forcing the model to recall relevant knowledge when answering questions.  ...  ., 2019; is to use pretrained language models (PLMs) as knowledge bases (KBs) and answer questions according to the internal knowledge the model contains.  ... 
arXiv:2106.01561v1 fatcat:fmmmolob4jerplilt7lqibztvu

Language Models as a Knowledge Source for Cognitive Agents [article]

Robert E. Wray III, James R. Kirk, John E. Laird
2021 arXiv   pre-print
...  use to exploit the knowledge within a language model.  ...  The resulting analysis outlines both the challenges and opportunities for using language models as a new knowledge source for cognitive systems.  ...  The authors thank Charles Newton of Soar Technology who provided suggestions and guidance on language models and the anonymous reviewers, who provided incisive feedback and suggestions, including the recommendation  ... 
arXiv:2109.08270v3 fatcat:ijqjydoourak3o67gsp7uzwhwi

Knowledge and Implicature: Modeling Language Understanding as Social Cognition

Noah D. Goodman, Andreas Stuhlmüller
2013 Topics in Cognitive Science  
This model predicts an interaction between the speaker's knowledge state and the listener's interpretation.  ...  Is language understanding a special case of social cognition?  ...  A rational speech-act model We view language comprehension as a rational inference based on an intuitive theory of language production. Our setting is illustrated in Fig. 1.  ... 
doi:10.1111/tops.12007 pmid:23335578 fatcat:32hmc75rznb45k6twjjzbiainm

Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents [article]

Wenlong Huang, Pieter Abbeel, Deepak Pathak, Igor Mordatch
2022 arXiv   pre-print
Can world knowledge learned by large language models (LLMs) be used to act in interactive environments?  ...  The conducted human evaluation reveals a trade-off between executability and correctness but shows a promising sign towards extracting actionable knowledge from language models.  ...  Models Notation Summary: LM_P: text-completion language model (also referred to as the Planning LM); LM_T: text-embedding language model (also referred to as the Translation LM); {(T_i, E_i)}_{i=1}^{N}: demonstration  ... 
arXiv:2201.07207v2 fatcat:2ighvy7jsfaxfllziu4yngv3n4
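The two-model split named in the entry above (a planning LM proposing free-form steps, a translation LM grounding them in admissible actions) can be sketched as a toy. Assumptions throughout: the `ADMISSIBLE_ACTIONS` list, the bag-of-words "embedding", and the canned free-form step are illustrative stand-ins; the paper uses large generative and sentence-embedding models.

```python
# Toy sketch of the Planning-LM / Translation-LM pipeline from the
# entry above: ground a free-form step (as LM_P might emit) to the
# nearest admissible environment action by embedding similarity.

import math
from collections import Counter

ADMISSIBLE_ACTIONS = ["open fridge", "grab milk", "walk to kitchen"]

def embed(text: str) -> Counter:
    """Crude bag-of-words 'embedding' standing in for LM_T."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def translate(free_form_step: str) -> str:
    """Map a free-form planning step to the most similar admissible action."""
    q = embed(free_form_step)
    return max(ADMISSIBLE_ACTIONS, key=lambda act: cosine(q, embed(act)))

# LM_P might propose "open the fridge door"; the translation step grounds it:
print(translate("open the fridge door"))  # → open fridge
```

The design point this illustrates is the trade-off the abstract mentions: the generative model maximizes plausibility of the step, while the translation step enforces executability by snapping to the environment's action set.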

Are Pretrained Language Models Symbolic Reasoners Over Knowledge? [article]

Nora Kassner, Benno Krojer, Hinrich Schütze
2020 arXiv   pre-print
How can pretrained language models (PLMs) learn factual knowledge from the training set? We investigate the two most important mechanisms: reasoning and memorization.  ...  For memorization, we identify schema conformity (facts systematically supported by other facts) and frequency as key factors for its success.  ...  Our setup is similar to link prediction in the knowledge base domain and therefore can be seen as a natural extension of the question: "Language models as knowledge bases?" (Petroni et al., 2019) .  ... 
arXiv:2006.10413v2 fatcat:nesqiwq5zzbelfwlyh7v4xi6wq

Modeling Value Evaluation of Semantics Aided Secondary Language Acquisition as Model Driven Knowledge Management [chapter]

Yucong Duan, Christophe Cruz, Abdelrahman Osman Elfaki, Yang Bai, Wencai Du
2013 Studies in Computational Intelligence  
Thereafter we propose the quantity measure for the improvement of learning efficiency in terms of reuse level for semantics aided secondary language learning from the perspective of value based analysis  ...  Firstly we model the general learning process from cognitive linguistic perspective at the memory level.  ...  Value based analysis on knowledge management model Value based analysis has been adopted as a fundamental approach in linguistics [4] which is used to justify the motivation of the evolution of languages  ... 
doi:10.1007/978-3-319-00804-2_20 fatcat:rs3hlzedovfr7galmzdpe76m2i

Knowledge Graphs as Context Models: Improving the Detection of Cross-Language Plagiarism with Paraphrasing [chapter]

Marc Franco-Salvador, Parth Gupta, Paolo Rosso
2014 Lecture Notes in Computer Science  
In order to improve the paraphrasing detection, we use a knowledge graph-based approach to obtain and compare context models of document fragments in different languages.  ...  models.  ...  Cross-Language Alignment based Similarity Analysis The cross-language alignment-based similarity analysis (CL-ASA) model measures the similarity between two documents d and d′, from two different languages  ... 
doi:10.1007/978-3-642-54798-0_12 fatcat:ncsmncrjqjbnpd5tzmotyoh5xa
Showing results 1 — 15 out of 1,460,278 results