CoCoLM: COmplex COmmonsense Enhanced Language Model with Discourse Relations

Changlong Yu, Hongming Zhang, Yangqiu Song, Wilfred Ng
2022 arXiv pre-print
Large-scale pre-trained language models have demonstrated strong knowledge representation ability. However, recent studies suggest that even though these giant models contain rich simple commonsense knowledge (e.g., birds can fly and fish can swim), they often struggle with complex commonsense knowledge that involves multiple eventualities (verb-centric phrases, e.g., identifying the relationship between "Jim yells at Bob" and "Bob is upset"). To address this problem, in this paper, we propose to help pre-trained language models better incorporate complex commonsense knowledge. Different from existing fine-tuning approaches, we do not focus on a specific task and instead propose a general language model named CoCoLM. Through careful training over the large-scale eventuality knowledge graph ASER, we successfully teach pre-trained language models (i.e., BERT and RoBERTa) rich complex commonsense knowledge among eventualities. Experiments on multiple downstream commonsense tasks that require the correct understanding of eventualities demonstrate the effectiveness of CoCoLM.
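The abstract does not spell out the training objective, but the general recipe it describes (continuing to train BERT/RoBERTa on eventuality pairs drawn from ASER) can be illustrated. Below is a minimal sketch, assuming a continued masked-language-modeling setup over eventuality pairs verbalized with a discourse connective; the variable names and the example pairs are hypothetical illustrations, not the paper's actual data pipeline or loss.

```python
# Hypothetical sketch: continued masked-LM training of RoBERTa on
# eventuality pairs verbalized with a discourse connective, in the
# spirit of training over an ASER-style knowledge graph.
import torch
from transformers import (
    RobertaTokenizer,
    RobertaForMaskedLM,
    DataCollatorForLanguageModeling,
)

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")

# Each (head, connective, tail) triple renders one KG edge as text,
# e.g. a Result relation expressed with the connective "so".
eventuality_pairs = [
    ("Jim yells at Bob", "so", "Bob is upset"),
    ("the bird spreads its wings", "and", "the bird flies away"),
]
texts = [f"{h} {c} {t}" for h, c, t in eventuality_pairs]

encodings = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Randomly mask 15% of tokens and produce MLM labels for them.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
batch = collator([{"input_ids": ids} for ids in encodings["input_ids"]])

# One illustrative optimization step.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
model.train()
loss = model(input_ids=batch["input_ids"], labels=batch["labels"]).loss
loss.backward()
optimizer.step()
```

In such a setup, masking tokens across the connective forces the model to use one eventuality (and the discourse relation) to recover the other, which is one plausible way to instill the kind of multi-eventuality knowledge the paper targets.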
arXiv:2012.15643v2