A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
CoCoLM: COmplex COmmonsense Enhanced Language Model with Discourse Relations
[article]
2022
arXiv
pre-print
Large-scale pre-trained language models have demonstrated strong knowledge representation ability. However, recent studies suggest that even though these giant models contain rich simple commonsense knowledge (e.g., birds can fly and fish can swim), they often struggle with complex commonsense knowledge that involves multiple eventualities (verb-centric phrases, e.g., identifying the relationship between "Jim yells at Bob" and "Bob is upset"). To address this problem, in this paper, we
arXiv:2012.15643v2
fatcat:6uvc5kepdvd6tmpx4vrnvf7q3a