142 Hits in 5.4 sec

ASER: A Large-scale Eventuality Knowledge Graph [article]

Hongming Zhang and Xin Liu and Haojie Pan and Yangqiu Song and Cane Wing-Ki Leung
2020 arXiv   pre-print
To fill this gap, we develop ASER (activities, states, events, and their relations), a large-scale eventuality knowledge graph extracted from more than 11-billion-token unstructured textual data.  ...  However, existing large-scale knowledge graphs mainly focus on knowledge about entities while ignoring knowledge about activities, states, or events, which are used to describe how entities or things act  ...  CONCLUSIONS In this paper, we introduce ASER, a large-scale eventuality knowledge graph. We extract eventualities from texts based on the dependency graphs.  ... 
arXiv:1905.00270v3 fatcat:5nlfm7grnzh3jk23nvlrjq4vmq
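The extraction step mentioned in the snippet (matching hand-crafted patterns over dependency parses) can be sketched in a few lines. The token representation, dependency labels, and the `extract_svo` helper below are illustrative assumptions, not ASER's actual pipeline.

```python
# Toy sketch of pattern-based eventuality extraction from a dependency parse.
# Each token is (text, dep_label, head_index); labels follow Universal
# Dependencies conventions. The single s-v-o pattern here is a drastic
# simplification of ASER's set of hand-crafted patterns.

def extract_svo(tokens):
    """Return (subject, verb, object) triples matching an s-v-o pattern."""
    eventualities = []
    for i, (text, dep, head) in enumerate(tokens):
        if dep == "ROOT":  # treat the root verb as the eventuality head
            subj = next((t for t, d, h in tokens if d == "nsubj" and h == i), None)
            obj = next((t for t, d, h in tokens if d == "obj" and h == i), None)
            if subj and obj:
                eventualities.append((subj, text, obj))
    return eventualities

# "I love dogs": I -nsubj-> love, dogs -obj-> love
parse = [("I", "nsubj", 1), ("love", "ROOT", 1), ("dogs", "obj", 1)]
print(extract_svo(parse))  # [('I', 'love', 'dogs')]
```

In practice the parse would come from a real dependency parser rather than a hand-built list, and many more patterns would be matched.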

ASER: Towards Large-scale Commonsense Knowledge Acquisition via Higher-order Selectional Preference over Eventualities [article]

Hongming Zhang, Xin Liu, Haojie Pan, Haowen Ke, Jiefu Ou, Tianqing Fang, Yangqiu Song
2022 arXiv   pre-print
Following this principle, we develop a large-scale eventuality (a linguistic term covering activity, state, and event)-based knowledge graph ASER, where each eventuality is represented as a dependency  ...  In total, ASER contains 648 million edges between 438 million eventualities.  ...  Design Principles As aforementioned, ASER is a large-scale eventuality-based knowledge graph.  ... 
arXiv:2104.02137v2 fatcat:dmb2dhqzevc2pd2vef4n5ds5ei

DISCOS: Bridging the Gap between Discourse Knowledge and Commonsense Knowledge [article]

Tianqing Fang, Hongming Zhang, Weiqi Wang, Yangqiu Song, Bin He
2021 arXiv   pre-print
Experiments demonstrate that we can successfully convert discourse knowledge over eventualities from ASER, a large-scale discourse knowledge graph, into inferential if-then commonsense knowledge defined  ...  In total, we can acquire 3.4M ATOMIC-like inferential commonsense knowledge by populating ATOMIC on the core part of ASER.  ...  ASER ASER [9], a large-scale eventuality-centric knowledge graph that provides explicit discourse relationships between eventualities, is used as the source of linguistic knowledge G.  ... 
arXiv:2101.00154v1 fatcat:77k6t76u5vgb5d67hdb2hnpbse
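The conversion this abstract describes (discourse edges between eventualities turned into ATOMIC-style if-then triples) can be illustrated with a minimal sketch. The relation mapping and `convert` helper below are assumptions for illustration, not the mapping DISCOS actually learns.

```python
# Minimal sketch of the DISCOS-style idea: a discourse edge between two
# eventualities becomes an ATOMIC-style if-then triple. The mapping from
# discourse relations to ATOMIC relations is an illustrative assumption.

DISCOURSE_TO_ATOMIC = {
    "Result": "xEffect",    # "X is hungry, as a result X eats"
    "Reason": "xIntent",
    "Condition": "xNeed",
}

def convert(head, discourse_rel, tail):
    """Map one discourse edge (head, rel, tail) to an if-then triple, if possible."""
    atomic_rel = DISCOURSE_TO_ATOMIC.get(discourse_rel)
    if atomic_rel is None:
        return None  # no ATOMIC counterpart for this discourse relation
    return (head, atomic_rel, tail)

print(convert("PersonX is hungry", "Result", "PersonX eats"))
# ('PersonX is hungry', 'xEffect', 'PersonX eats')
```

The actual paper trains a neural generator over aligned CSKB/ASER data rather than applying a fixed table, so this captures only the shape of the output.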

Enriching Large-Scale Eventuality Knowledge Graph with Entailment Relations [article]

Changlong Yu, Hongming Zhang, Yangqiu Song, Wilfred Ng, Lifeng Shang
2020 arXiv   pre-print
As a result, we construct a large-scale eventuality entailment graph (EEG), which has 10 million eventuality nodes and 103 million entailment edges.  ...  Detailed experiments and analysis demonstrate the effectiveness of the proposed approach and the quality of the resulting knowledge graph.  ...  Introduction Large-scale real-world knowledge graph construction is critical to the understanding of human language.  ... 
arXiv:2006.11824v1 fatcat:2bgl2kjiejb2ldzljfr6ui4si4

CoCoLM: COmplex COmmonsense Enhanced Language Model with Discourse Relations [article]

Changlong Yu, Hongming Zhang, Yangqiu Song, Wilfred Ng
2022 arXiv   pre-print
Through careful training over the large-scale eventuality knowledge graph ASER, we successfully teach pre-trained language models (i.e., BERT and RoBERTa) rich complex commonsense knowledge among eventualities  ...  Large-scale pre-trained language models have demonstrated strong knowledge representation ability.  ...  Zhang et al. (2020b) build a large-scale eventuality knowledge graph, ASER, by specifying eventuality relations mined from discourse connectives.  ... 
arXiv:2012.15643v2 fatcat:6uvc5kepdvd6tmpx4vrnvf7q3a
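One way to picture how a language model could be trained on eventuality-graph edges, as this abstract describes, is to verbalize each edge into natural-language text using a discourse connective. The templates and `verbalize` helper below are illustrative assumptions, not CoCoLM's actual training procedure.

```python
# Sketch of turning knowledge-graph edges into natural-language training
# text for a masked language model. The connective templates are
# illustrative assumptions chosen to mirror common discourse markers.

CONNECTIVES = {
    "Result": "{head}, as a result, {tail}.",
    "Contrast": "{head}, but {tail}.",
    "Precedence": "{head}, then {tail}.",
}

def verbalize(head, relation, tail):
    """Render one (head, relation, tail) edge as a training sentence."""
    template = CONNECTIVES.get(relation)
    return template.format(head=head, tail=tail) if template else None

print(verbalize("I am hungry", "Result", "I eat"))
# I am hungry, as a result, I eat.
```

Sentences like these could then be fed to a standard masked-LM objective; the paper itself uses a more elaborate margin-based training scheme.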

Benchmarking Commonsense Knowledge Base Population with an Effective Evaluation Dataset [article]

Tianqing Fang, Weiqi Wang, Sehyun Choi, Shibo Hao, Hongming Zhang, Yangqiu Song, Bin He
2021 arXiv   pre-print
In this task, CSKBs are grounded to a large-scale eventuality (activity, state, and event) graph to discriminate whether novel triples from the eventuality graph are plausible or not.  ...  In this paper, we benchmark the CSKB population task with a new large-scale dataset by first aligning four popular CSKBs, and then presenting a high-quality human-annotated evaluation set to probe neural  ...  extracted knowledge graph, and then providing a large-scale human-annotated evaluation set.  ... 
arXiv:2109.07679v1 fatcat:b6or4xew2ne53ojrd3kpvn4lou

From Discourse to Narrative: Knowledge Projection for Event Relation Extraction [article]

Jialong Tang, Hongyu Lin, Meng Liao, Yaojie Lu, Xianpei Han, Le Sun, Weijian Xie, Jin Xu
2021 arXiv   pre-print
Current event-centric knowledge graphs rely heavily on explicit connectives to mine relations between events.  ...  In this paper, we propose a knowledge projection paradigm for event relation extraction: projecting discourse knowledge to narratives by exploiting the commonalities between them.  ...  These methods extract event knowledge from massive raw corpora with little or no human intervention, which makes them scalable solutions to build large-scale  ... 
arXiv:2106.08629v1 fatcat:6murrmqrwbexpecpzwvrq6ufga

TransOMCS: From Linguistic Graphs to Commonsense Knowledge [article]

Hongming Zhang, Daniel Khashabi, Yangqiu Song, Dan Roth
2020 arXiv   pre-print
The result is a conversion of ASER [Zhang et al., 2020], a large-scale selectional preference knowledge resource, into TransOMCS, of the same representation as ConceptNet [Liu and Singh, 2004] but two  ...  Conventional methods of acquiring commonsense knowledge generally require laborious and costly human annotations, which are not feasible on a large scale.  ...  As both the internal structure of eventualities and external relations between eventualities could be converted to commonsense knowledge, we treat all the eventualities and eventuality pairs in ASER as  ... 
arXiv:2005.00206v1 fatcat:zgbyczzkprhtxoanvihdil2wk4

On the Role of Conceptualization in Commonsense Knowledge Graph Construction [article]

Mutian He, Yangqiu Song, Kun Xu, Dong Yu
2020 arXiv   pre-print
Commonsense knowledge graphs (CKGs) like Atomic and ASER are substantially different from conventional KGs as they consist of a much larger number of nodes formed by loosely-structured text, which, though  ...  We build synthetic triples by conceptualization, and further formulate the task as triple classification, handled by a discriminatory model with knowledge transferred from pretrained language models and  ...  ASER ASER (Zhang et al., 2019a) is a large-scale CKG of 194M nodes representing verb-centric eventualities matching certain patterns (e.g., s-v-o for I love dogs and s-v-v-o for I want to eat an apple)  ... 
arXiv:2003.03239v2 fatcat:tyv6n4b2tnfevedx3dh2zei5iu
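The conceptualization idea in the snippet (building synthetic triples by abstracting instances into concepts) can be sketched as follows. The mini concept dictionary and `conceptualize` helper are hypothetical stand-ins for a resource like Probase, not the paper's implementation.

```python
# Toy illustration of conceptualization: replace an instance word in a
# tokenized eventuality with a more abstract concept to form a synthetic
# variant. The mini concept dictionary is purely an assumption standing
# in for a large instance-to-concept resource.

CONCEPTS = {"dogs": "animal", "apple": "food"}

def conceptualize(eventuality):
    """Yield abstracted variants of a tokenized eventuality."""
    for i, word in enumerate(eventuality):
        concept = CONCEPTS.get(word)
        if concept:
            yield eventuality[:i] + [concept] + eventuality[i + 1:]

print(list(conceptualize(["i", "love", "dogs"])))  # [['i', 'love', 'animal']]
```

Each synthetic variant would then be scored by the triple classifier the abstract mentions, to decide whether the abstraction still expresses plausible commonsense.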

How Commonsense Knowledge Helps with Natural Language Tasks: A Survey of Recent Resources and Methodologies [article]

Yubo Xie, Pearl Pu
2021 arXiv   pre-print
In this paper, we give an overview of commonsense reasoning in natural language processing, which requires a deeper understanding of the contexts and usually involves inference over implicit external knowledge  ...  natural language problems that take advantage of external knowledge bases.  ...  ASER (2020) ASER [Zhang et al., 2020] is a large-scale eventuality knowledge graph automatically extracted from more than 11-billion-token unstructured textual data.  ... 
arXiv:2108.04674v1 fatcat:feap4jp4mrhbdfhldylzltpota

Modeling Dense Cross-Modal Interactions for Joint Entity-Relation Extraction

Shan Zhao, Minghao Hu, Zhiping Cai, Fang Liu
2020 Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence  
In this paper, we propose a deep Cross-Modal Attention Network (CMAN) for joint entity and relation extraction.  ... 
doi:10.24963/ijcai.2020/554 dblp:conf/ijcai/ZhangKSR20 fatcat:jztdls6m3bcc7nadccbio37rju

AFEC: A Knowledge Graph Capturing Social Intelligence in Casual Conversations [article]

Yubo Xie, Junze Li, Pearl Pu
2022 arXiv   pre-print
For this body of knowledge to be comprehensive and meaningful, we curated a large-scale corpus from the r/CasualConversation SubReddit.  ...  The knowledge captured in this graph bears potential for conversational systems to understand how people offer acknowledgement, consoling, and a wide range of empathetic responses in social conversations  ...  ASER (Zhang et al., 2020 ) is a large-scale eventuality knowledge graph automatically extracted from more than 11-billion-token unstructured textual data.  ... 
arXiv:2205.10850v1 fatcat:gr4fhh672bcghnb6ldhqgbgp5m

Extracting event and their relations from texts: A survey on recent research progress and challenges

Kang Liu, Yubo Chen, Jian Liu, Xinyu Zuo, Jun Zhao
2020 AI Open  
ABSTRACT Event is a common but non-negligible knowledge type.  ...  This paper summarizes some constructed event-centric knowledge graphs and the recent typical approaches for event and event relation extraction, besides task description, widely used evaluation datasets  ...  ASER designed several high-quality patterns based on dependency parsing results and extracts all eventualities over large-scale corpora.  ... 
doi:10.1016/j.aiopen.2021.02.004 fatcat:qxbcmk55vzcb5nznhgfgwrbe4u

Research and Science Today No. 1(5)/2013

Flavius Marcau, Elena Adam, Elena Triscas, Raluca-Maria Nicoara, Ruben Ioan Ivan, Paul Duta, Ioan Panait, Viorella Manolache, Alin Andronache, Andreea Trandafir, Anda Taropa-Iacob, Andreea-Emilia Duta (+20 others)
2013 Social Science Research Network  
These presents were meant to show the good intentions of Han emperor towards his counterpart and, eventually, to reach a consensus for a bilateral collaboration or treaty.  ...  has its dichotomy, the exiled ones have their dichotomies and at a microscopic scale each dichotomy has its typologies.  ...  The journal promotes original studies contributing to the progress of knowledge and it is motivated by the need to address issues of theory and practice in the areas mentioned above.  ... 
doi:10.2139/ssrn.2245816 fatcat:7q63qqphwzca7peb4bpfwf2hea

A Survey of Knowledge-Intensive NLP with Pre-Trained Language Models [article]

Da Yin, Li Dong, Hao Cheng, Xiaodong Liu, Kai-Wei Chang, Furu Wei, Jianfeng Gao
2022 arXiv   pre-print
To address this challenge, large numbers of pre-trained language models augmented with external knowledge sources have been proposed and are in rapid development.  ...  NLP tasks, and knowledge fusion methods.  ...  Even if we compose large-scale commonsense knowledge sources with the help of knowledge acquisition methods, we are still likely to miss a large body of commonsense knowledge used in our daily life.  ... 
arXiv:2202.08772v1 fatcat:zayodfdoq5gk7imgvo6awfadui
Showing results 1 — 15 out of 142 results