591 Hits in 4.5 sec

Enhancing Pointer Network for Sentence Ordering with Pairwise Ordering Predictions

Yongjing Yin, Fandong Meng, Jinsong Su, Yubin Ge, Lingeng Song, Jie Zhou, Jiebo Luo
2020 Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
To address this deficiency, we propose to enhance the pointer network decoder by using two pairwise ordering prediction modules: The FUTURE module predicts the relative orientations of other unordered  ...  Dominant sentence ordering models use a pointer network decoder to generate ordering sequences in a left-to-right fashion.  ...
doi:10.1609/aaai.v34i05.6492 fatcat:ctysouwm2rb33id5f6a6ornfnu
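The left-to-right pointer-network decoding described in this abstract can be sketched as follows. This is a minimal plain-Python illustration: `pointer_decode`, the dot-product scoring, and the averaged state update are hypothetical stand-ins for the paper's trained encoder-decoder, not its actual model.

```python
def pointer_decode(sent_embs, query):
    """Greedily emit an ordering: at each step, pick the unused sentence
    whose embedding scores highest against the current decoder query,
    then fold the chosen embedding into the query (a crude stand-in
    for an RNN decoder state update)."""
    order, used = [], set()
    for _ in range(len(sent_embs)):
        scores = {
            i: sum(q * e for q, e in zip(query, emb))
            for i, emb in enumerate(sent_embs)
            if i not in used
        }
        nxt = max(scores, key=scores.get)  # "point" at the best remaining sentence
        order.append(nxt)
        used.add(nxt)
        query = [0.5 * q + 0.5 * e for q, e in zip(query, sent_embs[nxt])]
    return order
```

Because each step conditions only on what has already been emitted, an early mistake propagates through the rest of the sequence; this is the deficiency the pairwise ordering modules above are designed to address.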

Local and Global Context-Based Pairwise Models for Sentence Ordering [article]

Ruskin Raj Manku, Aditya Jyoti Paul
2021 arXiv   pre-print
Sentence Ordering refers to the task of rearranging a set of sentences into the appropriate coherent order.  ...  Our proposed encoding method utilizes the paragraph's rich global contextual information to predict the pairwise order using novel transformer architectures.  ...  Farahnak and Kosseim (2021) enhance the pointer network with conditional sentence representations. Recently there has been a surge in the use of pairwise models.  ...
arXiv:2110.04291v1 fatcat:ilwqxxl4pjcutkuojvx7dcw6b4
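Pairwise models like those surveyed here reduce ordering to before/after predictions for each sentence pair. One simple way to aggregate such predictions into a full order is to sort sentences by their total "precedes" probability mass; this sketch is illustrative only, whereas the papers use trained transformer classifiers and more careful decoding:

```python
def order_from_pairwise(n, prob_before):
    """prob_before[(i, j)] = P(sentence i precedes sentence j).
    Score each sentence by how strongly it precedes the others,
    then sort descending: more 'precedes' mass means earlier."""
    score = {i: 0.0 for i in range(n)}
    for (i, j), p in prob_before.items():
        score[i] += p        # evidence that i comes earlier
        score[j] += 1.0 - p  # complementary evidence for j
    return sorted(range(n), key=lambda i: -score[i])
```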

Improving Graph-based Sentence Ordering with Iteratively Predicted Pairwise Orderings [article]

Shaopeng Lai, Ante Wang, Fandong Meng, Jie Zhou, Yubin Ge, Jiali Zeng, Junfeng Yao, Degen Huang, Jinsong Su
2021 arXiv   pre-print
In this paper, we propose a novel sentence ordering framework which introduces two classifiers to make better use of pairwise orderings for graph-based sentence ordering.  ...  Specifically, given an initial sentence-entity graph, we first introduce a graph-based classifier to predict pairwise orderings between linked sentences.  ...
arXiv:2110.06446v1 fatcat:j37w65w7w5ayphopita2wbv5ly

Topic-Guided Coherence Modeling for Sentence Ordering by Preserving Global and Local Information

Byungkook Oh, Seungmin Seo, Cheolheon Shin, Eunju Jo, Kyong-Ho Lee
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
With the coherent topical context matching, we promote local dependencies that help identify the tight semantic connections for sentence ordering.  ...  Moreover, to predict the next sentence, we capture topic-enhanced sentence-pair interactions between the current predicted sentence and each next-sentence candidate.  ...  It then predicts the next sentence from topic-enhanced sentence-pair interactions with the coherent topical context as local information.  ... 
doi:10.18653/v1/d19-1232 dblp:conf/emnlp/OhSSJL19 fatcat:wkp764xugrhorjtsnymo5rpp54

Pruned Graph Neural Network for Short Story Ordering [article]

Melika Golestani, Zeinab Borhanifard, Farnaz Tahmasebian, Heshaam Faili
2022 arXiv   pre-print
This paper proposes a new graph neural network-based approach to encode a set of sentences and learn orderings of short stories.  ...  We propose a new method for constructing sentence-entity graphs of short stories to create the edges between sentences and reduce noise in our graph by replacing the pronouns with their referring entities  ...  ., 2020) , a pointer network enhanced with two pairwise ordering prediction modules, FUTURE and HISTORY, is employed to decode paragraphs.  ...
arXiv:2203.06778v1 fatcat:kauj2v2fazeilesgoc3jmea6ju

Hierarchical Attention Networks for Sentence Ordering

Tianming Wang, Xiaojun Wan
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)
Sentence ordering, the goal of which is to organize a set of sentences into a coherent text, is a task commonly used to train and evaluate coherence models.  ...  In this paper, we propose a novel hierarchical attention network that captures word clues and dependencies between sentences to address this problem.  ...
doi:10.1609/aaai.v33i01.33017184 fatcat:oeiihuxbmrcbfcwf3pkb32grvy

BERT4SO: Neural Sentence Ordering by Fine-tuning BERT [article]

Yutao Zhu, Jian-Yun Nie, Kun Zhou, Shengchao Liu, Yabo Ling, Pan Du
2021 arXiv   pre-print
In this work, we propose a new method, named BERT4SO, by fine-tuning BERT for sentence ordering.  ...  Sentence ordering aims to arrange the sentences of a given text in the correct order. Recent work frames it as a ranking problem and applies deep neural networks to it.  ...  One drawback of such approaches is that the current time step prediction depends on the previous predictions, making it difficult to order a large set of sentences.  ...
arXiv:2103.13584v3 fatcat:o2hgpwecanhwncmq5bt7vw6uzu
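Fine-tuning a BERT-style model on an ordering task requires packing the candidate sentences into a single token sequence. The sketch below shows one plausible input construction; the special-token layout and the `build_input` helper are assumptions for illustration, not BERT4SO's exact format:

```python
def build_input(sentences):
    """Concatenate candidate sentences into one BERT-style token list,
    with a leading [CLS] and a [SEP] closing each sentence, so a single
    forward pass can score the whole candidate sequence."""
    tokens = ["[CLS]"]
    for s in sentences:
        tokens.extend(s.split())  # toy whitespace tokenizer
        tokens.append("[SEP]")
    return tokens
```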

A New Sentence Ordering Method Using BERT Pretrained Model [article]

Melika Golestani, Seyedeh Zahra Razavi, Heshaam Faili
2021 arXiv   pre-print
The task of sentence ordering is proposed to learn the succession of events, with applications in AI tasks.  ...  In this paper, we propose a method for sentence ordering which does not need a training phase and, consequently, a large corpus for learning.  ...  It is a pairwise model which uses a Pointer Network for sentence ordering. We re-implement these methods on a part of ROCStories to demonstrate the effect of lack of data on model training.  ...
arXiv:2108.11994v1 fatcat:nsbtpu46bvel7bbajpr7ti4iry

Online Conversation Disentanglement with Pointer Networks [article]

Tao Yu, Shafiq Joty
2020 arXiv   pre-print
Our experiments on the Ubuntu IRC dataset show that our method achieves state-of-the-art performance in both link and conversation prediction tasks.  ...  In this work, we propose an end-to-end online framework for conversation disentanglement that avoids time-consuming domain-specific feature engineering.  ...  For the higher-order information, we train a binary pairwise classifier that decides whether two utterances should be in the same conversation.  ... 
arXiv:2010.11080v1 fatcat:555fv2rufvdibkca42y3myqjuy

Searching for Effective Neural Extractive Summarization: What Works and What's Next

Ming Zhong, Pengfei Liu, Danqing Wang, Xipeng Qiu, Xuanjing Huang
2019 Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics  
Recent years have seen remarkable success in the use of deep neural networks on text summarization.  ...  Hopefully, our work could provide more clues for future research on extractive summarization. Source code will be available on GitHub.  ...
doi:10.18653/v1/p19-1100 dblp:conf/acl/ZhongLWQH19 fatcat:rldnxmzjfrhs3bjjer7vpqvd5y

Weak Supervision helps Emergence of Word-Object Alignment and improves Vision-Language Tasks [article]

Corentin Kervadec, Grigory Antipov, Moez Baccouche, Christian Wolf
2019 arXiv   pre-print
In particular, this new learning signal allows obtaining SOTA-level performances on GQA dataset (VQA task) with pre-trained models without finetuning on the task, and a new SOTA on NLVR2 dataset (Language-driven  ...  Indeed, such relations are frequently assumed to be implicitly learned during training from application-specific losses, mostly cross-entropy for classification.  ...  Therefore, we choose to combine these descriptions in order to obtain sentences with one, two or three pointers.  ... 
arXiv:1912.03063v1 fatcat:ete5vlbecfh67e324zfeo7qahu

Multi-Fact Correction in Abstractive Text Summarization [article]

Yue Dong, Shuohang Wang, Zhe Gan, Yu Cheng, Jackie Chi Kit Cheung, Jingjing Liu
2020 arXiv   pre-print
However, system-generated abstractive summaries often face the pitfall of factual inconsistency: generating incorrect facts with respect to the source text.  ...  Our models employ single or multi-masking strategies to either iteratively or auto-regressively replace entities in order to ensure semantic consistency w.r.t. the source text, while retaining the syntactic  ...  Based on h t , we use a two-pointer network to predict the start and end positions of the answer entity in the source (encoder's hidden states).  ... 
arXiv:2010.02443v1 fatcat:5o7thl7smbewlauzaijlm7y3zu
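The two-pointer network mentioned in this snippet predicts start and end positions of the answer entity. Given per-position start and end scores, the best span under the constraint end >= start can be found by exhaustive search; the scores here are plain inputs, a stand-in for the network outputs computed over the encoder's hidden states:

```python
def best_span(start_scores, end_scores):
    """Return (start, end) maximizing start_scores[s] + end_scores[e]
    subject to e >= s, i.e. the end pointer may not precede the start."""
    best, best_score = (0, 0), float("-inf")
    for s, ss in enumerate(start_scores):
        for e in range(s, len(end_scores)):
            if ss + end_scores[e] > best_score:
                best_score = ss + end_scores[e]
                best = (s, e)
    return best
```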

Don't Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization

Shashi Narayan, Shay B. Cohen, Mirella Lapata
2018 Zenodo  
The idea is to create a short, one-sentence news summary answering the question "What is the article about?".  ...  We propose a novel abstractive model which is conditioned on the article's topics and based entirely on convolutional neural networks.  ...  Our topic-enhanced model calibrates long-range dependencies with globally salient content.  ...
doi:10.5281/zenodo.2399762 fatcat:747rozi4wjgw7f4oaw3q2dqjre

Don't Give Me the Details, Just the Summary! Topic-Aware Convolutional Neural Networks for Extreme Summarization [article]

Shashi Narayan and Shay B. Cohen and Mirella Lapata
2018 arXiv   pre-print
The idea is to create a short, one-sentence news summary answering the question "What is the article about?".  ...  We propose a novel abstractive model which is conditioned on the article's topics and based entirely on convolutional neural networks.  ...  Our topic-enhanced model calibrates long-range dependencies with globally salient content.  ...
arXiv:1808.08745v1 fatcat:3daenro2tndr3ehgq3bbky3d7q

Point at the Triple: Generation of Text Summaries from Knowledge Base Triples

Pavlos Vougiouklis, Eddy Maddalena, Jonathon Hare, Elena Simperl
2020 The Journal of Artificial Intelligence Research  
Our approach is based on a pointer-generator network, which, in addition to generating regular words from a fixed target vocabulary, is able to verbalise triples in several ways.  ...  .05 using pairwise one-way ANOVA of a system against Triples2GRU, Pointer-Generator and Ours w/o Surf.  ...
doi:10.1613/jair.1.11694 fatcat:3ikrw3lzunhanlo6xumhbyv3pu
Showing results 1–15 of 591