A Systematic Survey on Multi-document Text Summarization
2021
International Journal of Advanced Trends in Computer Science and Engineering
A few of the most popular approaches, such as graph-based, cluster-based, and deep learning-based summarization techniques, are discussed here along with the evaluation metrics, which can provide an insight ...
abstractive summarization). ...
Ranking Sentences for Extractive Summarization with Reinforcement Learning, Proceedings of ...
The importance of text summarization has increased in recent years because of the enormous ...
doi:10.30534/ijatcse/2021/111062021
fatcat:rs7d7bltbba6nj5ph3tx3hpgwm
Deep Reinforcement Learning with Distributional Semantic Rewards for Abstractive Summarization
[article]
2019
arXiv
pre-print
Deep reinforcement learning (RL) has been a commonly-used strategy for the abstractive summarization task to address both the exposure bias and non-differentiable task issues. ...
With distributional semantics, sentence-level evaluation can be obtained, and semantically-correct phrases can also be generated without being limited to the surface form of the reference sentences. ...
Each short article is paired with a single-sentence summary. We use the OpenNMT-provided version; it contains 3.8M training and 189k development instances. ...
arXiv:1909.00141v2
fatcat:japt3clevfhetf6ksv7dqx44qa
Sentence Simplification with Deep Reinforcement Learning
[article]
2017
arXiv
pre-print
We address the simplification problem with an encoder-decoder model coupled with a deep reinforcement learning framework. ...
Our model, which we call DRESS (as shorthand for Deep REinforcement Sentence Simplification), explores the space of possible simplifications while learning to optimize a reward function that encourages ...
We are also grateful to Shashi Narayan for supplying us with the output of his system and Wei Xu for her help with this work. ...
arXiv:1703.10931v2
fatcat:xtpuuost3fcghisgo55dajuhn4
Multi-document Summarization via Deep Learning Techniques: A Survey
[article]
2021
arXiv
pre-print
Our survey, the first of its kind, systematically overviews the recent deep learning based MDS models. ...
We propose a novel taxonomy to summarize the design strategies of neural networks and conduct a comprehensive summary of the state-of-the-art. ...
Reinforcement Learning for Multi-document Summarization: Reinforcement learning [84, 85, 112] is a family of algorithms for dealing with sequential decision problems. ...
arXiv:2011.04843v3
fatcat:zfi52xxef5g2tjkaw6hgjpwa5i
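Several of the entries above cast summarization as reinforcement learning over sequential keep/skip decisions, one per sentence. A minimal REINFORCE-with-baseline sketch of that framing; the unigram-F1 reward and one-probability-per-sentence Bernoulli policy are simplifications of mine, not any surveyed model:

```python
import random

def rouge1_f(candidate, reference):
    """Unigram-overlap F1: a crude stand-in for the ROUGE reward RL summarizers optimize."""
    cand, ref = set(candidate.split()), set(reference.split())
    overlap = len(cand & ref)
    if overlap == 0:
        return 0.0
    p, r = overlap / len(cand), overlap / len(ref)
    return 2 * p * r / (p + r)

def sample_actions(probs, rng=random):
    """One episode: a keep (1) or skip (0) decision for each document sentence."""
    return [1 if rng.random() < p else 0 for p in probs]

def reinforce_step(probs, actions, reward, baseline, lr=0.1):
    """REINFORCE with a baseline: (action - prob) is the log-likelihood gradient
    w.r.t. the logit of a Bernoulli policy; it is scaled by the advantage."""
    advantage = reward - baseline
    return [min(0.99, max(0.01, p + lr * (a - p) * advantage))
            for p, a in zip(probs, actions)]
```

A real system would score sentences with a neural encoder and reward the sampled summary with ROUGE against a reference; here, selection probabilities simply rise for sentences kept in episodes whose reward beats the baseline.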
Sentence Simplification with Deep Reinforcement Learning
2017
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
We address the simplification problem with an encoder-decoder model coupled with a deep reinforcement learning framework. ...
Our model, which we call DRESS (as shorthand for Deep REinforcement Sentence Simplification), explores the space of possible simplifications while learning to optimize a reward function that encourages ...
We are also grateful to Shashi Narayan for supplying us with the output of his system and Wei Xu for her help with this work. ...
doi:10.18653/v1/d17-1062
dblp:conf/emnlp/ZhangL17
fatcat:6335p7ckunaqzolublbp2a2oxa
Improving Automatic Source Code Summarization via Deep Reinforcement Learning
[article]
2018
arXiv
pre-print
In this paper, we incorporate an abstract syntax tree structure as well as sequential content of code snippets into a deep reinforcement learning framework (i.e., actor-critic network). ...
Comprehensive experiments on a real-world dataset show the effectiveness of our proposed model when compared with some state-of-the-art methods. ...
Figure 4: An overview of our proposed deep reinforcement learning framework for code summarization. ...
arXiv:1811.07234v1
fatcat:zs6pkkbtpfg2fbloakiw3pnzv4
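The entry above feeds an abstract syntax tree as well as the token sequence of a code snippet into an actor-critic network. A toy illustration of the structural half using Python's standard `ast` module; this linearization is only an assumed stand-in, not the paper's actual encoding:

```python
import ast

def ast_node_sequence(code):
    """Linearize a snippet's AST into a node-type sequence, a toy stand-in for
    the structural input a code summarizer consumes alongside surface tokens."""
    return [type(node).__name__ for node in ast.walk(ast.parse(code))]

def hybrid_input(code):
    """Pair the sequential content (tokens) with the AST structure, mirroring
    the two input views the snippet describes."""
    return {"tokens": code.split(), "structure": ast_node_sequence(code)}
```

For `"x = 1"` the structure view yields node types such as `Module` and `Assign`, which a structural encoder can embed independently of the raw tokens.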
Deep Reinforcement Learning with Distributional Semantic Rewards for Abstractive Summarization
2019
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Deep reinforcement learning (RL) has been a commonly-used strategy for the abstractive summarization task to address both the exposure bias and non-differentiable task issues. ...
With distributional semantics, sentence-level evaluation can be obtained, and semantically-correct phrases can also be generated without being limited to the surface form of the reference sentences. ...
Each short article is paired with a single-sentence summary. We use the OpenNMT-provided version; it contains 3.8M training and 189k development instances. ...
doi:10.18653/v1/d19-1623
dblp:conf/emnlp/LiLQW19
fatcat:ujcmfbc46jahhldjui5cdcqjtm
Sequential Pattern Mining and Deep Learning to Enhance Readability of Indonesian Text Summarization
2019
International Journal of Advanced Trends in Computer Science and Engineering
The aim of this study is to comprehensively and systematically investigate the literature related to text summarization, in particular to prepare an efficient algorithm for Indonesian text using Deep Learning and Sequential Pattern Mining. ...
Figure 7 shows the types of DL that have been used for NLP and text summarization, including unsupervised deep learning [22], [88]; a deep reinforced model with a reinforcement algorithm [89], [90]; deep ...
doi:10.30534/ijatcse/2019/78862019
fatcat:my2kdyry4fh33kld4sbupzohfq
A Text Abstraction Summary Model Based on BERT Word Embedding and Reinforcement Learning
2019
Applied Sciences
On this basis, we propose a novel hybrid extractive-abstractive model that combines BERT (Bidirectional Encoder Representations from Transformers) word embedding with reinforcement learning. ...
There are two existing approaches to the text summarization task: abstractive and extractive. ...
The extractive network is used to extract the sentences with clear semantics from the input sequence; the abstractive network then summarizes the selected sentences and generates the final text summary ...
doi:10.3390/app9214701
fatcat:bv5obrrltfcjxjyd6twklb2zku
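The extract-then-abstract pipeline this entry describes can be sketched in miniature. The sentence scorer below uses plain keyword overlap instead of BERT embeddings, and the abstractive network is reduced to a placeholder join; both are assumptions for illustration only:

```python
def extract(sentences, salient_terms, k=2):
    """Extractive stage: keep the k sentences containing the most salient terms
    (keyword overlap stands in for a BERT-based relevance score)."""
    scored = sorted(sentences, key=lambda s: -sum(t in s.lower() for t in salient_terms))
    return scored[:k]

def abstract(selected):
    """Abstractive stage placeholder: a real system runs a seq2seq generator here."""
    return " ".join(selected)

def summarize(sentences, salient_terms, k=2):
    """Hybrid pipeline: extract salient sentences, then rewrite them into a summary."""
    return abstract(extract(sentences, salient_terms, k))
```

In the paper's setup, `extract` would be the extractive network over BERT embeddings and `abstract` a learned generator tuned with reinforcement learning; the pipeline shape, extraction feeding abstraction, is the point of the sketch.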
A condense-then-select strategy for text summarization
2021
Knowledge-Based Systems
Select-then-compress is a popular hybrid framework for text summarization due to its high efficiency. ...
To address this limitation, we propose a novel condense-then-select framework for text summarization. Our framework first concurrently condenses each document sentence. ...
It compresses each selected sentence by removing words from the sentence. • Pointer-Gen. + Cov [2]: the pointer-generator network with the coverage mechanism. • Deep-Reinforce [3]: an abstractive method ...
doi:10.1016/j.knosys.2021.107235
fatcat:md4xip7l5bam3hamyjnmnk2liq
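The condense-then-select ordering above can be sketched as follows; the stopword-dropping condenser and the greedy budgeted selector are simplifications of mine, not the paper's learned modules:

```python
STOPWORDS = {"the", "a", "an", "of", "very", "really"}

def condense(sentence):
    """Condense stage: compress a sentence by dropping words (here, stopwords;
    the paper learns which words to remove)."""
    return " ".join(w for w in sentence.split() if w.lower() not in STOPWORDS)

def select(condensed, budget):
    """Select stage: greedily pick condensed sentences until the word budget is spent."""
    summary, used = [], 0
    for s in condensed:
        n = len(s.split())
        if used + n <= budget:
            summary.append(s)
            used += n
    return summary

def summarize(sentences, budget):
    """Condense every sentence first, then select among the compressed candidates."""
    return select([condense(s) for s in sentences], budget)
```

Condensing every sentence before selection, as the snippet describes, lets the selector compare already-compressed candidates rather than compressing only what was selected.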
A Survey on Neural Network-Based Summarization Methods
[article]
2018
arXiv
pre-print
In addition, we discuss the related techniques that can be applied to the summarization tasks and present promising paths for future research in neural-based summarization. ...
The aim of this literature review is to survey the recent work on neural-based models in automatic text summarization. ...
This encoder-decoder model, called Deep REinforcement Sentence Simplification (DRESS), is trained with the reinforcement learning method that optimizes a task-specific discrete reward function. ...
arXiv:1804.04589v1
fatcat:kx7pzfvunnelnpe5x5pamfyuxa
A Deep Reinforced Model for Zero-Shot Cross-Lingual Summarization with Bilingual Semantic Similarity Rewards
[article]
2020
arXiv
pre-print
In addition, we find that reinforcement learning models with bilingual semantic similarity as rewards generate more fluent sentences than strong baselines. ...
In this work, we propose an end-to-end cross-lingual text summarization model. ...
We also thank Ruihan Zhai, Zhi-Hao Zhou for the help with human evaluation and Anurag Katakkar for post-editing the German-English dataset. ...
arXiv:2006.15454v1
fatcat:6c6f53eavbekhk2dsdssvfyrgu
A Deep Reinforced Model for Zero-Shot Cross-Lingual Summarization with Bilingual Semantic Similarity Rewards
2020
Proceedings of the Fourth Workshop on Neural Generation and Translation
In addition, we find that reinforcement learning models with bilingual semantic similarity as rewards generate more fluent sentences than strong baselines. ...
In this work, we propose an end-to-end cross-lingual text summarization model. ...
We also thank Ruihan Zhai, Zhi-Hao Zhou for the help with human evaluation and Anurag Katakkar for post-editing the dataset. ...
doi:10.18653/v1/2020.ngt-1.7
dblp:conf/aclnmt/DouKT20
fatcat:crkajn56lfbsvobfd2v2pkocoy
A Hybrid Model for Paraphrase Detection Combines pros of Text Similarity with Deep Learning
2019
International Journal of Computer Applications
This paper proposes a hybrid model that combines the text similarity approach with a deep learning approach in order to improve paraphrase detection. ...
The model was verified on the Microsoft Research Paraphrase Corpus (MSRP) dataset; it achieves an accuracy of about 76.6% and an F-measure of about 83.5%. ...
... with dynamic pooling: 76.8 % accuracy, 83.6 % F-measure; [21] Deep Learning, simple distributional semantic space: 73 % accuracy, 82.3 % F-measure; [22] Deep Learning, multi-perspective convolutional NNs and structured ...
doi:10.5120/ijca2019919011
fatcat:jjl3modl6vbn7kfuddibziantu
Joint Entity and Relation Extraction with a Hybrid Transformer and Reinforcement Learning Based Model
2020
PROCEEDINGS OF THE THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE AND THE TWENTY-EIGHTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE
We propose a hybrid deep neural network model to jointly extract the entities and relations, and the model is also capable of filtering noisy data. ...
The hybrid model contains a transformer-based encoding layer, an LSTM entity detection module and a reinforcement learning-based relation classification module. ...
To address the issues of joint extraction and noisy filtration, we propose a hybrid deep neural network model in this paper. ...
doi:10.1609/aaai.v34i05.6471
fatcat:6byt7sgq5zamzewwp6xjxyxcgq
Showing results 1 — 15 out of 8,787 results