Neural Latent Extractive Document Summarization
[article]
2018
arXiv
pre-print
Extractive summarization models require sentence-level labels, which are usually created heuristically (e.g., with rule-based methods) given that most summarization datasets only have document-summary ...
Since these labels might be suboptimal, we propose a latent variable extractive model where sentences are viewed as latent variables and sentences with activated variables are used to infer gold summaries ...
Model: We first introduce the neural extractive summarization model upon which our latent model is based. ...
arXiv:1808.07187v2
fatcat:bt5c7mo62fdzdcb5ujvzperoky
Neural Latent Extractive Document Summarization
2018
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Extractive summarization models require sentence-level labels, which are usually created heuristically (e.g., with rule-based methods) given that most summarization datasets only have document-summary ...
Since these labels might be suboptimal, we propose a latent variable extractive model where sentences are viewed as latent variables and sentences with activated variables are used to infer gold summaries ...
Model: We first introduce the neural extractive summarization model upon which our latent model is based. ...
doi:10.18653/v1/d18-1088
dblp:conf/emnlp/ZhangLWZ18
fatcat:35vmelf5cfea3axksczpvti3iy
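The latent-variable view described in the two entries above (each sentence carries a binary latent variable, and the activated subset should reconstruct the gold summary) can be illustrated with a small, hedged sketch. This is not the authors' model: the fixed sentence probabilities, the toy document, and the unigram-recall "reward" below are stand-ins for what a trained neural encoder and the paper's training objective would provide.

```python
# Minimal sketch (not the paper's code) of extractive summarization with
# sentences as binary latent variables: sample an "activation" per sentence
# and score how well the activated sentences cover the gold summary.
import numpy as np

rng = np.random.default_rng(0)

def unigram_recall(selected_sents, gold_summary):
    """Crude stand-in for the reconstruction objective: fraction of gold
    summary tokens covered by the selected sentences."""
    selected = set(w for s in selected_sents for w in s.lower().split())
    gold = gold_summary.lower().split()
    return sum(w in selected for w in gold) / max(len(gold), 1)

def sample_extraction(probs):
    """Sample one binary latent variable per sentence (Bernoulli)."""
    return rng.random(len(probs)) < probs

doc = ["The cat sat on the mat .",
       "It was a sunny day .",
       "The cat then chased a mouse ."]
gold = "A cat sat on a mat and chased a mouse ."

# Hypothetical per-sentence extraction probabilities that a neural encoder
# would produce; here they are fixed numbers purely for illustration.
probs = np.array([0.8, 0.2, 0.7])

mask = sample_extraction(probs)
reward = unigram_recall([s for s, m in zip(doc, mask) if m], gold)
print("activated sentences:", mask, "reward:", round(reward, 2))
# In the paper, a reward of this kind would drive learning (e.g., via a
# REINFORCE-style estimator) instead of relying on heuristic sentence labels.
```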
Enhancing Extractive Text Summarization with Topic-Aware Graph Neural Networks
[article]
2020
arXiv
pre-print
Moreover, our model integrates a joint neural topic model (NTM) to discover latent topics, which can provide document-level features for sentence selection. ...
To address these issues, this paper proposes a graph neural network (GNN)-based extractive summarization model, enabling it to capture inter-sentence relationships efficiently via graph-structured document ...
help summarize documents. We propose a novel graph-based neural extractive summarization model, which innovatively incorporates latent topics into graph propagation via a joint neural topic model. ...
arXiv:2010.06253v1
fatcat:kxlj4h2cszhcvh3uis5vlfs7ni
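As a hedged illustration of the topic-aware idea in the entry above (latent topics from a joint neural topic model supplying document-level features for sentence selection), the sketch below blends a toy document-level topic vector with per-sentence base scores. The vectors, the scores, and the blending weight alpha are assumptions for illustration only, not the paper's architecture.

```python
# Illustrative sketch: combine a document-level topic vector (as a neural
# topic model might produce) with per-sentence salience scores.
import numpy as np

def topic_aware_scores(sent_vecs, doc_topic_vec, base_scores, alpha=0.5):
    """Blend a base salience score with the cosine similarity between each
    sentence vector and the document's latent topic vector."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))
    topic_sims = np.array([cos(v, doc_topic_vec) for v in sent_vecs])
    return alpha * base_scores + (1 - alpha) * topic_sims

# Toy numbers standing in for encoder/topic-model outputs.
sent_vecs = np.array([[0.9, 0.1, 0.0],
                      [0.1, 0.8, 0.1],
                      [0.6, 0.3, 0.1]])
doc_topic_vec = np.array([0.7, 0.2, 0.1])   # would come from the joint NTM
base_scores = np.array([0.4, 0.5, 0.3])     # would come from the sentence encoder

print(topic_aware_scores(sent_vecs, doc_topic_vec, base_scores))
```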
Topic-Aware Encoding for Extractive Summarization
[article]
2022
arXiv
pre-print
The Sequence-to-Sequence (Seq2Seq) based neural summarization model is the most widely used in the summarization field due to its high performance. ...
Document summarization provides an instrument for faster understanding the collection of text documents and has several real-life applications. ...
Our Model: In this section, we describe our framework that leverages latent topics in document summarization. ...
arXiv:2112.09572v3
fatcat:oqnrrn37ufejtl34j3dgv3my6u
Improved Document Modelling with a Neural Discourse Parser
[article]
2019
arXiv
pre-print
For abstractive summarization, for instance, conventional neural models simply match source documents and the summary in a latent space without explicit representation of text structure or relations. ...
In this paper, we propose to use neural discourse representations obtained from a rhetorical structure theory (RST) parser to enhance document representations. ...
Unlike extractive summarization, abstractive summarization has the ability to create new sentences that are not in the original document; it is closer to how humans summarize, in that it generates paraphrases ...
arXiv:1911.06919v1
fatcat:zhd6e2rk4bf7tialzow5kcu754
Investigating Entropy for Extractive Document Summarization
[article]
2021
arXiv
pre-print
Ergo, informativeness is the prime attribute of a document summary generated by an algorithm, and selecting sentences that capture the essence of a document is the primary goal of extractive document summarization ...
We present an information theoretic interpretation of the computed entropy, which is the bedrock of the proposed E-Summ algorithm, an unsupervised method for extractive document summarization. ...
of neural summarization methods. ...
arXiv:2109.10886v1
fatcat:32ka7ezyuzchndduv2khbwpgaa
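For the entropy-based entry above, a minimal sketch of the general idea (not the E-Summ algorithm itself) is to treat each sentence's term frequencies as a probability distribution and use Shannon entropy as an unsupervised informativeness signal. The toy sentences below are illustrative.

```python
# Illustrative sketch: rank sentences by the Shannon entropy of their term
# distribution, a rough unsupervised proxy for informativeness.
import math
from collections import Counter

def sentence_entropy(sentence):
    tokens = sentence.lower().split()
    counts = Counter(tokens)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

doc = ["The cat sat on the mat .",
       "Stock prices rose sharply after the earnings report surprised analysts .",
       "It was nice ."]

# Higher entropy here simply means a more varied term distribution.
for s in sorted(doc, key=sentence_entropy, reverse=True):
    print(round(sentence_entropy(s), 2), s)
```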
Conditional Neural Generation using Sub-Aspect Functions for Extractive News Summarization
[article]
2020
arXiv
pre-print
Much progress has been made in text summarization, fueled by neural architectures using large-scale training corpora. ...
These results suggest that a more flexible neural summarization framework providing more control options could be desirable in tailoring to different user preferences, which is useful since it is often ...
Conclusion: We proposed a neural framework for conditional extractive news summarization. ...
arXiv:2004.13983v3
fatcat:ygong56shvfrnixziiusd4xjpi
Extractive Text Summarization for Social News using Hybrid Techniques in Opinion Mining
2020
International Journal of Engineering and Advanced Technology
The existing work recommends a hybrid text summarization technique that is a blend of CRF (Conditional Random Fields) and LSA (Latent Semantic Analysis), which is highly cohesive with low redundancy ...
The technique of LSA extracts hidden semantic structures within words/sentences, which are commonly utilized in the process of summarization. ...
Kuan-Yu Chen [11] discusses Extractive Broadcast News Summarization Leveraging Recurrent NN (Neural Network). ...
doi:10.35940/ijeat.b3356.029320
fatcat:7vwnlgsef5arpllozo24oraotu
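To make the LSA component above concrete, here is a minimal sketch in the spirit of classic LSA-based sentence selection (term-sentence matrix, SVD, pick the strongest sentence along each latent dimension). It is not the hybrid CRF+LSA system described in the paper; the toy document and the choice of k are assumptions.

```python
# Illustrative LSA-style extraction: SVD of a term-sentence count matrix,
# then select the sentence with the largest weight per latent dimension.
import numpy as np

def lsa_select(sentences, k=2):
    vocab = sorted({w for s in sentences for w in s.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    A = np.zeros((len(vocab), len(sentences)))
    for j, s in enumerate(sentences):
        for w in s.lower().split():
            A[index[w], j] += 1.0
    # Rows of Vt are latent "semantic" dimensions; columns map to sentences.
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    chosen = []
    for dim in Vt[:k]:
        j = int(np.argmax(np.abs(dim)))
        if j not in chosen:
            chosen.append(j)
    return [sentences[j] for j in sorted(chosen)]

doc = ["The economy grew faster than expected this quarter .",
       "Analysts credited strong consumer spending for the growth .",
       "Meanwhile , the local football team won its match ."]
print(lsa_select(doc, k=2))
```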
Review and Comparative Analysis of Topic Identification Techniques
2019
International Journal of Advanced Trends in Computer Science and Engineering
Topic identification is an area of data mining that finds common text/themes from several documents. It is a data summarization technique that helps to summarize documents. ...
Existing solutions include text clustering, latent semantic approach, probabilistic latent semantics approach, latent Dirichlet allocation approach, association rule-based approaches, document clustering ...
Xingxing Zhang et al. (2018) proposed a latent variable extractive model based on a neural extractive summarization model and sentence compression model. ...
doi:10.30534/ijatcse/2019/71832019
fatcat:g46lyzxg7jcehlci4r62nxbtpe
StructSum: Summarization via Structured Representations
[article]
2021
arXiv
pre-print
To this end, we propose incorporating latent and explicit dependencies across sentences in the source document into end-to-end single-document summarization models. ...
Our framework complements standard encoder-decoder summarization models by augmenting them with rich structure-aware document representations based on implicitly learned (latent) structures and externally-derived ...
Conclusion and Future Work: To summarize, our contributions are three-fold. We propose a framework for incorporating latent and explicit document structure in neural abstractive summarization. ...
arXiv:2003.00576v2
fatcat:mj5rc4wlozfqzjat4dzrn67iru
Text Summary Generation Techniques
2020
Volume 8, Issue 10, August 2019, Regular Issue
Pattern recognition is a pertinent field in autonomous text summarization for extracting features from relative and non-relative text documents. ...
Here we provide empirical evidence that the method of deep learning using RNNs outperforms various techniques in terms of speed as well as metrics in abstractive summarization of multi-modal documents. ...
Recurrent neural network Extractive Text-Image Summarization uses a neural-based multi-modal summarization method based on a multimodal RNN [4]. ...
doi:10.35940/ijitee.g1016.0597s20
fatcat:lzggaegp25dk3pzuh4gmfoncru
Topic-Guided Abstractive Multi-Document Summarization
[article]
2021
arXiv
pre-print
Moreover, we employ a neural topic model to jointly discover latent topics that can act as cross-document semantic units to bridge different documents and provide global information to guide the summary ...
A critical point of multi-document summarization (MDS) is to learn the relations among various documents. ...
Related Work: Multi-document summarization is a challenging subtask of text summarization with a long history. Many previous methods are extractive, partly due to the lack of sufficient training data. ...
arXiv:2110.11207v1
fatcat:wf6vudj6sfgcbkwbaabkccwn6m
Deep Recurrent Generative Decoder for Abstractive Text Summarization
2017
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Neural variational inference is employed to address the intractable posterior inference for the recurrent latent variables. ...
Latent structure information implied in the target summaries is learned based on a recurrent latent random model for improving the summarization quality. ...
However, they only use the latent topic information to conduct the sentence salience estimation for extractive summarization. ...
doi:10.18653/v1/d17-1222
dblp:conf/emnlp/LiLBW17
fatcat:cxqckigu25ghfapr53524ebu5m
Deep Recurrent Generative Decoder for Abstractive Text Summarization
[article]
2017
arXiv
pre-print
Neural variational inference is employed to address the intractable posterior inference for the recurrent latent variables. ...
Latent structure information implied in the target summaries is learned based on a recurrent latent random model for improving the summarization quality. ...
However, they only use the latent topic information to conduct the sentence salience estimation for extractive summarization. ...
arXiv:1708.00625v1
fatcat:sibi26obmndubber7sknym4lgi
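The neural variational inference mentioned in the two entries above typically relies on the Gaussian reparameterisation trick; the sketch below shows that step and the KL regulariser in isolation. It illustrates the general technique only, with toy posterior parameters, not the paper's specific recurrent generative decoder.

```python
# Illustrative sketch of the reparameterisation step used in neural
# variational inference for a Gaussian latent variable z.
import numpy as np

rng = np.random.default_rng(0)

def sample_latent(mu, log_var):
    """z = mu + sigma * eps keeps sampling differentiable w.r.t. mu, log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """KL( N(mu, sigma^2) || N(0, I) ), the regulariser in the variational bound."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# Toy posterior parameters; in the model, a recurrent encoder would produce
# these at each decoding step.
mu = np.array([0.1, -0.3])
log_var = np.array([-1.0, -0.5])
z = sample_latent(mu, log_var)
print("z:", z, "KL:", round(kl_to_standard_normal(mu, log_var), 3))
```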
Jointly Extracting and Compressing Documents with Summary State Representations
[article]
2019
arXiv
pre-print
We present a new neural model for text summarization that first extracts sentences from a document and then compresses them. ...
In addition, our model dynamically determines the length of the output summary based on the gold summaries it observes during training and does not require the length constraints typical of extractive summarization ...
SummaRuNNer: a recurrent neural network based sequence model for extractive summarization of documents. ...
arXiv:1904.02020v2
fatcat:hlpnpgtzendlxiwjpz5bj5o7ze
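Finally, the extract-then-compress idea in the entry above can be sketched as a two-stage pipeline. The scores, the stopword list, and the word-dropping "compressor" below are purely illustrative stand-ins for the paper's learned components.

```python
# Illustrative extract-then-compress pipeline: pick the highest-scoring
# sentences, then shorten each one by dropping a set of low-content words.
STOPWORDS = {"the", "a", "an", "was", "is", "very", "quite"}

def extract(sentences, scores, k=2):
    ranked = sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)
    return [sentences[i] for i in sorted(ranked[:k])]  # keep document order

def compress(sentence):
    return " ".join(w for w in sentence.split() if w.lower() not in STOPWORDS)

doc = ["The storm was very strong and damaged the old bridge .",
       "Local officials promised quite rapid repairs .",
       "The weather is expected to improve ."]
scores = [0.9, 0.7, 0.2]   # stand-ins for learned extraction scores

summary = [compress(s) for s in extract(doc, scores, k=2)]
print(" ".join(summary))
```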
Showing results 1–15 of 13,577 results.