63,027 Hits in 6.0 sec

Order-Planning Neural Text Generation From Structured Data [article]

Lei Sha, Lili Mou, Tianyu Liu, Pascal Poupart, Sujian Li, Baobao Chang, Zhifang Sui
2017 arXiv   pre-print
Generating texts from structured data (e.g., a table) is important for various natural language processing tasks such as question answering and dialog systems.  ...  However, these neural network-based approaches do not model the order of contents during text generation.  ...  Acknowledgments We thank Jing He from AdeptMind.ai for helpful discussions on different ways of using field information.  ... 
arXiv:1709.00155v1 fatcat:ocrvapdk6bgj7idbcfd3qauc5y

The Natural Language Pipeline, Neural Text Generation and Explainability

Juliette Faille, Albert Gatt, Claire Gardent
2020 Zenodo  
End-to-end encoder-decoder approaches to data-to-text generation are often black boxes whose predictions are difficult to explain.  ...  The traditional pre-neural Natural Language Generation (NLG) pipeline provides a framework for breaking up the end-to-end encoder-decoder.  ...  This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 860621.  ... 
doi:10.5281/zenodo.5887676 fatcat:pte7efuxpvdgzbxg4tn432gkqm

Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation [article]

Amit Moryossef, Yoav Goldberg, Ido Dagan
2019 arXiv   pre-print
Data-to-text generation can be conceptually divided into two parts: ordering and structuring the information (planning), and generating fluent language describing the information (realization).  ...  Our results demonstrate that decoupling text planning from neural realization indeed improves the system's reliability and adequacy while maintaining fluent output.  ...  Conclusion We proposed adding an explicit symbolic planning component to a neural data-to-text NLG system, which eases the burden on the neural component concerning text structuring and fact tracking.  ... 
arXiv:1904.03396v2 fatcat:p54apfdohbcwhatcajtsx5v4ju
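The planning/realization split described in the abstract above can be illustrated with a minimal sketch. This is not the paper's actual system: the sort-based planner and the string templates below are invented stand-ins for the symbolic planner and neural realizer it describes.

```python
# Illustrative sketch of decoupling text planning from realization for
# toy RDF-style triples. The ordering heuristic and templates are
# assumptions for the example, not the method from the paper.

def plan(triples):
    """Planning step: decide the order in which facts are verbalized.
    Here we simply sort by subject then predicate; a real planner
    scores candidate orderings."""
    return sorted(triples, key=lambda t: (t[0], t[1]))

def realize(ordered_triples):
    """Realization step: turn the ordered plan into text.
    A template stands in for a neural realizer."""
    sentences = [f"{s} {p.replace('_', ' ')} {o}." for s, p, o in ordered_triples]
    return " ".join(sentences)

triples = [
    ("John_Doe", "occupation", "writer"),
    ("John_Doe", "birth_place", "London"),
]
print(realize(plan(triples)))
# John_Doe birth place London. John_Doe occupation writer.
```

Because the plan is an explicit intermediate object, it can be inspected or edited before realization, which is the reliability argument the abstract makes.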

Improving Quality and Efficiency in Plan-based Neural Data-to-text Generation

Amit Moryossef, Yoav Goldberg, Ido Dagan
2019 Proceedings of the 12th International Conference on Natural Language Generation  
We follow the step-by-step approach to neural data-to-text generation we proposed in Moryossef et al. (2019), in which the generation process is divided into a text-planning stage followed by a plan-realization  ...  We suggest four extensions to that framework: (1) we introduce a trainable neural planning component that can generate effective plans several orders of magnitude faster than the original planner; (2)  ...  Acknowledgements This work was supported in part by the German Research Foundation through the German-Israeli Project Cooperation (DIP, grant DA 1600/1-1) and by a grant from Reverso and Theo Hoffenberg  ... 
doi:10.18653/v1/w19-8645 dblp:conf/inlg/MoryossefGD19 fatcat:7gudhkre2vc4dj5hfe74zyvasa

Improving Quality and Efficiency in Plan-based Neural Data-to-Text Generation [article]

Amit Moryossef, Ido Dagan, Yoav Goldberg
2019 arXiv   pre-print
We follow the step-by-step approach to neural data-to-text generation we proposed in Moryossef et al. (2019), in which the generation process is divided into a text-planning stage followed by a plan-realization  ...  We suggest four extensions to that framework: (1) we introduce a trainable neural planning component that can generate effective plans several orders of magnitude faster than the original planner; (2)  ...  Acknowledgements This work was supported in part by the German Research Foundation through the German-Israeli Project Cooperation (DIP, grant DA 1600/1-1) and by a grant from Reverso and Theo Hoffenberg  ... 
arXiv:1909.09986v1 fatcat:au7oeap2wvdzrdjhnntwsf7yge

Neural data-to-text generation: A comparison between pipeline and end-to-end architectures

Thiago Castro Ferreira, Chris van der Lee, Emiel van Miltenburg, Emiel Krahmer
2019 Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)  
This study introduces a systematic comparison between neural pipeline and end-to-end data-to-text approaches for the generation of text from RDF triples.  ...  By contrast, recent neural models for data-to-text generation have been proposed as end-to-end approaches, where the non-linguistic input is rendered in natural language with much less explicit intermediate  ...  Moryossef et al. (2019) proposed an approach which converts an RDF triple set into text in two steps: text planning, a non-neural method where the input will be ordered and structured, followed by a neural  ... 
doi:10.18653/v1/d19-1052 dblp:conf/emnlp/FerreiraLMK19 fatcat:63otuy4hljfkbccauxbru2wgom

Neural data-to-text generation: A comparison between pipeline and end-to-end architectures [article]

Thiago Castro Ferreira, Chris van der Lee, Emiel van Miltenburg, Emiel Krahmer
2019 arXiv   pre-print
This study introduces a systematic comparison between neural pipeline and end-to-end data-to-text approaches for the generation of text from RDF triples.  ...  In contrast, recent neural models for data-to-text generation have been proposed as end-to-end approaches, where the non-linguistic input is rendered in natural language with much less explicit intermediate  ...  Moryossef et al. (2019) proposed an approach which converts an RDF triple set into text in two steps: text planning, a non-neural method where the input will be ordered and structured, followed by a neural  ... 
arXiv:1908.09022v2 fatcat:r5ijnub4snchjgwtcnojfferjy

Neural Data-to-Text Generation with Dynamic Content Planning [article]

Kai Chen, Fayuan Li, Baotian Hu, Weihua Peng, Qingcai Chen, Hong Yu
2020 arXiv   pre-print
To alleviate these problems, we propose a Neural data-to-text generation model with Dynamic content Planning, abbreviated as NDP.  ...  The NDP can utilize the previously generated text to dynamically select the appropriate entry from the given structured data.  ...  Our work aims to enable the neural data-to-text generation model to dynamically select appropriate content from the given structured data with a novel dynamic planning mechanism.  ... 
arXiv:2004.07426v2 fatcat:pq3a5srnare7hl4g2orh4g6o6e

Plan-then-Generate: Controlled Data-to-Text Generation via Planning [article]

Yixuan Su, David Vandyke, Sihui Wang, Yimai Fang, Nigel Collier
2021 arXiv   pre-print
In this study, we propose a novel Plan-then-Generate (PlanGen) framework to improve the controllability of neural data-to-text models.  ...  Recent developments in neural networks have led to advances in data-to-text generation.  ...  The WebNLG challenge: Generating text from DBpedia data.  ... 
arXiv:2108.13740v1 fatcat:i5v2p43wt5bjritpk7jmviu5m4

Text Style Transfer via Learning Style Instance Supported Latent Space

Xiaoyuan Yi, Zhenghao Liu, Wenhao Li, Maosong Sun
2020 Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence  
Instead of representing styles with embeddings or latent variables learned from single sentences, our model leverages the generative flow technique to extract underlying stylistic properties from multiple  ...  Nonetheless, the intractability of completely disentangling content from style for text leads to a tension between content preservation and style transfer accuracy.  ...  Most recent works use end-to-end neural networks to generate textual descriptions directly from the input data or focus on exploiting the structure of data for better representation [Mahapatra et al.,  ... 
doi:10.24963/ijcai.2020/522 dblp:conf/ijcai/BaiLDSZ20 fatcat:y6hfvow67fef5dmvrwgyz7puuy

GGP: A Graph-based Grouping Planner for Explicit Control of Long Text Generation [article]

Xuming Lin, Shaobo Cui, Zhongzhou Zhao, Wei Zhou, Ji Zhang, Haiqing Chen
2021 arXiv   pre-print
Existing data-driven methods handle short text generation well.  ...  With these two synergic representations, we then regroup these phrases into a fine-grained plan, based on which we generate the final long text.  ...  Table 1: An example of structured data, plan, and text from the ATG dataset.  ... 
arXiv:2108.07998v1 fatcat:tu4gammhqfglrmfl63zigymiha

Data-to-Text Generation with Content Selection and Planning

Ratish Puduppully, Li Dong, Mirella Lapata
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)  
Recent advances in data-to-text generation have led to the use of large-scale datasets and neural network models which are trained end-to-end, without explicitly modeling what to say and in what order.  ...  Given a corpus of data records (paired with descriptive documents), we first generate a content plan highlighting which information should be mentioned and in which order and then generate the document  ...  Our work is closest to recent neural network models which learn generators from data and accompanying text resources.  ... 
doi:10.1609/aaai.v33i01.33016908 fatcat:vyl2y7z6b5frjne2ku4ngcnpni
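The abstract above separates deciding *what to say* (content selection) from *in what order* (content planning). A hedged sketch over toy box-score-style records makes the two steps concrete; the type filter and value-based ordering below are invented heuristics standing in for the paper's learned components.

```python
# Illustrative two-step content selection and planning over toy records.
# The salient-type filter and descending-value ordering are assumptions
# for the example, not the paper's learned gating/planning mechanism.

records = [
    {"entity": "Team_A", "type": "points", "value": 102},
    {"entity": "Team_B", "type": "points", "value": 98},
    {"entity": "Team_A", "type": "rebounds", "value": 41},
    {"entity": "Team_B", "type": "turnovers", "value": 7},
]

def select_content(records, salient_types=frozenset({"points", "rebounds"})):
    """Content selection: keep only the record types worth mentioning."""
    return [r for r in records if r["type"] in salient_types]

def make_plan(selected):
    """Content planning: order the selected records (here, by value,
    descending) into the sequence a realizer would verbalize."""
    return sorted(selected, key=lambda r: r["value"], reverse=True)

content_plan = make_plan(select_content(records))
print([(r["entity"], r["type"]) for r in content_plan])
# [('Team_A', 'points'), ('Team_B', 'points'), ('Team_A', 'rebounds')]
```

The explicit content plan is then what conditions document generation, rather than the raw record set.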

Data-to-Text Generation with Content Selection and Planning [article]

Ratish Puduppully, Li Dong, Mirella Lapata
2019 arXiv   pre-print
Recent advances in data-to-text generation have led to the use of large-scale datasets and neural network models which are trained end-to-end, without explicitly modeling what to say and in what order.  ...  Given a corpus of data records (paired with descriptive documents), we first generate a content plan highlighting which information should be mentioned and in which order and then generate the document  ...  Our work is closest to recent neural network models which learn generators from data and accompanying text resources.  ... 
arXiv:1809.00582v2 fatcat:ozhaclw3izfwfnri3d3ykxec5e

Text-to-Text Pre-Training for Data-to-Text Tasks [article]

Mihir Kale, Abhinav Rastogi
2021 arXiv   pre-print
Our experiments indicate that text-to-text pre-training in the form of T5 enables simple, end-to-end transformer-based models to outperform pipelined neural architectures tailored for data-to-text generation  ...  We study the pre-train + fine-tune strategy for data-to-text tasks.  ...  procedure into a planning stage followed by a neural generation stage. • Pipeline-Transformer (Ferreira et al., 2019), a pipelined neural system consisting of discourse ordering, text structuring, lexicalization  ... 
arXiv:2005.10433v3 fatcat:cb7xh7zfg5hqnafca6q5etn6my
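Casting data-to-text as a text-to-text task requires linearizing the structured input into a flat string that a pre-trained seq2seq model such as T5 can be fine-tuned on. The tag-and-separator format below is an assumed scheme for illustration, not necessarily the exact one used in the paper.

```python
# Minimal sketch of linearizing structured data into a source string for a
# text-to-text model. The "table:" prefix and " | " separator are assumptions.

def linearize(table_name, pairs):
    """Flatten attribute-value pairs into a single source string."""
    parts = [f"table: {table_name}"]
    for attr, value in pairs:
        parts.append(f"{attr}: {value}")
    return " | ".join(parts)

src = linearize("restaurant", [("name", "The Punter"), ("food", "Italian")])
print(src)  # table: restaurant | name: The Punter | food: Italian
```

The model then learns the mapping from such strings to reference texts during fine-tuning, with no explicit planning stage.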

Spatial Planning Text Information Processing with Use of Machine Learning Methods

I. Kaczmarek, A. Iwaniak, A. Świetlicka, M. Piwowarczyk, F. Harvey
2020 ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences  
... the problem of heterogeneous planning information, the authors used the k-means algorithm and artificial neural networks.  ...  Each of the planning areas has a symbol and a land-use category, which differs between plans.  ...  Among the supervised learning methods, we decided to focus on artificial neural networks, which are well known for advantages such as generalization ability and prediction of output data based on input  ... 
doi:10.5194/isprs-annals-vi-4-w2-2020-95-2020 doaj:b7066ad1528a46dd8760ab528fb6876c fatcat:6urifgflnrbl5peev7jtrgmqkm
Showing results 1 — 15 out of 63,027 results