The Code2Text Challenge: Text Generation in Source Libraries

Kyle Richardson, Sina Zarrieß, Jonas Kuhn
2017 Proceedings of the 10th International Conference on Natural Language Generation  
We propose a new shared task for tactical data-to-text generation in the domain of source code libraries.  ...  Specifically, we focus on text generation of function descriptions from example software projects.  ...  In addition, we acknowledge support by the Cluster of Excellence "Cognitive Interaction Technology" (CITEC; EXC 277) at Bielefeld University, which is also funded by the DFG.  ... 
doi:10.18653/v1/w17-3516 dblp:conf/inlg/RichardsonZK17 fatcat:o65ffwtqynesrnyidgbwt5jzgu


QuickMig: automatic schema matching for data migration projects

Christian Drumm, Matthias Schmitt, Hong-Hai Do, Erhard Rahm
2007 Proceedings of the 16th ACM Conference on Information and Knowledge Management - CIKM '07  
A common task in many database applications is the migration of legacy data from multiple sources into a new one.  ...  This requires identifying semantically related elements of the source and target systems and creating mapping expressions to transform instances of those elements from the source format to the target  ...  If the source element text is found in the code list associated with the target schema element, the corresponding code is used; if not, a translation table is necessary (e.g.  ... 
doi:10.1145/1321440.1321458 dblp:conf/cikm/DrummSDR07 fatcat:wrjtjzlgjzd45hxc6f632il7vq
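The lookup-with-fallback rule described in the snippet above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; the dictionaries, values, and the `map_value` helper are all assumptions made up for the example.

```python
# Sketch of the mapping rule from the Drumm et al. snippet: resolve a
# source element text via the code list associated with the target schema
# element, and fall back to a translation table when no code matches.

code_list = {"Germany": "DE", "France": "FR"}    # codes the target schema knows
translation_table = {"Deutschland": "DE"}        # curated fallback mappings

def map_value(source_text):
    """Return the target code for a source element text, or None if unmapped."""
    if source_text in code_list:                 # direct hit in the code list
        return code_list[source_text]
    return translation_table.get(source_text)    # otherwise consult the table

print(map_value("Germany"))        # found directly in the code list
print(map_value("Deutschland"))    # resolved via the translation table
```

In this reading, the translation table only comes into play for source values that do not already appear in the target's code list, which matches the "if not, a translation table is necessary" condition in the abstract.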

Recent Advances in Neural Text Generation: A Task-Agnostic Survey [article]

Chen Tang, Frank Guerin, Yucheng Li, Chenghua Lin
2022 arXiv   pre-print
The challenge is to generate natural human-like text, and to control the generation process. This paper presents a task-agnostic survey of recent advances in neural text generation.  ...  In recent years much effort has been devoted to applying neural models to the task of natural language generation.  ...  The Code2Text challenge: Text generation in source libraries. In Proceedings of the 10th International Conference on Natural Language Generation. What we know about how BERT works.  ... 
arXiv:2203.03047v1 fatcat:iupgvcw2hbge5ioy6quiotnra4