540,930 Hits in 9.8 sec

Commentary on "Learning with Contrastive Examples for Data-to-Text Generation"

Yui Uehara, Tatsuya Ishigaki
2021 Journal of Natural Language Processing  
"Controlling Contents in Data-to-Document Generation with Human-Designed Topic Labels." In Proceedings of the 12th International Conference on Natural Language Generation (INLG2019), pp. 323-332.  ...  "Generating Market Comments Referring to External Resources." In Proceedings of the 11th International Conference on Natural Language Generation (INLG2018), pp. 135-139.  ... 
doi:10.5715/jnlp.28.710 fatcat:h3vlien6ivbcjdvvw7bygxsfie

Learning with Contrastive Examples for Data-to-Text Generation

Yui Uehara, Tatsuya Ishigaki, Kasumi Aoki, Hiroshi Noji, Keiichi Goshima, Ichiro Kobayashi, Hiroya Takamura, Yusuke Miyao
2020 Proceedings of the 28th International Conference on Computational Linguistics   unpublished
Existing models for data-to-text tasks generate fluent but sometimes incorrect sentences, e.g., "Nikkei gains" is generated when "Nikkei drops" is expected.  ...  We then use learning methods with several losses that exploit contrastive examples.  ...  Conclusion: We presented learning methods with several losses that exploited contrastive examples for data-to-text generation. The results showed that our methods improved performance in terms of correctness.  ...
doi:10.18653/v1/2020.coling-main.213 fatcat:xraq3acxwnf5zodorftbm6lo5a
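The snippet only names the losses; as a minimal, hypothetical PyTorch-style sketch (not the authors' code), a margin-based contrastive term that makes the correct reference ("Nikkei drops") score higher than a polarity-flipped contrastive reference ("Nikkei gains") could look like the following, where `seq_log_prob` and the teacher-forced `logits_*` inputs are assumptions for illustration:

```python
import torch.nn.functional as F

def seq_log_prob(logits, target_ids, pad_id=0):
    """Sum of token log-probabilities of a reference under the generator.
    `logits` is the (T, V) teacher-forced decoder output, `target_ids` is (T,)."""
    log_probs = F.log_softmax(logits, dim=-1)
    token_lp = log_probs.gather(-1, target_ids.unsqueeze(-1)).squeeze(-1)
    mask = (target_ids != pad_id).float()
    return (token_lp * mask).sum()

def contrastive_margin_loss(logits_pos, ref_pos, logits_neg, ref_neg, margin=1.0):
    """Require the correct reference (e.g. "Nikkei drops") to score at least
    `margin` higher than the contrastive reference (e.g. "Nikkei gains")."""
    lp_pos = seq_log_prob(logits_pos, ref_pos)
    lp_neg = seq_log_prob(logits_neg, ref_neg)
    return F.relu(margin - (lp_pos - lp_neg))
```

In practice such a term would be added to the usual cross-entropy objective, so fluency is preserved while the contrastive example is pushed down.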

Text Data Augmentation for Deep Learning

Connor Shorten, Taghi M. Khoshgoftaar, Borko Furht
2021 Journal of Big Data  
with transfer and multi-task learning, and ideas for AI-GAs (AI-Generating Algorithms).  ...  We follow these motifs with a concrete list of augmentation frameworks that have been developed for text data.  ...  Acknowledgements We would like to thank the reviewers in the Data Mining and Machine Learning Laboratory at Florida Atlantic University.  ... 
doi:10.1186/s40537-021-00492-0 fatcat:bcbaqkpicnd6dcwc34pdijosby

Improving BERT Model Using Contrastive Learning for Biomedical Relation Extraction [article]

Peng Su, Yifan Peng, K. Vijay-Shanker
2021 arXiv   pre-print
However, contrastive learning is not widely utilized in natural language processing due to the lack of a general method of data augmentation for text data.  ...  In this work, we explore the method of employing contrastive learning to improve the text representation from the BERT model for relation extraction.  ...  Also, our data augmentation for contrastive learning needs SDP between two given entities, so we need to construct the augmented dataset with the entities mentioned in the text.  ... 
arXiv:2104.13913v1 fatcat:ojrea6gc7rhqdk7bwmovd62whq

Text Transformations in Contrastive Self-Supervised Learning: A Review [article]

Amrita Bhattacharjee, Mansooreh Karami, Huan Liu
2022 arXiv   pre-print
Finally, we describe some challenges and potential directions for learning better text representations using contrastive methods.  ...  evaluations for contrastive representation learning in NLP.  ...  Latent-Space Transformations: Techniques proposed for standard data augmentation in low-resource learning settings can also be used to generate positive samples for contrastive representation learning in  ...
arXiv:2203.12000v2 fatcat:7mm2w7jt4zbptmsqoaouxrjlty
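As a rough illustration of the latent-space idea mentioned in the snippet (a SimCSE-style assumption, not something this survey prescribes), two stochastic forward passes with dropout left on yield two latent "views" of the same sentence, which can then feed any in-batch contrastive loss:

```python
def dropout_views(encoder, input_ids, attention_mask):
    """Two forward passes with dropout active produce two latent-space views
    of the same text. `encoder` is assumed to be a BERT-style model exposing
    `pooler_output`; feed (z1, z2) into an in-batch contrastive loss."""
    encoder.train()  # keep dropout enabled so the two passes differ
    z1 = encoder(input_ids, attention_mask=attention_mask).pooler_output
    z2 = encoder(input_ids, attention_mask=attention_mask).pooler_output
    return z1, z2
```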

CL4AC: A Contrastive Loss for Audio Captioning [article]

Xubo Liu, Qiushi Huang, Xinhao Mei, Tom Ko, H Lilian Tang, Mark D. Plumbley, Wenwu Wang
2021 arXiv   pre-print
of latent representations and the alignment between audio and text, while trained with limited data.  ...  a decoder is used to generate the captions.  ...  For example, AAC can be used for generating subtitles for the audio content in a television program, or for generating text descriptions of audio to help the hearing impaired in accessing audio content  ...
arXiv:2107.09990v3 fatcat:qg2kf33xs5dsvoeleszx4okkcy

Improving Health Mentioning Classification of Tweets using Contrastive Adversarial Training [article]

Pervaiz Iqbal Khan, Shoaib Ahmed Siddiqui, Imran Razzak, Andreas Dengel, Sheraz Ahmed
2022 arXiv   pre-print
Learning the context of the input text is the key to this problem.  ...  The idea is to learn word representations from their surrounding words and to utilize emojis in the text to help improve the classification results.  ...  In our work, we generate adversarial examples using one-step FGSM and perform contrastive learning with clean examples to learn the representations for the input examples.  ...
arXiv:2203.01895v1 fatcat:s2mrlsvjx5csxoxmvw4o6snn4u

Constructing Contrastive samples via Summarization for Text Classification with limited annotations [article]

Yangkai Du, Tengfei Ma, Lingfei Wu, Fangli Xu, Xuhong Zhang, Bo Long, Shouling Ji
2021 arXiv   pre-print
We use these samples for supervised contrastive learning to gain better text representations which greatly benefit text classification tasks with limited annotations.  ...  In this paper, we propose a novel approach to construct contrastive samples for language tasks using text summarization.  ...  data for contrastive learning.  ... 
arXiv:2104.05094v3 fatcat:btjqtmpspveb5in2dajz6yltqq
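A hedged sketch of how such positive pairs might be built (the model name and the Hugging Face `pipeline` usage here are illustrative assumptions, not the paper's setup): each document is paired with its abstractive summary, and other documents in the batch act as negatives for the contrastive loss.

```python
from transformers import pipeline

# Illustrative choice of summarizer; any abstractive model could be substituted.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

def build_contrastive_pairs(documents, max_length=64):
    """Pair each document with its generated summary; (doc, summary) is a
    positive pair, and other documents in the batch serve as negatives."""
    pairs = []
    for doc in documents:
        summary = summarizer(doc, max_length=max_length, truncation=True)[0]["summary_text"]
        pairs.append((doc, summary))
    return pairs
```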

Improved Text Classification via Contrastive Adversarial Training [article]

Lin Pan, Chung-Wei Hang, Avirup Sil, Saloni Potdar
2022 arXiv   pre-print
We propose a simple and general method to regularize the fine-tuning of Transformer-based encoders for text classification tasks.  ...  Specifically, during fine-tuning we generate adversarial examples by perturbing the word embeddings of the model and perform contrastive learning on clean and adversarial examples in order to teach the  ...  In our work, we use the simpler one-step FGSM to generate perturbed examples and perform contrastive learning with clean examples.  ... 
arXiv:2107.10137v2 fatcat:jcp7wkmorvah7doqoorvqz6h3q
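A minimal sketch of the procedure described in the snippet, assuming a PyTorch classifier with hypothetical `embed` and `encode_from_embeddings` hooks (a real model's API will differ): one-step FGSM perturbs the word embeddings, and an NT-Xent-style term pulls each clean representation toward its own adversarial view.

```python
import torch
import torch.nn.functional as F

def fgsm_contrastive_step(model, input_ids, attention_mask, labels,
                          epsilon=1e-3, temperature=0.1):
    """One-step FGSM on word embeddings plus a contrastive term between clean
    and adversarial sentence representations. `model.embed` and
    `model.encode_from_embeddings` are hypothetical hooks, not a real API."""
    embeds = model.embed(input_ids).detach().requires_grad_(True)
    clean_repr, clean_logits = model.encode_from_embeddings(embeds, attention_mask)
    task_loss = F.cross_entropy(clean_logits, labels)

    # FGSM: move the embeddings along the sign of the gradient of the task loss.
    grad, = torch.autograd.grad(task_loss, embeds, retain_graph=True)
    adv_embeds = embeds + epsilon * grad.sign()
    adv_repr, adv_logits = model.encode_from_embeddings(adv_embeds, attention_mask)

    # Contrastive term: each clean example should match its own adversarial
    # view more closely than any other view in the batch.
    z1 = F.normalize(clean_repr, dim=-1)
    z2 = F.normalize(adv_repr, dim=-1)
    sim = z1 @ z2.t() / temperature
    targets = torch.arange(z1.size(0), device=z1.device)
    contrastive_loss = F.cross_entropy(sim, targets)

    return task_loss + F.cross_entropy(adv_logits, labels) + contrastive_loss
```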

Incorporating Hierarchy into Text Encoder: a Contrastive Learning Approach for Hierarchical Text Classification [article]

Zihan Wang, Peiyi Wang, Lianzhe Huang, Xin Sun, Houfeng Wang
2022 arXiv   pre-print
By pulling together the input text and its positive sample, the text encoder can learn to generate the hierarchy-aware text representation independently.  ...  Instead of modeling them separately, in this work, we propose Hierarchy-guided Contrastive Learning (HGCLR) to directly embed the hierarchy into a text encoder.  ...  Acknowledgements We thank all the anonymous reviewers for their constructive feedback.  ... 
arXiv:2203.03825v2 fatcat:yqwfnql745eibowouhdxhi44hq

Improving Text-to-Image Synthesis Using Contrastive Learning [article]

Hui Ye, Xiulong Yang, Martin Takac, Rajshekhar Sunderraman, Shihao Ji
2021 arXiv   pre-print
The goal of text-to-image synthesis is to generate a visually realistic image that matches a given text description.  ...  In the pretraining stage, we utilize the contrastive learning approach to learn the consistent textual representations for the captions corresponding to the same image.  ...  We would also gratefully acknowledge the support of VMware Inc. for its university research fund to this research.  ... 
arXiv:2107.02423v2 fatcat:dyqfu3qpf5cilajxqzvtbigkje

CoNT: Contrastive Neural Text Generation [article]

Chenxin An, Jiangtao Feng, Kai Lv, Lingpeng Kong, Xipeng Qiu, Xuanjing Huang
2022 arXiv   pre-print
We validate CoNT on five generation tasks with ten benchmarks, including machine translation, summarization, code comment generation, data-to-text generation and commonsense generation.  ...  However, previous methods using contrastive learning in neural text generation usually lead to inferior performance.  ...  Data-to-text Generation: The input of data-to-text generation tasks is structured data (e.g., a table or graph).  ...
arXiv:2205.14690v1 fatcat:c37e2qw62ba5nlllg45lkvfbye

CERT: Contrastive Self-supervised Learning for Language Understanding [article]

Hongchao Fang, Sicheng Wang, Meng Zhou, Jiayuan Ding, Pengtao Xie
2020 arXiv   pre-print
To address this issue, we propose CERT: Contrastive self-supervised Encoder Representations from Transformers, which pretrains language representation models using contrastive self-supervised learning  ...  The data and code are available at https://github.com/UCSD-AI4H/CERT  ...  The basic idea of contrastive SSL is: generate augmented examples of original data examples, create a predictive task where the goal is to predict whether two augmented examples are from the same original  ... 
arXiv:2005.12766v2 fatcat:4vqsgvohbvgfxfczwechlb5rhu
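The "same original" prediction described in the snippet is the standard NT-Xent / InfoNCE setup; a generic PyTorch sketch of that objective (not CERT's released implementation) is:

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent / InfoNCE over a batch of paired augmentations.

    z1[i] and z2[i] are representations of two augmented views of the same
    original example; every other view in the batch is treated as a negative.
    """
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    n = z1.size(0)
    reps = torch.cat([z1, z2], dim=0)            # (2n, d)
    sim = reps @ reps.t() / temperature          # (2n, 2n) similarity matrix
    sim.fill_diagonal_(float("-inf"))            # a view is not its own positive
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets.to(sim.device))
```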

Improved Text Classification via Contrastive Adversarial Training

Lin Pan, Chung-Wei Hang, Avirup Sil, Saloni Potdar
2022 Proceedings of the AAAI Conference on Artificial Intelligence  
We propose a simple and general method to regularize the fine-tuning of Transformer-based encoders for text classification tasks.  ...  to the baseline trained with full training data.  ...  We then introduce our method of generating adversarial examples and propose our method CAT that uses these examples to perform contrastive learning with clean examples.  ... 
doi:10.1609/aaai.v36i10.21362 fatcat:aq7kojibszgjxohkuy5yf4ohfq

CLLD: Contrastive Learning with Label Distance for Text Classification [article]

Jinhe Lan, Qingyuan Zhan, Chenhao Jiang, Kunping Yuan, Desheng Wang
2022 arXiv   pre-print
Inspired by recent advances in contrastive learning, we specifically design a classification method with label distance for learning contrastive classes.  ...  To address this problem, we propose a novel Contrastive Learning with Label Distance (CLLD) in this work.  ...  We would like to thank colleagues of our team for discussion and providing useful feedback on the project.  ... 
arXiv:2110.13656v3 fatcat:mdggcqwm5fbjbceazqzg3pfevm
Showing results 1 — 15 out of 540,930 results