DoCoGen: Domain Counterfactual Generation for Low Resource Domain Adaptation

Nitay Calderon, Eyal Ben-David, Amir Feder, Roi Reichart
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2022
Natural language processing (NLP) algorithms have become very successful, but they still struggle when applied to out-of-distribution examples. In this paper, we propose a controllable generation approach in order to deal with this domain adaptation (DA) challenge. Given an input text example, our DoCoGen algorithm generates a domain-counterfactual textual example (D-CON): an example that is similar to the original in all aspects, including the task label, but whose domain is changed to a desired one.
Importantly, DoCoGen is trained using only unlabeled examples from multiple domains; no NLP task labels or parallel pairs of textual examples and their domain-counterfactuals are required. We show that DoCoGen can generate coherent counterfactuals consisting of multiple sentences. We use the D-CONs generated by DoCoGen to augment a sentiment classifier and a multi-label intent classifier in 20 and 78 DA setups, respectively, where source-domain labeled data is scarce. Our model outperforms strong baselines and improves the accuracy of a state-of-the-art unsupervised DA algorithm.
doi:10.18653/v1/2022.acl-long.533
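
The abstract describes DoCoGen only at a high level: a trained generator maps a text and a desired target domain to a label-preserving domain-counterfactual, and these D-CONs are added to the scarce labeled source data before training a classifier. The sketch below illustrates that augmentation step under those assumptions; `generate_domain_counterfactual`, the function names, and the example domain names are hypothetical placeholders, not the authors' actual API.

```python
# Minimal sketch (not the authors' code): augmenting a scarce labeled source
# set with domain-counterfactuals (D-CONs) before training a classifier.
from typing import List, Tuple

def generate_domain_counterfactual(text: str, target_domain: str) -> str:
    """Hypothetical stand-in for a trained DoCoGen generator.

    Per the abstract, it rewrites `text` so that its domain matches
    `target_domain` while keeping the task label unchanged.
    """
    raise NotImplementedError("placeholder for the trained generator")

def augment_with_dcons(
    labeled_source: List[Tuple[str, int]],  # (text, task_label), scarce
    target_domains: List[str],
) -> List[Tuple[str, int]]:
    """Add one D-CON per example per target domain.

    The original task label is reused for each counterfactual, since
    D-CONs preserve the task label by construction.
    """
    augmented = list(labeled_source)
    for text, label in labeled_source:
        for domain in target_domains:
            augmented.append((generate_domain_counterfactual(text, domain), label))
    return augmented

# Example usage (illustrative domain names):
#   train_set = augment_with_dcons(scarce_source_data, ["kitchen", "electronics"])
#   ...then train any sentiment or intent classifier on `train_set`.
```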