Neural Structural Correspondence Learning for Domain Adaptation

Yftah Ziser, Roi Reichart
2017 Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)  
We introduce a neural network model that marries together ideas from two prominent strands of research on domain adaptation through representation learning: structural correspondence learning (SCL, (Blitzer et al., 2006)) and autoencoder neural networks (NNs). Our model is a three-layer NN that learns to encode the non-pivot features of an input example into a low-dimensional representation, so that the existence of pivot features (features that are prominent in both domains and convey useful information for the NLP task) in the example can be decoded from that representation. The low-dimensional representation is then employed in a learning algorithm for the task. Moreover, we show how to inject pre-trained word embeddings into our model in order to improve generalization across examples with similar pivot features. We experiment with the task of cross-domain sentiment classification on 16 domain pairs and show substantial improvements over strong baselines.
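The core idea above — encode an example's non-pivot features into a low-dimensional hidden layer, then decode the presence of pivot features from it — can be illustrated with a minimal NumPy sketch. All dimensions, weights, and the single-example training loop here are illustrative assumptions, not the paper's actual configuration (which uses task-specific pivot selection and injects pre-trained embeddings):

```python
import numpy as np

# Hypothetical dimensions -- the paper's feature sets are task-specific.
n_nonpivot, n_pivot, hidden = 50, 10, 8
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights of the three-layer pivot-prediction network:
# non-pivot input -> hidden encoding -> pivot-presence output.
W_enc = rng.normal(scale=0.1, size=(hidden, n_nonpivot))
W_dec = rng.normal(scale=0.1, size=(n_pivot, hidden))

def forward(x_nonpivot):
    """Encode non-pivot features; decode pivot-presence probabilities."""
    h = sigmoid(W_enc @ x_nonpivot)   # low-dimensional representation
    p = sigmoid(W_dec @ h)            # predicted pivot occurrence
    return h, p

# One toy example: binary non-pivot features and binary pivot labels.
x = rng.integers(0, 2, size=n_nonpivot).astype(float)
y = rng.integers(0, 2, size=n_pivot).astype(float)

def xent(p, y, eps=1e-9):
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

_, p0 = forward(x)
loss_before = xent(p0, y)

# A few manual gradient steps on the cross-entropy pivot-prediction loss.
lr = 0.5
for _ in range(100):
    h, p = forward(x)
    g_out = (p - y) / n_pivot          # dL/dz for sigmoid + cross-entropy
    g_h = W_dec.T @ g_out
    W_dec -= lr * np.outer(g_out, h)
    W_enc -= lr * np.outer(g_h * h * (1 - h), x)

h, p = forward(x)                      # h would feed the downstream classifier
loss_after = xent(p, y)
```

After training, the hidden vector `h` plays the role of the low-dimensional representation handed to the downstream sentiment classifier; the pivot-prediction objective is only a device for shaping that representation so it transfers across domains.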
doi:10.18653/v1/k17-1040 dblp:conf/conll/ZiserR17