Generalization Bounds Derived IPM-Based Regularization for Domain Adaptation

Juan Meng, Guyu Hu, Dong Li, Yanyan Zhang, Zhisong Pan
2016 Computational Intelligence and Neuroscience  
Domain adaptation has received much attention as a major form of transfer learning. One issue that must be considered in domain adaptation is the gap between the source domain and the target domain. To improve the generalization ability of domain adaptation methods, we propose a framework for domain adaptation that combines source and target data with a new regularizer that takes generalization bounds into account. This regularization term uses the integral probability metric (IPM) as the distance between the source domain and the target domain, and it can therefore bound the testing error of an existing predictor. Since the computation of the IPM involves only the two distributions, this regularization term is independent of the specific classifier. With popular learning models, the empirical risk minimization is expressed as a general convex optimization problem and can thus be solved effectively by existing tools. Empirical studies on synthetic data for regression and real-world data for classification show the effectiveness of this method.
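The abstract only outlines the approach, so the following is a minimal sketch of how an IPM-regularized empirical risk might look, assuming maximum mean discrepancy (MMD) as the concrete IPM instance, a squared loss with a linear predictor, and a trade-off weight lam; these choices, and the function names, are illustrative assumptions rather than the authors' exact formulation.

```python
# Minimal sketch (not the paper's exact method): source empirical risk plus an
# IPM-based penalty on the source/target gap, with MMD -- one common instance
# of an integral probability metric -- as the distance between distributions.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(Xs, Xt, sigma=1.0):
    # Squared MMD estimate between source and target samples: an IPM taken
    # over the unit ball of a reproducing kernel Hilbert space.
    Kss = gaussian_kernel(Xs, Xs, sigma)
    Ktt = gaussian_kernel(Xt, Xt, sigma)
    Kst = gaussian_kernel(Xs, Xt, sigma)
    return Kss.mean() + Ktt.mean() - 2 * Kst.mean()

def ipm_regularized_risk(w, Xs, ys, Xt, lam=1.0):
    # Empirical squared loss on labeled source data plus an IPM-based term
    # measuring the domain gap. The penalty depends only on the two feature
    # distributions, not on the predictor w.
    risk = np.mean((Xs @ w - ys) ** 2)
    return risk + lam * mmd2(Xs, Xt)

# Toy usage: a linear regressor evaluated against a shifted target domain.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (50, 3))
ys = Xs @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)
Xt = rng.normal(0.5, 1.0, (40, 3))          # unlabeled, shifted target data
w = np.linalg.lstsq(Xs, ys, rcond=None)[0]  # any candidate predictor
print(ipm_regularized_risk(w, Xs, ys, Xt, lam=0.5))
```

Because the IPM term involves only the two sample sets and not the predictor, it can be computed once and added to any convex empirical risk, which is consistent with the abstract's claim that the resulting objective remains a general convex optimization problem solvable with existing tools.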
doi:10.1155/2016/7046563 pmid:26819589 pmcid:PMC4707017 fatcat:6tmzzozp6zbavp4kocb37ui2bu