Learning with multiple representations : algorithms and applications [thesis]

Xinxing Xu
Recently, many visual representations have been developed for computer vision applications. Because different types of visual representation may capture different kinds of information about the original data, their discriminative ability can vary greatly. Since existing machine learning algorithms are mostly based on a single data representation, it is increasingly important to develop machine learning algorithms that can handle data with multiple representations. In this thesis we therefore study the problem of learning with multiple representations. We develop several novel algorithms for data with multiple representations under three different learning scenarios, and we apply the proposed algorithms to a few computer vision applications.

Specifically, we first study learning with multiple kernels in the fully supervised setting. Starting from a hard-margin perspective on the dual form of traditional ℓ1-norm Multiple Kernel Learning (MKL), we introduce a new "kernel slack variable" and propose a Soft Margin framework for Multiple Kernel Learning (SMMKL). Incorporating the hinge loss on the kernel slack variables yields a new box constraint on the kernel coefficients. The squared hinge loss and the square loss soft-margin MKLs naturally recover the elastic-net MKL family and ℓ2-MKL, respectively. We demonstrate the effectiveness of the proposed algorithms on benchmark data sets as well as several computer vision data sets.

Second, we study learning with multiple kernels for weakly labeled data. Based on "input-output kernels", we propose a unified Input-Output Kernel Learning (IOKL) framework for handling weakly labeled data with multiple representations. Under this framework, general data-ambiguity problems such as semi-supervised learning (SSL), multi-instance learning (MIL), and clustering with multiple representations are solved in a unified manner. We formulate the learning problem as a group-sparse MKL problem to incorporate the intrinsic group structure among the input-output kernels. A group sparse soft margin regularization is
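The MKL formulations above all rest on combining base kernels with learned coefficients. As a minimal sketch (not the thesis's SMMKL or IOKL algorithms), the following assumes two base kernels over the same data and fixed simplex coefficients β, and illustrates that any such convex combination of positive semidefinite kernels is itself a valid kernel; ℓ1-norm MKL learns β on this simplex, while the proposed soft-margin variants relax it to a box constraint:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))  # toy data: 50 samples, 5 features

def rbf(X, gamma=0.1):
    """RBF kernel matrix over the rows of X."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Two base kernels standing in for two visual representations.
K_list = [rbf(X), X @ X.T]  # RBF kernel and linear kernel

# Fixed coefficients on the simplex (illustrative; MKL learns these jointly
# with the classifier, and SMMKL would constrain them in a box instead).
beta = np.array([0.7, 0.3])
K = sum(b * Km for b, Km in zip(beta, K_list))

# A convex combination of PSD matrices is PSD, so K is a valid kernel
# that could be handed to any kernel classifier.
eigmin = np.linalg.eigvalsh(K).min()
```

In a full MKL solver the coefficients β and the SVM dual variables are optimized jointly; the sketch only shows the kernel-combination step that both the ℓ1-norm and soft-margin formulations share.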
doi:10.32657/10356/62188