Online multi-label learning with accelerated nonsmooth stochastic gradient descent
2013 IEEE International Conference on Acoustics, Speech and Signal Processing
Multi-label learning refers to methods for learning a set of functions that assign a set of relevant labels to each instance. One popular approach to multi-label learning is label ranking, where a set of ranking functions is learned to order all labels such that relevant labels are ranked higher than irrelevant ones. Rank-SVM is a representative label-ranking method in which the ranking loss is minimized within a max-margin framework. However, the dual form of Rank-SVM involves a quadratic program that is generally solved in time cubic in the size of the training data. The primal form is appealing for the development of online learning but involves a nonsmooth convex loss function. In this paper we present a method for online multi-label learning in which the primal form is minimized using accelerated nonsmooth stochastic gradient descent, which was recently developed to extend Nesterov's smoothing method to the stochastic setting. Numerical experiments on several large-scale datasets demonstrate the computational efficiency and fast convergence of the proposed method compared to existing methods, including subgradient-based algorithms.
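The two ingredients the abstract combines, a pairwise ranking loss made smooth via Nesterov-style smoothing and an accelerated stochastic gradient update, can be illustrated with a minimal sketch. This is not the paper's actual formulation: the huberized hinge, the toy instance, the step size `eta`, and the smoothing parameter `mu` are all illustrative assumptions.

```python
import numpy as np

def smoothed_hinge(z, mu):
    """Huberized hinge: a smooth approximation of max(0, 1 - z) in the
    spirit of Nesterov smoothing. Returns (value, derivative w.r.t. z)."""
    if z >= 1.0:
        return 0.0, 0.0
    if z <= 1.0 - mu:
        return 1.0 - z - mu / 2.0, -1.0
    g = (1.0 - z) / mu
    return mu * g * g / 2.0, -g

def ranking_loss_grad(W, x, relevant, irrelevant, mu):
    """Smoothed pairwise ranking loss for one instance, averaged over all
    (relevant, irrelevant) label pairs, and its gradient w.r.t. W."""
    scores = W @ x
    grad = np.zeros_like(W)
    loss = 0.0
    pairs = [(r, i) for r in relevant for i in irrelevant]
    for r, i in pairs:
        v, dz = smoothed_hinge(scores[r] - scores[i], mu)
        loss += v
        grad[r] += dz * x   # push relevant score up
        grad[i] -= dz * x   # push irrelevant score down
    n = len(pairs)
    return loss / n, grad / n

# Toy run: one instance with 4 labels, Nesterov-accelerated stochastic steps.
d, L = 5, 4
x = np.full(d, 0.5)               # illustrative feature vector
relevant, irrelevant = [0, 1], [2, 3]
W = np.zeros((L, d))
V = W.copy()                      # lookahead (extrapolated) point
mu, eta = 0.5, 0.1                # smoothing parameter, step size (assumed)
loss0, _ = ranking_loss_grad(W, x, relevant, irrelevant, mu)
for t in range(1, 201):
    _, g = ranking_loss_grad(V, x, relevant, irrelevant, mu)
    W_new = V - eta * g                           # gradient step at lookahead
    V = W_new + (t - 1) / (t + 2) * (W_new - W)   # Nesterov extrapolation
    W = W_new
loss_final, _ = ranking_loss_grad(W, x, relevant, irrelevant, mu)
print(loss0, loss_final)          # loss decreases on this toy instance
```

Smoothing trades a controllable approximation error (governed by `mu`) for a Lipschitz-continuous gradient, which is what lets the accelerated scheme improve on plain subgradient descent.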