Margin maximization with feed-forward neural networks: a comparative study with SVM and AdaBoost

Enrique Romero, Lluı́s Màrquez, Xavier Carreras
2004 Neurocomputing  
Feed-forward Neural Networks (FNN) and Support Vector Machines (SVM) are two machine learning frameworks developed from very different points of view. In this work, a new learning model for FNN is proposed such that, in the linearly separable case, it tends to obtain the same solution as SVM. The key idea of the model is a weighting of the sum-of-squares error function, which is inspired by the AdaBoost algorithm. As in SVM, the hardness of the margin can be controlled, so that this model can also be used in the non-linearly separable case. In addition, it is not restricted to the use of kernel functions, and it allows dealing with multiclass and multilabel problems as FNN usually do. Finally, it is independent of the particular algorithm used to minimize the error function. Theoretical and experimental results on synthetic and real-world problems are shown to confirm these claims. Several empirical comparisons among this new model, SVM, and AdaBoost have been made in order to study the agreement between the predictions made by the respective classifiers. Additionally, the results obtained show that similar performance does not imply similar predictions, suggesting that different models can be combined to achieve better performance.
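The abstract's key idea, a weighting of the sum-of-squares error inspired by AdaBoost, can be illustrated with a minimal sketch. The paper's exact weighting scheme is not given in the abstract, so the exponential, margin-based weights and the `beta` hardness parameter below are assumptions chosen to mimic AdaBoost's behavior of emphasizing small-margin examples:

```python
import numpy as np

def weighted_sse_loss(outputs, targets, beta=1.0):
    """Illustrative AdaBoost-style weighted sum-of-squares loss.

    Each example's squared error is scaled by a weight that grows
    exponentially as its margin (target * output, with +/-1 targets)
    shrinks, so small-margin examples dominate the error -- one way
    to realize the abstract's "weighting of the sum-of-squares error
    function". `beta` here stands in for a margin-hardness control;
    both the weighting form and the parameter are hypothetical.
    """
    margins = targets * outputs            # per-example margins
    weights = np.exp(-beta * margins)      # emphasize small/negative margins
    weights /= weights.sum()               # normalize, as in AdaBoost's distribution
    return float(np.sum(weights * (outputs - targets) ** 2))
```

Under this sketch, a misclassified or barely-correct example receives a larger weight than a confidently correct one, so minimizing the loss with any gradient-based FNN training algorithm pushes the network toward larger margins, consistent with the claim that the model is independent of the particular minimization algorithm used.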
doi:10.1016/j.neucom.2003.10.011 fatcat:umszhcqwz5dzzmutcnxbd3gvte