67,636 Hits in 3.3 sec

Kernel Design Using Boosting

Koby Crammer, Joseph Keshet, Yoram Singer
2002 Neural Information Processing Systems  
We cast the kernel design problem as the construction of an accurate kernel from simple (and less accurate) base kernels. We use the boosting paradigm to perform the kernel construction process.  ...  On the USPS dataset, the performance of the Perceptron algorithm with learned kernels is systematically better than with a fixed RBF kernel.  ...  Discussion. In this paper, we showed how to use the boosting framework to design kernels.  ... 
dblp:conf/nips/CrammerKS02 fatcat:zomtrwx7orgydecil75wjd5sve

Multimodal kernel learning for image retrieval

Yen-Yu Lin, Chiou-Shann Fuh
2010 2010 International Conference on System Science and Engineering  
maintained for each modality; 3) The adopted optimization criterion in boosting is to align with a target kernel matrix accounting for relevance feedback, and the learned multimodal kernel matrix can  ...  via boosting; 2) The base kernel matrices are derived from eigendecomposing the graph Laplacian, and further refined to satisfy a pivotal monotone property that ensures intrinsic structure will be reasonably  ...  Indeed, boosting with our design of monotone base kernel (MBK) matrices plays the central role in our approach.  ... 
doi:10.1109/icsse.2010.5551790 fatcat:mxoes5xy75fulnt6mh77dqgus4

Linear kernel combination using boosting

Alexis Lechervy, Philippe Henri Gosselin, Frédéric Precioso
2012 The European Symposium on Artificial Neural Networks  
In this paper, we propose a novel algorithm to design multiclass kernels based on an iterative combination of weak kernels in a scheme inspired by the boosting framework.  ...  We evaluate our method for classification on a toy example by integrating our multi-class kernel into a kNN classifier and comparing our results with a reference iterative kernel design method.  ...  In this paper, we propose to design a linear combination of weak base kernels using the boosting paradigm, similarly to [2] .  ... 
dblp:conf/esann/LechervyGP12 fatcat:yalln24i5falpjix4ekpsarr74

Boosted Multiple Kernel Learning for First-Person Activity Recognition [article]

Fatih Ozkan, Mehmet Ali Arabaci, Elif Surer, Alptekin Temizel
2017 arXiv   pre-print
Our experimental results show that the use of Multiple Kernel Learning (MKL) and Boosted MKL in the first-person activity recognition problem exhibits improved results in comparison to the state-of-the-art.  ...  An effective activity recognition system requires the selection and use of complementary features and appropriate kernels for each feature.  ...  Boosted Multiple Kernel Learning. Boosted Multiple Kernel Learning (Boosted MKL) is an iterative approach to combine features and kernels effectively.  ... 
arXiv:1702.06799v2 fatcat:4kdn7ymdgrfu5krnhywlkom4zy

Boosting kernel combination for multi-class image categorization

Alexis Lechervy, Philippe-Henri Gosselin, Frederic Precioso
2012 2012 19th IEEE International Conference on Image Processing  
In this paper, we propose a novel algorithm to design multiclass kernel functions based on an iterative combination of weak kernels in a scheme inspired by the boosting framework.  ...  We propose to design a linear combination of base kernels using the boosting paradigm, similarly to [6] . However, we focus on a strategy for multi-class learning using many different features.  ...  LINEAR KERNEL COMBINATION USING BOOSTING. Linear combination. The aim of this paper is to design a kernel function K(·, ·) as a linear combination of base kernel functions k_t(·, ·): K_T(x_i, x_j)  ... 
doi:10.1109/icip.2012.6467254 dblp:conf/icip/LechervyGP12 fatcat:ktp7ptzwwre37pn3haqxme3j4u
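Several of the entries above (Lechervy et al., and the Crammer et al. paper that opens this list) describe the same core idea: greedily build a strong kernel K_T = Σ_t β_t k_t from weak base kernels, guided by a boosting-style criterion. The sketch below is an illustrative assumption, not any of the published algorithms: it uses unit weight increments and kernel-target alignment against the ideal ±1 label kernel as the selection criterion.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gram matrix of a Gaussian RBF kernel on the rows of X.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def alignment(K, T):
    # Kernel-target alignment <K, T>_F / (||K||_F ||T||_F).
    return np.sum(K * T) / (np.linalg.norm(K) * np.linalg.norm(T) + 1e-12)

def boost_kernel_combination(base_kernels, y, rounds=5):
    # Greedy boosting-style construction of K_T = sum_t beta_t * k_t:
    # each round adds the base kernel that most improves alignment
    # with the ideal target T_ij = +1 (same class) / -1 (different class).
    T = np.where(y[:, None] == y[None, :], 1.0, -1.0)
    K = np.zeros_like(T)
    betas = np.zeros(len(base_kernels))
    for _ in range(rounds):
        gains = [alignment(K + Kb, T) for Kb in base_kernels]
        best = int(np.argmax(gains))
        betas[best] += 1.0
        K = K + base_kernels[best]
    return K, betas
```

The resulting Gram matrix K stays symmetric positive semidefinite because it is a nonnegative combination of valid kernels, so it can be plugged directly into a kernel Perceptron or SVM.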

Experiments With Repeating Weighted Boosting Search for Optimization in Signal Processing Applications

S. Chen, X.X. Wang, C.J. Harris
2005 IEEE Transactions on Systems Man and Cybernetics Part B (Cybernetics)  
The paper proposes a guided global search optimization technique, referred to as the repeated weighted boosting search.  ...  Comparison is made with two better-known and widely used guided global search techniques, the genetic algorithm and adaptive simulated annealing, in terms of the requirements for algorithmic  ...  This subsection reports an alternative kernel classifier design approach that incrementally constructs a sparse kernel classifier using the RWBS algorithm [42] .  ... 
doi:10.1109/tsmcb.2005.845398 pmid:16128453 fatcat:54n6fzvgkndstm4dvxxdtcitju
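The key mechanism in a weighted boosting search is to weight population members multiplicatively by their cost and generate new candidates from the weighted convex combination and its reflection. The following is a loose sketch of that idea under simplified assumptions (exponential weights, replace-worst update); it is not the published RWBS algorithm, whose weight update and restart schedule differ.

```python
import numpy as np

def wbs_minimize(f, dim=2, pop=8, iters=60, lo=-5.0, hi=5.0, seed=0):
    # Maintain a small population and iteratively refine it.
    rng = np.random.default_rng(seed)
    P = rng.uniform(lo, hi, size=(pop, dim))
    costs = np.array([f(p) for p in P])
    for _ in range(iters):
        # Boosting-style multiplicative weights: lower cost, larger weight.
        beta = np.exp(-(costs - costs.min()))
        w = beta / beta.sum()
        center = w @ P                    # weighted convex combination
        worst = int(np.argmax(costs))
        mirror = 2.0 * center - P[worst]  # reflect worst point through center
        for cand in (center, mirror):
            c = f(cand)
            if c < costs[worst]:          # replace the worst member
                P[worst] = cand
                costs[worst] = c
                worst = int(np.argmax(costs))
    return P[int(np.argmin(costs))]

best = wbs_minimize(lambda x: float(np.sum(x ** 2)))
```

Because the worst member is only ever replaced by a cheaper candidate, the best cost in the population is monotonically non-increasing, which is the property that makes the search safe to repeat from random restarts.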

Boosted Multiple Kernel Learning For First-Person Activity Recognition

Mehmet Arabaci, Fatih Ozkan, Elif Surer, Alptekin Temizel
2018 Zenodo  
Boosted Multiple Kernel Learning. Boosted Multiple Kernel Learning (Boosted MKL) is an iterative approach to combine features and kernels effectively.  ...  MULTIPLE KERNEL LEARNING. In this study, we use both Multiple Kernel Learning (MKL) and Boosted MKL, whose details are explained in this section.  ... 
doi:10.5281/zenodo.1159451 fatcat:ksr2frb7xfas3gkcqhosxlewbu

From Kernel Machines to Ensemble Learning [article]

Chunhua Shen, Fayao Liu
2014 arXiv   pre-print
This finding not only enables us to design new ensemble learning methods directly from kernel methods, but also makes it possible to take advantage of those highly-optimized fast linear SVM solvers for  ...  Unlike previous studies showing the equivalence between boosting and support vector machines (SVMs), which need a translation procedure, we show that it is possible to design a boosting-like procedure to  ...  RFF is designed by using the fact that a shift-invariant kernel is the Fourier transform of a non-negative measure.  ... 
arXiv:1401.0767v1 fatcat:extsje6t4rdjdco2isrjle27we

Boosted kernel for image categorization

Alexis Lechervy, Philippe-Henri Gosselin, Frédéric Precioso
2013 Multimedia tools and applications  
In this paper, we propose a framework to learn an effective kernel function using the Boosting paradigm to linearly combine weak kernels.  ...  In such a context, a lot of effort has to be put into the design of the kernel functions and underlying high-level features.  ...  It differs from other approaches thanks to the choice of rank-one weak kernels, and more specifically to the design of the learners' target used in the Boosting process.  ... 
doi:10.1007/s11042-012-1328-1 fatcat:zjwabtpcjff43ktzs7qaao7l54

Boosted Multiple Kernel Learning for Scene Category Recognition

I-Hong Jhuo, D.T. Lee
2010 2010 20th International Conference on Pattern Recognition  
...a set of kernels, and transform the discriminant information contained in each kernel into a set of weak learners, called dyadic hypercuts.  ...  Based on this model, we present a novel approach to carrying out incremental multiple kernel learning for feature fusion by applying AdaBoost to the union of the sets of weak learners.  ...  Inspired by the good performance of multiple kernel learning (MKL) [20] , [22] , [14] , [16] , we adopt an adaptive learning approach to designing kernel machines for scene recognition.  ... 
doi:10.1109/icpr.2010.855 dblp:conf/icpr/JhuoL10 fatcat:7z643cqxgff7ndrdfbjpkdm4zq

Boosting as a kernel-based method [article]

Aleksandr Y. Aravkin, Giulio Bottegal, Gianluigi Pillonetto
2017 arXiv   pre-print
The number of boosting iterations is modeled as a continuous hyperparameter, and fit along with other parameters using standard techniques.  ...  In the context of ℓ_2 boosting, we start with a weak linear learner defined by a kernel K.  ...  This motivates the use of the robust ℓ_1 loss. (Reported accuracies: Boosting+ℓ_1 76.62%, Boosting kernel+ℓ_1 76.75%, Gaussian kernel+ℓ_1 75.19%.)  ... 
arXiv:1608.02485v2 fatcat:mzhwtmyz2jah7hvmdgeozlcuwi

Sparse incremental regression modeling using correlation criterion with boosting search

S. Chen, X.X. Wang, D.J. Brown
2005 IEEE Signal Processing Letters  
Experimental results obtained using this technique demonstrate that it offers a viable alternative to the existing state-of-the-art kernel modeling methods for constructing parsimonious models.  ...  The optimization at each regression stage is carried out with a simple search algorithm reinforced by boosting.  ...  Moreover, the regressor kernel variances also need to be determined using some other appropriate technique.  ... 
doi:10.1109/lsp.2004.842250 fatcat:jfp7hrhwtfhi5efc7jign6subq

On boosting kernel regression

Marco Di Marzio, Charles C. Taylor
2008 Journal of Statistical Planning and Inference  
In this paper we propose a simple multistep regression smoother which is constructed in an iterative manner, by learning the Nadaraya-Watson estimator with L_2 boosting.  ...  The first boosting step is analyzed in more detail, giving asymptotic expressions as functions of the smoothing parameter, and relationships with previous work are explored.  ...  We will adopt the experimental design of Hastie and Loader (1993), who considered adaptive kernels for use at the boundary.  ... 
doi:10.1016/j.jspi.2007.10.005 fatcat:eqrmedhul5htnfru4a2avtr2oe
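L_2 boosting of a kernel smoother, as studied in the entry above, amounts to repeatedly smoothing the current residuals and adding the result back into the fit. A minimal sketch, with an illustrative Gaussian bandwidth and toy data (both assumptions, not taken from the paper):

```python
import numpy as np

def nw_smooth(x, y, h):
    # Nadaraya-Watson estimate at the sample points, Gaussian kernel.
    W = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (W @ y) / W.sum(axis=1)

def l2_boost_nw(x, y, h=0.3, steps=5):
    # L2 boosting: start from zero, then at each step smooth the
    # residuals y - fit and add the smoothed residuals to the fit.
    fit = np.zeros_like(y)
    for _ in range(steps):
        fit = fit + nw_smooth(x, y - fit, h)
    return fit

x = np.linspace(0.0, 1.0, 50)
y = np.sin(2.0 * np.pi * x)
fit = l2_boost_nw(x, y, h=0.3, steps=5)
```

With a deliberately large bandwidth the one-step estimator oversmooths, and each boosting iteration reduces this bias, so the training residual sum of squares shrinks as the number of steps grows; choosing the stopping step is the bias-variance trade-off the paper analyzes.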

Pathway-Based Kernel Boosting for the Analysis of Genome-Wide Association Studies

Stefanie Friedrichs, Juliane Manitz, Patricia Burger, Christopher I. Amos, Angela Risch, Jenny Chang-Claude, Heinz-Erich Wichmann, Thomas Kneib, Heike Bickeböller, Benjamin Hofner
2017 Computational and Mathematical Methods in Medicine  
We employ genetic similarity kernels from the logistic kernel machine test (LKMT) as base-learners in a boosting algorithm.  ...  Simulations indicate that kernel boosting outperforms the LKMT in certain genetic scenarios.  ...  Model Prediction Using Kernels. Boosting specifically aims to optimize prediction accuracy.  ... 
doi:10.1155/2017/6742763 pmid:28785300 pmcid:PMC5530424 fatcat:cgfcskkjc5honnidp2xmxqvfwe

Prediction of Top Tourist Attraction Spots using Learning Algorithms

2019 International journal of recent technology and engineering  
For this purpose, four algorithms, namely Kernel Density Estimation, K-Nearest Neighbor, Random Forest, and XGBoost, have been used.  ...  The findings revealed that XGBoost yields better results in terms of accuracy than the other three algorithms.  ...  We used four machine learning algorithms, KNN, KDE, Random Forest, and XGBoost, to train and test the classifiers for tourism-related opinion mining.  ... 
doi:10.35940/ijrte.c4241.098319 fatcat:cicp6vm25zd3tnecdw7cjz7z4q
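Two of the four learners named in the entry above, a kernel-density classifier and k-nearest neighbours, can be written from scratch in a few lines. The sketch below is purely illustrative (toy data, fixed bandwidth h and neighbour count k are assumptions, unrelated to the paper's experiments):

```python
import numpy as np

def kde_classify(Xtr, ytr, Xte, h=0.5):
    # Score each class by the summed Gaussian kernel density that its
    # training points place on the query, then pick the highest score.
    classes = np.unique(ytr)
    preds = []
    for x in Xte:
        d2 = np.sum((Xtr - x) ** 2, axis=1)
        w = np.exp(-0.5 * d2 / h ** 2)
        scores = [w[ytr == c].sum() for c in classes]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

def knn_classify(Xtr, ytr, Xte, k=3):
    # Majority vote among the k nearest training points.
    preds = []
    for x in Xte:
        idx = np.argsort(np.sum((Xtr - x) ** 2, axis=1))[:k]
        vals, counts = np.unique(ytr[idx], return_counts=True)
        preds.append(vals[int(np.argmax(counts))])
    return np.array(preds)
```

KDE weights every training point by distance while KNN uses only the k closest, which is why their accuracies can diverge on unevenly dense data such as visitor records.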
Showing results 1 — 15 out of 67,636 results