Extraction of Combined Features from Global/Local Statistics of Visual Words Using Relevant Operations
2010
IEICE transactions on information and systems
Because the number of pairwise combinations of visual words is large, we apply feature selection methods, including the Fisher discriminant criterion and the L1-SVM. ...
The bag-of-features approach is the most popular approach for generic object recognition [1] because of its simplicity and effectiveness. ...
We calculate J(x_j) for j = 1, ..., P and select the M features with the largest scores. L1-regularized SVM: the L2-norm of w in the Support Vector Machine (SVM) [10] is replaced with the L1-norm. ...
doi:10.1587/transinf.e93.d.2870
fatcat:f3mto647y5ey5bxmmurjlllwte
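The per-feature selection step quoted above (compute J(x_j) for each feature, keep the M largest) can be sketched as follows. This is a minimal two-class illustration, assuming the standard per-feature Fisher criterion J(x_j) = (μ1_j − μ2_j)² / (σ1_j² + σ2_j²); the function names and toy data are hypothetical, not from the paper.

```python
import numpy as np

def fisher_scores(X, y):
    """Per-feature Fisher discriminant criterion for two classes:
    J(x_j) = (mu1_j - mu2_j)^2 / (var1_j + var2_j)."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12  # guard against zero variance
    return num / den

def select_top_m(X, y, M):
    """Rank features by Fisher score and keep the M largest."""
    J = fisher_scores(X, y)
    idx = np.argsort(J)[::-1][:M]
    return idx, X[:, idx]

# toy data: feature 0 carries the class signal, feature 1 is pure noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.array([0] * 50 + [1] * 50)
X[y == 1, 0] += 3.0
idx, X_sel = select_top_m(X, y, M=1)
print(idx)  # feature 0 should rank first
```

The L1-SVM variant mentioned in the same snippet achieves a similar effect implicitly: the L1 penalty on w drives many feature weights to exactly zero.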
A collaborative representation based projections method for feature extraction
2015
Pattern Recognition
CRP utilizes a L2 norm graph to characterize the local compactness information. ...
CRP is much faster than SPP because CRP evaluates its objective function with the L2 norm, whereas SPP evaluates its objective function with the L1 norm. ...
The proposed criterion, similar to the classical Fisher criterion, is a Rayleigh quotient form and can be calculated via generalized eigenvalue decomposition. ...
doi:10.1016/j.patcog.2014.07.009
fatcat:uyl4kfqhzfb7rbh7rqcerklio4
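The CRP snippet notes that a Rayleigh-quotient criterion, like the classical Fisher criterion, reduces to a generalized eigenvalue decomposition. A minimal NumPy sketch of that reduction, assuming symmetric A and positive-definite B (the matrices and names here are illustrative, not CRP's actual scatter matrices):

```python
import numpy as np

def rayleigh_quotient_directions(A, B, k):
    """Maximize w^T A w / w^T B w by solving A w = lam B w.
    Whitening with B^{-1/2} turns this into an ordinary symmetric
    eigenproblem, so plain eigh suffices."""
    s, U = np.linalg.eigh(B)                          # B = U diag(s) U^T, s > 0
    B_inv_sqrt = U @ np.diag(1.0 / np.sqrt(s)) @ U.T
    lam, V = np.linalg.eigh(B_inv_sqrt @ A @ B_inv_sqrt)
    return B_inv_sqrt @ V[:, ::-1][:, :k]             # back-transform, largest first

# toy symmetric matrices standing in for between-/within-class scatter
rng = np.random.default_rng(1)
M = rng.normal(size=(5, 5))
A = M @ M.T                                  # symmetric PSD
B = np.eye(5) + np.diag(0.1 * rng.random(5)) # symmetric positive definite
W = rayleigh_quotient_directions(A, B, k=2)
print(W.shape)  # (5, 2)
```

The leading column of W attains the largest value of the quotient; successive columns are B-orthogonal directions with decreasing quotient values.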
Improving Generalization Based on l1-Norm Regularization for EEG-Based Motor Imagery Classification
2018
Frontiers in Neuroscience
In general, a number of parameters are essential for an EEG classification algorithm because of the redundant features involved in EEG signals. ...
To decrease the complexity and improve the generalization of the EEG method, we present a novel l1-norm-based approach that combines the decision values obtained from each EEG channel directly. ...
ACKNOWLEDGMENTS The authors thank the reviewers for their valuable comments and thank the editors for their fruitful work. ...
doi:10.3389/fnins.2018.00272
pmid:29867307
pmcid:PMC5954047
fatcat:g6iprellqfhcze6b7d4vrpoxwi
Effective Discriminative Feature Selection With Nontrivial Solution
2016
IEEE Transactions on Neural Networks and Learning Systems
We impose row sparsity on the transformation matrix of LDA through ℓ2,1-norm regularization to achieve feature selection, and the resultant formulation optimizes for selecting the most discriminative features ...
The formulation is extended to the ℓ2,p-norm regularized case, which is more likely to offer better sparsity when 0 < p < 1. ...
To optimize for the ℓ∞-norm, M. Masaeli et al. ...
doi:10.1109/tnnls.2015.2424721
pmid:25993706
fatcat:w4fyjrbxvvaddozsopi2rwy75y
Multiclass Feature Selection With Kernel Gram-Matrix-Based Criteria
2012
IEEE Transactions on Neural Networks and Learning Systems
In this paper, we propose new feature selection methods based on two criteria designed for the optimization of SVM: Kernel Target Alignment and Kernel Class Separability. ...
Feature selection has been an important issue during the last decades to determine the most relevant features according to a given classification problem. ...
We give here a brief review of the methods. ... with an L2 norm [15]. ...
doi:10.1109/tnnls.2012.2201748
pmid:24808006
fatcat:ub2pe6gjvvhnrhgxdfxzhs5i5m
A Novel Adaptive Parameter Search Elastic Net Method for Fluorescent Molecular Tomography
2021
IEEE Transactions on Medical Imaging
For the selection of elastic net weight parameters, this approach introduces the L0 norm of valid reconstruction results and the L2 norm of the residual vector, which are used to adjust the weight parameters ...
To address these problems, this paper proposes an adaptive parameter search elastic net (APSEN) method that is based on elastic net regularization, using weight parameters to combine the L1 and L2 norms ...
Then, we use the residual vector as the criterion for judging the reconstruction result and select the solution with the smallest L2 norm of the residual vector in the search path as the optimal solution ...
doi:10.1109/tmi.2021.3057704
pmid:33556004
fatcat:yiao7jhru5cufbqrmoxoeqictm
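The APSEN snippets above describe two ingredients: an elastic net that mixes L1 and L2 penalties through weight parameters, and a selection rule that keeps the solution along the search path whose residual vector has the smallest L2 norm. A minimal sketch of both, assuming a plain proximal-gradient (ISTA) solver in place of the paper's actual reconstruction algorithm; all function names and the parameter grid are hypothetical:

```python
import numpy as np

def elastic_net_ista(X, y, alpha, l1_ratio, iters=500):
    """Minimize 0.5||y - Xw||^2 + alpha*(l1_ratio*||w||_1
    + 0.5*(1 - l1_ratio)*||w||_2^2) by proximal gradient descent."""
    w = np.zeros(X.shape[1])
    # step = 1 / Lipschitz constant of the smooth part
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 + alpha * (1 - l1_ratio))
    for _ in range(iters):
        grad = X.T @ (X @ w - y) + alpha * (1 - l1_ratio) * w
        z = w - step * grad
        w = np.sign(z) * np.maximum(np.abs(z) - step * alpha * l1_ratio, 0.0)
    return w

def search_best_weights(X, y, alphas, l1_ratios):
    """Scan the weight-parameter grid and keep the solution whose
    residual vector has the smallest L2 norm (the APSEN-style rule)."""
    best = None
    for a in alphas:
        for r in l1_ratios:
            w = elastic_net_ista(X, y, a, r)
            res = np.linalg.norm(y - X @ w)
            if best is None or res < best[0]:
                best = (res, a, r, w)
    return best

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 10))
w_true = np.zeros(10); w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.05 * rng.normal(size=40)
res, a, r, w = search_best_weights(X, y, alphas=[0.01, 0.1, 1.0], l1_ratios=[0.2, 0.5, 0.9])
print(round(res, 3), a, r)
```

Selecting purely by residual norm favors weak regularization on clean data, which is why the paper also brings in the L0 norm of valid reconstruction results to balance the choice.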
L1-Norm Kernel Discriminant Analysis Via Bayes Error Bound Optimization for Robust Feature Extraction
2014
IEEE Transactions on Neural Networks and Learning Systems
With the L1-norm discriminant criterion, we propose a new linear discriminant analysis (L1-LDA) method for the linear feature extraction problem. ...
In contrast to the conventional Fisher discriminant criterion, the major novelty of the proposed one is the use of the L1 norm rather than the L2 norm, which makes it less sensitive to outliers. ...
feature selection. ...
doi:10.1109/tnnls.2013.2281428
pmid:24807955
fatcat:x4xpmquz6be2jobb266vebjdei
Feature extraction using fuzzy complete linear discriminant analysis
2012
2012 IEEE International Conference on Fuzzy Systems
In addition, experiments are provided for analyzing and illustrating our results. ...
In pattern recognition, feature extraction techniques are widely employed for dimensionality reduction. ...
First, we calculate the optimal irregular discriminant vectors. For an arbitrary vector w ∈ N(Ŝw), the Fisher criterion becomes infinite. ...
doi:10.1109/fuzz-ieee.2012.6250813
dblp:conf/fuzzIEEE/CuiJ12
fatcat:5c64os2hq5dt5pnuljzeqfvycq
The role of dictionary learning on sparse representation-based classification
2013
Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA '13
Representative selection methods which are analyzed in this paper include Metaface dictionary learning, Fisher Discriminative Dictionary Learning (FDDL), Sparse Modeling Representative Selection (SMRS) ...
The first two methods build their own dictionaries via an optimization process while the other two methods select the representatives directly from the original training samples. ...
Step 1: Each column of Di is initialized as a random vector with unit l2 norm. Step 2: Fix Di and solve for Xi for all classes (i = 1, 2, ..., c). ...
doi:10.1145/2504335.2504385
dblp:conf/petra/ShafieeKAH13
fatcat:jvwd4xagyfcsjoz2adq66fcvny
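The two quoted steps of the dictionary-learning loop can be sketched directly. This is a minimal illustration, assuming hypothetical names, and it stands in a plain least-squares solve for the coding subproblem (the surveyed methods use l1-regularized sparse solvers instead):

```python
import numpy as np

def init_dictionary(d, k, rng):
    """Step 1: random atoms, each column normalized to unit l2 norm."""
    D = rng.normal(size=(d, k))
    return D / np.linalg.norm(D, axis=0, keepdims=True)

def codes_with_fixed_dict(D, Y):
    """Step 2 sketch: with D fixed, solve for the coefficient matrix X.
    A least-squares solve stands in here for the sparse coding step."""
    X, *_ = np.linalg.lstsq(D, Y, rcond=None)
    return X

rng = np.random.default_rng(3)
D = init_dictionary(d=8, k=5, rng=rng)   # 8-dim signals, 5 atoms
Y = rng.normal(size=(8, 4))              # 4 training samples as columns
X = codes_with_fixed_dict(D, Y)
print(X.shape)  # (5, 4): one coefficient vector per sample
```

A full learning loop would then alternate: fix X, update the atoms of D (renormalizing each to unit norm), and repeat until the reconstruction error stabilizes.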
Weakly Supervised Object Localization with Large Fisher Vectors
2015
Proceedings of the 10th International Conference on Computer Vision Theory and Applications
The main ingredients of our method are a large Fisher vector representation and a sparse classification model enabling efficient evaluation of patch scores. ...
We propose a novel method for learning object localization models in a weakly supervised manner, by employing images annotated with object class labels but not with object locations. ...
ACKNOWLEDGEMENTS This work has been supported by the project VISTA -Computer Vision Innovations for Safe Traffic, IPA2007/HR/16IPO/001-040514 which is cofinanced by the European Union from the European ...
doi:10.5220/0005294900440053
dblp:conf/visapp/KrapacS15
fatcat:yb3yb4cq4bgclcxtgmrqygcn2q
Cross-modal localization through mutual information
2009
2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
As opposed to the conventional L2 regularization, the proposed method leads to faster convergence with much reduced spurious associations. ...
Ability of L1 regularization to enforce sparseness of the solution is exploited to identify a subset of signals that are related to each other, from among a large number of sensor outputs. ...
It is observed that applying the L1-norm penalty to the optimization produced faster convergence, at iteration 59 compared with iteration 141 for the L2-norm penalty (Fig. 6). Experiment 2: Laser ...
doi:10.1109/iros.2009.5354200
dblp:conf/iros/AlempijevicKD09
fatcat:4hy7phprhreqvhjqzkrr2wqulm
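The contrast drawn above, that L1 regularization enforces sparseness while L2 only shrinks, is easiest to see in the orthonormal-design case, where both penalized solutions have closed forms. A minimal sketch (illustrative values, not the paper's sensor data):

```python
import numpy as np

# For an orthonormal design the penalized solutions are coordinate-wise:
#   L2 (ridge):  w_j = y_j / (1 + lam)          -- shrinks, never exactly zero
#   L1 (lasso):  w_j = soft-threshold(y_j, lam) -- zeroes small coefficients
def ridge_orth(y, lam):
    return y / (1.0 + lam)

def lasso_orth(y, lam):
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

y = np.array([3.0, 0.2, -0.1, 2.5, 0.05])  # two strong signals, three weak ones
w_l2 = ridge_orth(y, lam=0.5)
w_l1 = lasso_orth(y, lam=0.5)
print(np.count_nonzero(w_l2), np.count_nonzero(w_l1))  # 5 2
```

The L1 solution keeps only the two strong coefficients and zeroes the rest, which is exactly the mechanism the paper exploits to identify the subset of related sensor signals.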
A Review of Feature Selection and Its Methods
2019
Cybernetics and Information Technologies
The Dimensionality Reduction (DR) can be handled in two ways namely Feature Selection (FS) and Feature Extraction (FE). ...
This paper focuses on a survey of feature selection methods, from this extensive survey we can conclude that most of the FS methods use static data. ...
To achieve interpretability, the technique uses a feature-level self-representation loss function; similarly, to provide stability for subspace learning, it uses l2,1-norm regularization. ...
doi:10.2478/cait-2019-0001
fatcat:qmykdldi6fgibihjkdmuhj2sdu
Supervised Kernel Optimized Locality Preserving Projection with Its Application to Face Recognition and Palm Biometrics
2015
Mathematical Problems in Engineering
However, the conventional SKLPP algorithm suffers from the kernel selection problem, which has a significant impact on the performance of SKLPP. ...
Consequently, the nonlinear features extracted by SKOLPP have greater discriminative ability than those of SKLPP and are more adaptive to the input data. ...
Lu and Tan [11] proposed a parametric regularized LPP. Pang and Yuan [12] proposed substituting the L1-norm for the L2-norm to improve the robustness of LPP against outliers. ...
doi:10.1155/2015/421671
fatcat:htlmlqak6rbhjfhw4etaku75rm
Codemaps - Segment, Classify and Search Objects Locally
2013
2013 IEEE International Conference on Computer Vision
As a first novelty, we introduce ℓ2 normalization for arbitrarily shaped image regions, which is fast enough for semantic segmentation using our Fisher codemaps. ...
Results demonstrate that ℓ2-normalized Fisher codemaps improve the state of the art in semantic segmentation on PASCAL VOC. ...
doi:10.1109/iccv.2013.454
dblp:conf/iccv/LiGSSS13
fatcat:kfd2xv7jujbgnepotc7gigrlpi
Face Recognition using an Affine Sparse Coding approach
2017
Journal of Artificial Intelligence and Data Mining
In this paper, we propose an Affine Graph Regularized Sparse Coding approach for face recognition problem. ...
Sparse coding has attracted increasing attention for image classification applications in recent years. ...
Other methods, such as graph regularization [11] and a weighted ℓ2-norm constraint, have also been introduced to improve the sparse representation. ...
doi:10.22044/jadm.2017.890
doaj:3cfd3b4ec7df458abe6391cf4e2f4467
fatcat:a6vudtlbhzda3aoa66zemnip4e
Showing results 1 — 15 out of 1,005 results