
Optimal combination of feature selection and classification via local hyperplane based learning strategy

Xiaoping Cheng, Hongmin Cai, Yue Zhang, Bo Xu, Weifeng Su
2015 BMC Bioinformatics  
Through classification accuracy-based iterations, LHDA obtains the feature weight vector and finally extracts the optimal feature subset.  ...  First, it uses a local approximation rather than a global measurement; second, it embeds a recently reported classification model, the K-Local Hyperplane Distance Nearest Neighbor (HKNN) classifier, into its  ...  Sun for the source code of I-RELIEF and Miss Peiyin Ruan for her help in compiling the codes from RankGene.  ... 
doi:10.1186/s12859-015-0629-6 pmid:26159165 pmcid:PMC4498526 fatcat:ahk27bxtzzdvdonwqjogxjbcni
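
The HKNN classifier mentioned in this abstract assigns a query to the class whose local hyperplane, spanned by the K nearest same-class neighbours, lies closest. A minimal NumPy sketch under that reading; the function name, K, and the regulariser lam are illustrative choices, not the paper's LHDA pipeline:

```python
import numpy as np

def hknn_predict(x, X_train, y_train, K=5, lam=1.0):
    """K-Local Hyperplane Distance Nearest Neighbor: classify x by its
    distance to the local affine hull of the K nearest same-class points."""
    best_cls, best_dist = None, np.inf
    for cls in np.unique(y_train):
        Xc = X_train[y_train == cls]
        # K nearest neighbours of x within this class
        idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:K]
        N = Xc[idx]                       # (K, d) local neighbourhood
        p = N.mean(axis=0)                # reference point on the hyperplane
        H = (N - p).T                     # (d, K) spanning directions
        # regularised least squares: min_a ||x - p - H a||^2 + lam ||a||^2
        A = H.T @ H + lam * np.eye(H.shape[1])
        a = np.linalg.solve(A, H.T @ (x - p))
        dist = np.linalg.norm(x - p - H @ a)
        if dist < best_dist:
            best_cls, best_dist = cls, dist
    return best_cls
```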

Learning to Localize Actions from Moments [article]

Fuchen Long and Ting Yao and Zhaofan Qiu and Xinmei Tian and Jiebo Luo and Tao Mei
2020 arXiv   pre-print
Technically, a weight transfer function is uniquely devised to build the transformation between classification of action moments or foreground video segments and action localization in synthetic contextual  ...  Source code and data are available at .  ...  Weight Transfer Between Classification and Localization Given the feature maps of an untrimmed video in 1D ConvNet, temporal boundary regression and action classification can be optimized for each anchor  ... 
arXiv:2008.13705v1 fatcat:pmgj2u7ulzdbnc6vfmqccl7hyy
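
The snippet only names a weight transfer function between the classification and localization heads. The toy sketch below assumes a simple linear-plus-ReLU mapping from per-class classification weights to localization weights; all shapes, names, and the ReLU choice are hypothetical, not the authors' actual design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: C action classes, classification-head weights of
# dimension d_cls, localization-head weights of dimension d_loc.
C, d_cls, d_loc = 20, 256, 256
w_cls = rng.normal(size=(C, d_cls))            # per-class classification weights
T = rng.normal(scale=0.01, size=(d_cls, d_loc))  # transfer-function parameters

def transfer(w_cls, T):
    """Map classification weights to localization weights (linear + ReLU)."""
    return np.maximum(w_cls @ T, 0.0)          # (C, d_loc)

w_loc = transfer(w_cls, T)

# A per-anchor localization score can then reuse the transferred weights:
anchor_feat = rng.normal(size=d_loc)           # 1D ConvNet feature at one anchor
loc_scores = w_loc @ anchor_feat               # one score per class
```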

Efficient Local Flexible Nearest Neighbor Classification [chapter]

Carlotta Domeniconi, Dimitrios Gunopulos
2002 Proceedings of the 2002 SIAM International Conference on Data Mining  
Such a direction provides a local weighting scheme for input features.  ...  both simulated and real data sets.  ...  The gradient vector computed at points on the boundary allows us to capture such information, and to use it for measuring local feature relevance and weighting features accordingly.  ... 
doi:10.1137/1.9781611972726.21 dblp:conf/sdm/DomeniconiG02 fatcat:vq2sczslajgmjhtujtvbo7ugfy
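
As a rough illustration of the idea in this abstract, the sketch below takes any scalar decision function f, estimates its gradient at a point by finite differences, turns the gradient magnitudes into local feature weights, and uses them in a weighted k-NN vote. The exponential weighting constant c and the function names are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def feature_weights(f, x, h=1e-3, c=5.0):
    """Local feature weights from the gradient of a decision function f at x:
    dimensions along which f changes fastest (locally most discriminant)
    receive larger weights."""
    d = x.shape[0]
    grad = np.array([(f(x + h * np.eye(d)[j]) - f(x - h * np.eye(d)[j])) / (2 * h)
                     for j in range(d)])
    r = np.abs(grad) / (np.abs(grad).sum() + 1e-12)
    w = np.exp(c * r)
    return w / w.sum()

def weighted_knn_predict(x, X_train, y_train, w, k=5):
    """k-NN majority vote under the locally weighted Euclidean metric."""
    dists = np.sqrt(((X_train - x) ** 2 * w).sum(axis=1))
    nn = np.argsort(dists)[:k]
    vals, counts = np.unique(y_train[nn], return_counts=True)
    return vals[np.argmax(counts)]

# Usage with any scalar decision function f (e.g. an SVM decision value):
# w = feature_weights(f, x_query); y_hat = weighted_knn_predict(x_query, X, y, w)
```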

Research on Long Text Classification Model Based on Multi-Feature Weighted Fusion

Xi Yue, Tao Zhou, Lei He, Yuxia Li
2022 Applied Sciences  
combine attention mechanisms to obtain weighted local features, fuse global contextual features with weighted local features, and obtain classification results by equal-length convolutional pooling.  ...  A long text classification model based on multi-feature weighted fusion is proposed for the problems of contextual semantic relations, long-distance global relations, and multi-sense words in long text  ...  fusion of multi-level weighted local features with global features is validated for long text classification tasks.  ... 
doi:10.3390/app12136556 fatcat:wlmbriaxbzhw5hme5qz4qtj6nu
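
A minimal sketch of the weighted-fusion step described above, assuming local window features are attention-weighted with a single learned query vector and then concatenated with a global contextual feature; the shapes and the attention form are illustrative, not the paper's exact architecture:

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def fuse_local_global(local_feats, global_feat, attn_vec):
    """Attention-weight local features, then fuse the weighted local summary
    with the global contextual feature by concatenation."""
    scores = local_feats @ attn_vec            # (n_local,) attention scores
    alpha = softmax(scores)                    # attention weights
    weighted_local = alpha @ local_feats       # (d_local,) weighted summary
    return np.concatenate([weighted_local, global_feat])

# Example shapes: 12 local windows with 128-d features, a 256-d global feature.
rng = np.random.default_rng(1)
fused = fuse_local_global(rng.normal(size=(12, 128)),
                          rng.normal(size=256),
                          rng.normal(size=128))
```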

A Comparative Study for Condition Monitoring on Wind Turbine Blade using Vibration Signals through Statistical Features: a Lazy Learning Approach

A. Joshuva, V. Sugumaran
2018 International Journal of Engineering & Technology  
In phase 3, the selected features are classified using machine learning classifiers, namely K-star (KS), locally weighted learning (LWL), nearest neighbour (NN), k-nearest neighbours (kNN), instance based  ...  Here, the study is carried out in three phases, namely feature extraction, feature selection and feature classification.  ...  for locally weighted learning (LWL) Table 6: Class-wise accuracy of locally weighted learning (LWL)  ... 
doi:10.14419/ijet.v7i4.10.20833 fatcat:a7cnzwl4rvgvphm4xm5tzkjlmq
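
Of the lazy learners listed, locally weighted learning (LWL) is the least standard. A minimal sketch, assuming a Gaussian distance kernel over the k nearest neighbours (the bandwidth tau is an illustrative choice):

```python
import numpy as np

def lwl_predict(x, X_train, y_train, k=20, tau=1.0):
    """Locally weighted learning for classification: neighbours vote with a
    distance-decaying (Gaussian) kernel weight instead of equal votes."""
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]                      # restrict to the k nearest
    w = np.exp(-(d[nn] ** 2) / (2 * tau ** 2))  # kernel weights
    classes = np.unique(y_train)
    votes = np.array([w[y_train[nn] == c].sum() for c in classes])
    return classes[np.argmax(votes)]
```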

Multiple view based 3D object classification using ensemble learning of local subspaces

Jianing Wu, Kazuhiro Fukui
2008 Pattern Recognition (ICPR), Proceedings of the International Conference on  
Firstly we attempt to approximate a distribution of feature vectors with multiple local subspaces.  ...  Multiple observation improves the performance of 3D object classification.  ...  Do for t = 1, ..., N (N is the number of learning samples): record the number of classification errors E_{B_j^c} for each local subspace B_j^c; calculate the training error rate ε_{B_j^c} = E_{B_j^c} / N; calculate the weight:  ... 
doi:10.1109/icpr.2008.4761356 dblp:conf/icpr/WuF08 fatcat:ld5gdmsugjgu7edcw2wjbqa2km
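
The snippet cuts off at "calculate the weight"; assuming the standard AdaBoost-style weight alpha = 0.5 * ln((1 - eps) / eps) (an assumption, since the paper's formula is truncated here), the per-subspace computation would look like:

```python
import numpy as np

def subspace_weight(errors, n_samples):
    """Training error rate and ensemble weight for one local subspace
    classifier B_j^c, using the standard AdaBoost weighting as a stand-in."""
    eps = errors / n_samples
    eps = float(np.clip(eps, 1e-12, 1 - 1e-12))  # avoid log(0) / division by zero
    alpha = 0.5 * np.log((1.0 - eps) / eps)
    return eps, alpha

# e.g. a local subspace that misclassifies 12 of 200 training samples:
eps, alpha = subspace_weight(errors=12, n_samples=200)   # eps = 0.06, alpha ~ 1.37
```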

Large Margin Nearest Neighbor Classifiers

C. Domeniconi, D. Gunopulos, J. Peng
2005 IEEE Transactions on Neural Networks  
Such a direction provides a local feature weighting scheme. We formally show that our method increases the margin in the weighted space where classification takes place.  ...  We demonstrate the efficacy of our method using both real and simulated data. Index Terms: Feature relevance, margin, nearest neighbor classification, support vector machines (SVMs).  ...  The gradient vector computed at points on the boundary allows us to capture such information, and to use it for measuring local feature relevance and weighting features accordingly.  ... 
doi:10.1109/tnn.2005.849821 pmid:16121731 fatcat:p7dgrzokz5aefg24aki7wskp3i

Adaptive Nearest Neighbor Classification Using Support Vector Machines

Carlotta Domeniconi, Dimitrios Gunopulos
2001 Neural Information Processing Systems  
Such a direction provides a local weighting scheme for input features.  ...  We present experimental evidence of classification performance improvement over the SVM algorithm alone and over a variety of adaptive learning schemes, by using both simulated and real data sets.  ...  Table 1: Average classification error rates for simulated and real data.  ... 
dblp:conf/nips/DomeniconiG01 fatcat:hsg4pbumsvahvki5fugjksc53e

Acoustic Scene Classification Using Efficient Summary Statistics and Multiple Spectro-Temporal Descriptor Fusion

Jiaxing Ye, Takumi Kobayashi, Nobuyuki Toyama, Hiroshi Tsuda, Masahiro Murakawa
2018 Applied Sciences  
A more critical issue lies in how to logically 'summarize' those local details into a compact feature vector for scene classification.  ...  Subsequently, robust acoustic features for scene classification can be efficiently characterized.  ...  data and classification hyperplane in the (kernel) feature space.  ... 
doi:10.3390/app8081363 fatcat:j3kn5jsycfdc5ckh2y5akmbzma
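
One way to read "summarize local details into a compact feature vector" is per-dimension summary statistics pooled over frames. A sketch under that assumption; the chosen statistics and the concatenation-based fusion are illustrative, not the paper's exact descriptor set:

```python
import numpy as np

def summarize(frames):
    """Summarize frame-level spectro-temporal descriptors (T, d) into one
    compact scene-level vector using summary statistics per dimension
    (mean, standard deviation, skewness)."""
    mu = frames.mean(axis=0)
    sd = frames.std(axis=0) + 1e-12
    skew = (((frames - mu) / sd) ** 3).mean(axis=0)
    return np.concatenate([mu, sd, skew])        # (3 * d,)

def fuse_descriptors(descriptor_list):
    """Fuse several local descriptor streams (e.g. different spectro-temporal
    filters) by concatenating their summary vectors before classification."""
    return np.concatenate([summarize(f) for f in descriptor_list])
```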

LDA/SVM driven nearest neighbor classification

Jing Peng, D.R. Heisterkamp, H.K. Dai
2003 IEEE Transactions on Neural Networks  
We use local support vector machine learning to estimate an effective metric for producing neighborhoods that are elongated along less discriminant feature dimensions and constricted along most discriminant  ...  We propose a locally adaptive neighborhood morphing classification method to try to minimize bias.  ...  This paper presents an adaptive metric method for effective pattern classification.  ... 
doi:10.1109/tnn.2003.813835 pmid:18238072 fatcat:5c5fgw3idrdn5e7flbl6wio5lq

Multiscale Dense Cross-Attention Mechanism with Covariance Pooling for Hyperspectral Image Scene Classification

Runmin Liu, Xin Ning, Weiwei Cai, Guangjun Li
2021 Mobile Information Systems  
Unlike traditional algorithms that assign attention weights in a one-way manner, which leads to the loss of feature information, the dense cross-attention mechanism proposed in this study can jointly distribute  ...  However, in the classification process, high-dimensionality problems with large amounts of data and feature redundancy with interspectral correlation of hyperspectral images have not been solved efficiently  ...  However, it is time-consuming and labor-intensive to label the data used for HSI classification and to use it for training. The annotation data are very limited.  ... 
doi:10.1155/2021/9962057 doaj:fe0b8170afd44a0da84b35c6a278c08f fatcat:w3opirm7nzezpgp6xztmdwopr4
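
Covariance pooling itself is a standard second-order pooling step. A minimal sketch, assuming an (H, W, C) feature map and vectorization of the upper triangle of the channel covariance matrix (the cross-attention part of the paper is not reproduced here):

```python
import numpy as np

def covariance_pooling(feature_map):
    """Second-order (covariance) pooling of a feature map.
    feature_map: (H, W, C) activations; returns the vectorized upper
    triangle of the C x C channel covariance matrix."""
    H, W, C = feature_map.shape
    X = feature_map.reshape(-1, C)               # (H*W, C) local descriptors
    X = X - X.mean(axis=0, keepdims=True)
    cov = X.T @ X / (X.shape[0] - 1)             # (C, C) channel covariance
    iu = np.triu_indices(C)
    return cov[iu]                               # compact pooled descriptor
```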

Boosting local feature descriptors for automatic objects classification in traffic scene surveillance

Zhaoxiang Zhang, Min Li, Kaiqi Huang, Tieniu Tan
2008 Pattern Recognition (ICPR), Proceedings of the International Conference on  
In this paper, we propose a new strategy for object classification by boosting different local feature descriptors in motion blobs.  ...  We not only evaluate the performance of each local feature descriptor, but also fuse these descriptors to achieve better performance.  ...  The authors also thank the anonymous reviewers for their valuable comments.  ... 
doi:10.1109/icpr.2008.4761317 dblp:conf/icpr/ZhangLHT08 fatcat:b3zdeylylzhe5megndtzevqt3i

A feature fusion based localized multiple kernel learning system for real world image classification

Fatemeh Zamani, Mansour Jamzad
2017 EURASIP Journal on Image and Video Processing  
To provide a solution for the second challenge, we use the idea of a localized MKL by assigning separate local weights to each kernel.  ...  This paper proposes a feature fusion based multiple kernel learning (MKL) model for image classification. By using multiple kernels extracted from multiple features, we address the first challenge.  ...  Availability of data and materials: Not applicable. Competing interests: The authors declare that they have no competing interests.  ... 
doi:10.1186/s13640-017-0225-y fatcat:tug4nshccrgyjnwsqvpbaqunji
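
A sketch of the localized-MKL idea of per-kernel local weights, assuming RBF base kernels and a softmax gating function beta_m(x) computed from a linear score; the gating form and parameter names are assumptions, not the paper's formulation:

```python
import numpy as np

def rbf(X, Z, gamma):
    """RBF base kernel between rows of X (n, d) and Z (m, d)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def localized_kernel(X, Z, gammas, gate_params):
    """Localized multiple kernel: each base kernel K_m is modulated by a
    sample-dependent gating weight beta_m(x), giving
    K(x, z) = sum_m beta_m(x) * K_m(x, z) * beta_m(z)."""
    def gates(P):                                 # (n, M) local kernel weights
        s = P @ gate_params                       # gate_params: (d, M)
        e = np.exp(s - s.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)
    Bx, Bz = gates(X), gates(Z)
    K = np.zeros((X.shape[0], Z.shape[0]))
    for m, g in enumerate(gammas):
        K += np.outer(Bx[:, m], Bz[:, m]) * rbf(X, Z, g)
    return K
```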

Distributed hoeffding trees for pocket data mining

Frederic Stahl, Mohamed Medhat Gaber, Max Bramer, Philip S Yu
2011 2011 International Conference on High Performance Computing & Simulation  
Hoeffding tree techniques have been experimentally and analytically validated for data stream classification.  ...  In this paper, we have proposed, developed and evaluated the adoption of distributed Hoeffding trees for classifying streaming data in PDM applications.  ...  Table 2: Weights and Local Accuracies for Outliers  ... 
doi:10.1109/hpcsim.2011.5999893 dblp:conf/ieeehpcs/StahlGBY11 fatcat:ufq2ufuxtbaafl6udgtsvfjqbq
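
The table caption above suggests classifier weights derived from local accuracies. A minimal weighted-majority-vote sketch under that reading (the weighting rule is an assumption, not necessarily the paper's exact scheme):

```python
import numpy as np

def weighted_vote(predictions, local_accuracies):
    """Combine class predictions from several distributed classifiers
    (e.g. Hoeffding trees) by weighted majority voting, with each classifier
    weighted by its locally estimated accuracy."""
    predictions = np.asarray(predictions)          # (n_classifiers,)
    weights = np.asarray(local_accuracies, float)  # (n_classifiers,)
    classes = np.unique(predictions)
    scores = np.array([weights[predictions == c].sum() for c in classes])
    return classes[np.argmax(scores)]

# Three trees predicting classes 1, 0, 1 with local accuracies 0.9, 0.6, 0.7:
print(weighted_vote([1, 0, 1], [0.9, 0.6, 0.7]))   # -> 1
```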

Feature Subset Selection for Cancer Classification Using Weight Local Modularity

Guodong Zhao, Yan Wu
2016 Scientific Reports  
This study develops a novel feature selection (FS) method for gene subset selection by utilizing the Weight Local Modularity (WLM) in a complex network, called the WLMGS.  ...  In the proposed method, the discriminative power of a gene subset is evaluated by using the weight local modularity of a weighted sample graph in the gene subset, where the intra-class distance is small and  ...  The authors are very grateful to the reviewers for their thorough reading, many valuable comments and rather helpful suggestions, and also thank the anonymous editor for the helpful  ... 
doi:10.1038/srep34759 pmid:27703256 pmcid:PMC5050509 fatcat:ygdupk6quvhgtpux5axlimt63q
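
A simplified stand-in for scoring a candidate gene subset with a modularity-style criterion on a weighted sample graph, assuming a Gaussian-weighted symmetrized k-NN graph and Newman's weighted modularity with class labels as communities; the paper's actual WLM measure differs in detail:

```python
import numpy as np

def subset_score(X_sub, y, k=5):
    """Score a gene subset: build a k-NN sample graph in the subset's feature
    space and measure how strongly edge weight concentrates within classes."""
    n = X_sub.shape[0]
    D = np.linalg.norm(X_sub[:, None, :] - X_sub[None, :, :], axis=-1)
    sigma = np.median(D[D > 0]) + 1e-12
    W = np.exp(-(D ** 2) / (2 * sigma ** 2))       # similarity weights
    np.fill_diagonal(W, 0.0)
    # keep only each sample's k strongest edges (symmetrized k-NN graph)
    mask = np.zeros_like(W, dtype=bool)
    for i in range(n):
        mask[i, np.argsort(W[i])[-k:]] = True
    W = W * (mask | mask.T)
    # weighted modularity with the class labels as communities
    two_m = W.sum()
    deg = W.sum(axis=1)
    same = (y[:, None] == y[None, :])
    Q = ((W - np.outer(deg, deg) / two_m) * same).sum() / two_m
    return Q   # larger Q means the subset separates the classes better
```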
Showing results 1 — 15 out of 590,214 results