8,094 Hits in 3.9 sec

Supervised Discriminative Sparse PCA with Adaptive Neighbors for Dimensionality Reduction [article]

Zhenhua Shi, Dongrui Wu, Jian Huang, Yu-Kai Wang, Chin-Teng Lin
2020 arXiv   pre-print
As a result, both global and local data structures, as well as the label information, are used for better dimensionality reduction.  ...  Dimensionality reduction is an important operation in information visualization, feature extraction, clustering, regression, and classification, especially for processing noisy high dimensional data.  ...  Projective unsupervised flexible embedding with optimal graph [26] combines PCAN and ridge regression for image and video representation.  ... 
arXiv:2001.03103v2 fatcat:kw4whldrfzguzpo5ldxyu3ndyi
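A minimal way to experiment with the sparse-PCA side of this entry is plain unsupervised sparse PCA in scikit-learn. This is only a baseline sketch, not the paper's SDSPCA method, which additionally uses label information and an adaptively learned neighbor graph; all parameter values below are arbitrary illustrations.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# Toy high-dimensional data: 100 samples, 20 noisy features
rng = np.random.RandomState(0)
X = rng.randn(100, 20)

# Plain unsupervised sparse PCA; the paper's method additionally exploits
# labels and an adaptive neighbor graph, which are not modeled here
spca = SparsePCA(n_components=5, alpha=1.0, random_state=0)
Z = spca.fit_transform(X)

print(Z.shape)  # (100, 5): reduced representation
print(float(np.mean(spca.components_ == 0)))  # fraction of exactly-zero loadings
```

The exact zeros in `components_` are what distinguish sparse PCA from ordinary PCA: each reduced dimension depends on only a few original features.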

Linear Manifold Regularization with Adaptive Graph for Semi-supervised Dimensionality Reduction

Kai Xiong, Feiping Nie, Junwei Han
2017 Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence  
To overcome the drawbacks, in this paper, we propose a novel approach called linear manifold regularization with adaptive graph (LMRAG) for semi-supervised dimensionality reduction.  ...  Many previous graph-based methods perform dimensionality reduction on a pre-defined graph.  ...  To overcome the drawbacks, it is natural to consider learning an adaptive graph that is optimal for dimensionality reduction.  ... 
doi:10.24963/ijcai.2017/439 dblp:conf/ijcai/XiongNH17 fatcat:rtr566w2ljhwrmcn5k7fq65g7m

Guest editorial: special issue on data mining with matrices, graphs and tensors

Tao Li, Chris Ding, Fei Wang
2011 Data mining and knowledge discovery  
The field of data mining increasingly adapts methods and algorithms from advanced matrix computations, graph theory and optimization.  ...  These matrix-formulated optimization-centric methodologies are rapidly becoming a significant part of data mining, and have evolved into a popular and rapidly expanding research area for solving challenging  ...  Geoff Webb for his great help and support in organizing the issue.  ... 
doi:10.1007/s10618-011-0214-1 fatcat:n4mf2ceoqzeitjdwq2opwtlm6e

Classification using non-standard metrics

Barbara Hammer, Thomas Villmann
2005 The European Symposium on Artificial Neural Networks  
Thereby, we focus on general unifying principles of learning using non-standard metrics and metric adaptation.  ...  This procedure is beneficial for data in Euclidean space and it is crucial for more complex data structures such as occur in bioinformatics or natural language processing.  ...  This idea includes ℓp (Minkowski) norms with p ≠ 2 [16], reduction of the dimensionality [14], or feature selection as summarized e.g. in [24].  ... 
dblp:conf/esann/HammerV05 fatcat:xum262rsy5fexlmufa7lsw7ldu

Unsupervised Machine Learning for Networking: Techniques, Applications and Research Challenges

Muhammad Usama, Junaid Qadir, Aunn Raza, Hunain Arif, Kok-lim Alvin Yau, Yehia Elkhatib, Amir Hussain, Ala Al-Fuqaha
2019 IEEE Access  
and optimal control (e.g., for developing autonomous self-driving cars).  ...  In addition, unsupervised learning can free us from the need for labeled data and manual handcrafted feature engineering, thereby facilitating flexible, general, and automated methods of machine  ...  a lot of promise for advancing the state of the art in networking in terms of adaptability, flexibility, and efficiency.  ... 
doi:10.1109/access.2019.2916648 fatcat:xutxh3neynh4bgcsmugxsclkna

Unsupervised Machine Learning for Networking: Techniques, Applications and Research Challenges [article]

Muhammad Usama, Junaid Qadir, Aunn Raza, Hunain Arif, Kok-Lim Alvin Yau, Yehia Elkhatib, Amir Hussain, Ala Al-Fuqaha
2017 arXiv   pre-print
Unsupervised learning is interesting since it can free us from the need for labeled data and manual handcrafted feature engineering, thereby facilitating flexible, general, and automated methods of  ...  control (e.g., for developing autonomous self-driving cars).  ...  a lot of promise for advancing the state of the art in networking in terms of adaptability, flexibility, and efficiency.  ... 
arXiv:1709.06599v1 fatcat:llcg6gxgpjahha6bkhsitglrsm

Object Tracking via Dynamic Feature Selection Processes [article]

Giorgio Roffo, Simone Melzi
2016 arXiv   pre-print
A feature selection mechanism is embedded in the Adaptive Colour Names (CN) tracking system that adaptively selects the top-ranked discriminative features for tracking.  ...  By using a fast online algorithm for learning dictionaries, the size of the box is adapted during the processing.  ...  We also reduce the learning rate for the adaptive dimensionality reduction to 0.1. The number of selected features is up to 8, and the dimensionality of the compressed features is 4.  ... 
arXiv:1609.01958v1 fatcat:h745ggf4zfgcbmeywfb26gwl3y

Dimensionality Reduction for Spectral Clustering

Donglin Niu, Jennifer G. Dy, Michael I. Jordan
2011 Journal of machine learning research  
We optimize this functional over both the projection and the spectral embedding.  ...  Spectral clustering is a flexible clustering methodology that is applicable to a variety of data types and has the particular virtue that it makes few assumptions on cluster shapes.  ...  Section 3 presents sufficient dimensionality reduction for unsupervised learning and relates it to spectral clustering.  ... 
dblp:journals/jmlr/NiuDJ11 fatcat:xfzfpqqlmrbxbdkt744ayg5yra
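The virtue this abstract highlights — few assumptions on cluster shapes — is easy to see on a toy example. The sketch below runs plain spectral clustering on two interleaving half-moons; it does not implement the paper's joint optimization over projection and spectral embedding, only the standard spectral step, and the parameter values are arbitrary illustrations.

```python
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.datasets import make_moons

# Two interleaving half-moons: non-convex clusters that k-means cannot separate
X, y = make_moons(n_samples=200, noise=0.05, random_state=0)

# Plain spectral clustering on a kNN affinity graph; the paper additionally
# learns a low-dimensional projection jointly with the spectral embedding
sc = SpectralClustering(n_clusters=2, affinity="nearest_neighbors",
                        n_neighbors=10, random_state=0)
labels = sc.fit_predict(X)

# Agreement with ground truth, up to label permutation
agree = max(np.mean(labels == y), np.mean(labels != y))
print(agree)
```

On this well-separated toy data the agreement is near 1.0, which is the "flexible cluster shapes" property the abstract refers to.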

Introduction to the Issue on Robust Subspace Learning and Tracking: Theory, Algorithms, and Applications

T. Bouwmans, N. Vaswani, P. Rodriguez, R. Vidal, Z. Lin
2018 IEEE Journal on Selected Topics in Signal Processing  
To remedy the high computational complexity of graph-based methods, Yu et al. propose an unsupervised graph-based dimensionality reduction method named Fast and Flexible Large Graph Embedding based on anchors  ...  FFLGE constructs an anchor-based graph, designs the similarity matrix, and then performs the dimensionality reduction efficiently.  ... 
doi:10.1109/jstsp.2018.2879245 fatcat:z3ohqdl37nat3pjo65fzsf2ady

Unsupervised deep learning on biomedical data with BoltzmannMachines.jl [article]

Stefan Lenz, Moritz Hess, Harald Binder
2019 bioRxiv   pre-print
Abstract: Deep Boltzmann machines (DBMs) are models for unsupervised learning in the field of artificial intelligence, promising to be useful for dimensionality reduction and pattern detection in clinical  ...  We present the package "BoltzmannMachines" for the Julia programming language, which makes this model class available for practical use in working with biomedical data. Availability: Notebook with example  ...  We focus on Deep Boltzmann Machines (DBMs) (Srivastava and Salakhutdinov, 2014) because these allow for flexible conditional sampling, and we have already adapted them for training with small sample  ... 
doi:10.1101/578252 fatcat:tjiqa3ttjvaqdgldia67xzek64

Learning With ℓ1-Graph for Image Analysis

Bin Cheng, Jianchao Yang, Shuicheng Yan, Yun Fu, T.S. Huang
2010 IEEE Transactions on Image Processing  
Compared with the conventional k-nearest-neighbor graph and ε-ball graph, the ℓ1-graph possesses the advantages: 1) greater robustness to data noise, 2) automatic sparsity, and 3) adaptive neighborhood for  ...  The graph construction procedure essentially determines the potentials of those graph-oriented learning algorithms for image analysis.  ...  graph is flexible in terms of the selection of similarity/distance measurement, but the optimality is heavily data dependent.  ... 
doi:10.1109/tip.2009.2038764 pmid:20031500 fatcat:lbju2dvonvb2hijji55ueqme6a
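The ℓ1-graph construction this entry describes — code each sample as a sparse combination of the remaining samples and use the coefficients as edge weights — can be sketched with a per-sample Lasso. This is a simplification of the paper's formulation, and the regularization strength below is an arbitrary illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
X = rng.randn(30, 10)  # 30 samples, 10 features
n = X.shape[0]
W = np.zeros((n, n))   # sparse adjacency: W[i, j] = weight of edge i -> j

for i in range(n):
    # Dictionary: all other samples as columns; solve x_i ~ D @ w with L1 penalty
    D = np.delete(X, i, axis=0).T
    lasso = Lasso(alpha=0.1, max_iter=5000)
    lasso.fit(D, X[i])
    # Re-insert a zero at position i so indices line up (no self-edges)
    W[i] = np.abs(np.insert(lasso.coef_, i, 0.0))

# Unlike a k-NN or epsilon-ball graph, the neighborhood size here adapts
# per sample through the sparsity of the Lasso solution
print(float(np.mean(W > 0)))  # fraction of nonzero edges, well below 1.0
```

The sparsity pattern of each row is the "automatic sparsity, adaptive neighborhood" advantage the abstract lists: no neighborhood size k or radius ε has to be fixed in advance.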

Adaptive Hypergraph Learning for Unsupervised Feature Selection

Xiaofeng Zhu, Yonghua Zhu, Shichao Zhang, Rongyao Hu, Wei He
2017 Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence  
Current unsupervised feature selection (UFS) methods learn the similarity matrix using a simple graph that is learned from the original data and is independent of the feature selection process  ...  This verified our conclusion again, i.e., it is necessary to conduct dimensionality reduction on high-dimensional data.  ...  To address the above issues, in this paper, we propose a new unsupervised embedded feature selection method, namely Adaptively Hypergraph Learning for Feature Selection (AHLFS), involving three components  ... 
doi:10.24963/ijcai.2017/501 dblp:conf/ijcai/ZhuZZHH17 fatcat:xha7r2f3erbapairz3qyllgyki
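Graph-based unsupervised feature selection of the kind this entry generalizes can be illustrated with a much simpler criterion than the paper's adaptive hypergraph: the classical Laplacian score over a fixed kNN graph. The sketch below is that simpler baseline, not AHLFS; the data generation and parameters are arbitrary illustrations.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

rng = np.random.RandomState(0)
X = rng.randn(100, 8)
X[:50, 0] += 4.0  # feature 0 carries the cluster structure; the rest are noise

# Fixed kNN graph; the paper instead learns a hypergraph adaptively,
# jointly with the feature selection itself
S = kneighbors_graph(X, n_neighbors=5, mode="connectivity").toarray()
S = np.maximum(S, S.T)   # symmetrize
D = np.diag(S.sum(axis=1))
L = D - S                # unnormalized graph Laplacian

# Laplacian score per feature: lower = varies more smoothly over the graph
scores = []
for j in range(X.shape[1]):
    f = X[:, j] - X[:, j].mean()
    scores.append(float(f @ L @ f) / float(f @ D @ f))

print(int(np.argmin(scores)))  # feature 0 typically ranks best here
```

The paper's point about the graph being learned independently of feature selection is visible in this sketch: the kNN graph is built once from all features, including the noisy ones, and never revisited.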

Adaptive Unsupervised Multi-view Feature Selection for Visual Concept Recognition [chapter]

Yinfu Feng, Jun Xiao, Yueting Zhuang, Xiaoming Liu
2013 Lecture Notes in Computer Science  
Therefore, we propose an unsupervised learning method called Adaptive Unsupervised Multi-view Feature Selection (AUMFS) in this paper.  ...  Then, AUMFS integrates data cluster labels prediction and adaptive multi-view visual similar graph learning into a unified framework.  ...  vector to form the objective function of adaptive multi-view visual similar graph learning, which leverages the correlations between different views and establishes adaptive weights for each view.  ... 
doi:10.1007/978-3-642-37331-2_26 fatcat:4qumx6p6qrhutihtdb4f7q3gbi

Geometry-aware metric learning

Zhengdong Lu, Prateek Jain, Inderjit S. Dhillon
2009 Proceedings of the 26th Annual International Conference on Machine Learning - ICML '09  
We demonstrate the wide applicability and effectiveness of our framework by applying it to various machine learning tasks such as semi-supervised classification, colored dimensionality reduction, manifold alignment  ...  In this paper, we introduce a generic framework for semi-supervised kernel learning.  ...  ., 2007) method which also performs dimensionality reduction for labelled data.  ... 
doi:10.1145/1553374.1553461 dblp:conf/icml/LuJD09 fatcat:5utweswrlvg4pmywbeqwvlij3u

Dimensionality Reduction for Data in Multiple Feature Representations

Yen-Yu Lin, Tyng-Luh Liu, Chiou-Shann Fuh
2008 Neural Information Processing Systems  
It follows that any dimensionality reduction techniques explainable by graph embedding can be generalized by our method to consider data in multiple feature representations.  ...  We describe an approach that incorporates multiple kernel learning with dimensionality reduction (MKL-DR).  ...  Our approach leverages such an MKL optimization to yield more flexible dimensionality reduction schemes for data in different feature representations.  ... 
dblp:conf/nips/LinLF08 fatcat:q3gp3rh2fjdp3cgc3tpvndvvh4
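The MKL-DR idea — one kernel per feature representation, combined before a graph-embedding-style reduction — can be sketched with a fixed kernel combination followed by kernel PCA. MKL-DR learns the combination weights jointly with the projection; the uniform weights below are only a hand-set placeholder, and the data is synthetic.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.decomposition import KernelPCA

rng = np.random.RandomState(0)
# Two "feature representations" of the same 60 samples
X1 = rng.randn(60, 5)
X2 = rng.randn(60, 3)

# Fixed uniform kernel combination; MKL-DR would learn these weights
# jointly with the dimensionality-reducing projection
K = 0.5 * rbf_kernel(X1) + 0.5 * rbf_kernel(X2)

# Any graph-embedding-style reduction can consume the combined kernel;
# kernel PCA stands in here for the paper's generalized scheme
kpca = KernelPCA(n_components=2, kernel="precomputed")
Z = kpca.fit_transform(K)
print(Z.shape)  # (60, 2)
```

Because a convex combination of positive semi-definite kernels is itself a valid kernel, the downstream reduction never needs to know that the samples had multiple representations.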
Showing results 1 — 15 out of 8,094 results