648,186 Hits in 6.6 sec

Random Projection Neural Network Approximation

Peter Andras
2018 2018 International Joint Conference on Neural Networks (IJCNN)  
Neural networks are often used to approximate functions defined over high-dimensional data spaces (e.g. text data, genomic data, multi-sensor data).  ...  this lower dimensional projection data space.  ...  TABLE: Performance comparison of neural networks trained with high-dimensional and low-dimensional data (performance measure: mean squared errors over 20 data sets).  ... 
doi:10.1109/ijcnn.2018.8489215 dblp:conf/ijcnn/Andras18 fatcat:qg2z3emttbdezone4spjmam5xa
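The core idea of this entry — training on a random lower-dimensional projection of the inputs instead of the full high-dimensional data — can be sketched with plain numpy. The data, dimensions, and scaling below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical high-dimensional data: 200 samples in 1000 dimensions.
X = rng.normal(size=(200, 1000))

# Gaussian random projection to k dimensions; scaling by 1/sqrt(k)
# approximately preserves pairwise distances (Johnson-Lindenstrauss).
k = 50
R = rng.normal(size=(1000, k)) / np.sqrt(k)
X_low = X @ R  # project the data into the lower-dimensional space

print(X_low.shape)  # (200, 50)
```

A network trained on `X_low` then works in 50 rather than 1000 input dimensions, at the cost of the distortion introduced by the projection.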


ZHENG Zhijun, PENG Yanbin
2021 International Journal of Engineering Technologies and Management Research  
Aiming at the problem of the "dimension disaster" (curse of dimensionality) in hyperspectral image classification, a dimension-reduction method based on manifold data analysis and sparse subspace projection (MDASSP) is proposed.  ...  The sparse coefficient matrix is established by the new method, and the sparse subspace projection is carried out by an optimization method.  ...  The training set is used to learn the projection matrix of the low-dimensional space. The test set is first projected into the low-dimensional space by this matrix and then classified and identified.  ... 
doi:10.29121/ijetmr.v8.i9.2021.1040 fatcat:ba6j54sypfbjrnebr4v6xj36rm

High-dimensional Bayesian optimization with projections using quantile Gaussian processes

Riccardo Moriconi, K. S. Sesh Kumar, Marc Peter Deisenroth
2019 Optimization Letters  
In this article, we exploit the effective lower dimensionality with axis-aligned projections and optimize on a partitioning of the input space.  ...  Key challenges of Bayesian optimization in high dimensions are both learning the response surface and optimizing an acquisition function.  ...  Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution  ... 
doi:10.1007/s11590-019-01433-w fatcat:6r6fka3gtvc2hh3drxvfu36yoq

A new variant of radial visualization for supervised visualization of high dimensional data

Long Tran Van, Nguyen Dinh Thi
2019 The Transport and Communications Science Journal  
Our method provides an improvement in visualizing cluster structures of high-dimensional data sets on the Radial Visualization.  ...  In this article, we introduce a new variant of Radial Visualization for visualizing high-dimensional data sets, named Arc Radial Visualization.  ...  This approach generates a large number of candidate projections for high-dimensional data. One of the challenging tasks is to find the best projection for discovering the structure of the data.  ... 
doi:10.25073/tcsj.70.3.24 fatcat:ziwh3d5n3jhxbe4p5vzthzww5u
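The classic Radial Visualization that this entry builds on places one anchor per dimension on a circle and maps each point to a convex combination of the anchors, weighted by its feature values. A minimal numpy sketch of the standard RadViz mapping (synthetic nonnegative data; this is not the paper's Arc variant):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical data: 100 samples, 4 nonnegative features in [0, 1).
X = rng.random((100, 4))

# Each dimension gets an anchor on the unit circle; every point is
# mapped to the average of the anchors, weighted by its features.
d = X.shape[1]
angles = 2 * np.pi * np.arange(d) / d
anchors = np.column_stack([np.cos(angles), np.sin(angles)])  # d x 2
P = (X @ anchors) / X.sum(axis=1, keepdims=True)             # 100 x 2

print(P.shape)  # (100, 2)
```

Because each 2-D point is a convex combination of unit vectors, every projected point lies inside the unit circle, which is what makes the layout readable for any number of dimensions.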

Exploring High-Dimensional Structure via Axis-Aligned Decomposition of Linear Projections [article]

Jayaraman J. Thiagarajan, Shusen Liu, Karthikeyan Natesan Ramamurthy, Peer-Timo Bremer
2017 arXiv   pre-print
Two-dimensional embeddings remain the dominant approach to visualize high dimensional data.  ...  Furthermore, we introduce a new approach to discover a diverse set of high quality linear projections and show that in practice the information of k linear projections is often jointly encoded in ∼ k axis  ...  high dimensional data requires a diverse set of views.  ... 
arXiv:1712.07106v2 fatcat:jivmuddapzbizkxfyiirtt3cs4

A general approach for similarity-based linear projections using a genetic algorithm

James A. Mouradian, Bernd Hamann, René Rosenbaum, Pak Chung Wong, David L. Kao, Ming C. Hao, Chaomei Chen, Robert Kosara, Mark A. Livingston, Jinah Park, Ian Roberts
2012 Visualization and Data Analysis 2012  
We propose a general-purpose genetic algorithm to develop linear projections of high-dimensional data sets which preserve a specified quality of the data set as much as possible.  ...  A widely applicable approach to visualizing properties of high-dimensional data is to view the data as a linear projection into two-or three-dimensional space.  ...  ACKNOWLEDGMENTS The authors gratefully acknowledge the support of Deutsche Forschungsgemeinschaft (DFG) for partially funding this research (#RO3755/1-1).  ... 
doi:10.1117/12.909485 dblp:conf/vda/MouradianHR12 fatcat:4gtut4srpbgbpbjecg6y5kvocq

Visualizing Large-scale and High-dimensional Data

Jian Tang, Jingzhou Liu, Ming Zhang, Qiaozhu Mei
2016 Proceedings of the 25th International Conference on World Wide Web - WWW '16  
We study the problem of visualizing large-scale and high-dimensional data in a low-dimensional (typically 2D or 3D) space.  ...  The whole procedure thus easily scales to millions of high-dimensional data points.  ...  Acknowledgments The co-author Ming Zhang is supported by the National Natural Science Foundation of China (NSFC Grant No. 61472006 and 61272343); Qiaozhu Mei is supported by the National Science Foundation  ... 
doi:10.1145/2872427.2883041 dblp:conf/www/TangLZM16 fatcat:rhkeswizuzfsxo3netikf7jdey

Projection Pursuit for Exploratory Supervised Classification

Eun-Kyung Lee, Dianne Cook, Sigbert Klinke, Thomas Lumley
2005 Journal of Computational And Graphical Statistics  
Projection pursuit is a procedure for searching high-dimensional data for interesting low-dimensional projections via the optimization of a criterion function called the projection pursuit index.  ...  In high-dimensional data, one often seeks a few interesting low-dimensional projections that reveal important features of the data.  ...  Examining projections of dimension higher than one is important for the visual inspection of high-dimensional data.  ... 
doi:10.1198/106186005x77702 fatcat:fswubylepzgftbzc3xzvnxx5xi
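Projection pursuit, as this entry describes, optimizes a projection index over candidate directions. A toy sketch with a kurtosis-based index and crude random search — the data, the index, and the search scheme are illustrative assumptions, not the authors' supervised LDA-style indices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: a bimodal direction hidden in 5-D Gaussian noise.
n = 500
X = rng.normal(size=(n, 5))
X[:, 0] = np.where(rng.random(n) < 0.5, -3.0, 3.0) + rng.normal(size=n)

def pp_index(z):
    # A simple projection pursuit index: departure of the projected
    # data's kurtosis from the Gaussian value of 3 (bimodal projections
    # have low kurtosis, heavy-tailed ones have high kurtosis).
    z = (z - z.mean()) / z.std()
    return abs(np.mean(z ** 4) - 3.0)

# Crude optimization of the index by random search over unit directions.
best_dir, best_val = None, -np.inf
for _ in range(2000):
    a = rng.normal(size=5)
    a /= np.linalg.norm(a)
    val = pp_index(X @ a)
    if val > best_val:
        best_dir, best_val = a, val

print(abs(best_dir[0]))  # the bimodal axis should dominate the direction
```

Real projection pursuit replaces the random search with gradient or simulated-annealing optimization and, in the supervised setting of this paper, uses class labels inside the index.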

Orthogonal self-guided similarity preserving projections

Xiaozhao Fang, Yong Xu, Zheng Zhang, Zhihui Lai, Linlin Shen
2015 2015 IEEE International Conference on Image Processing (ICIP)  
coefficients of the projected data are used to encode the similarity structure information.  ...  In this paper, we propose a novel unsupervised dimensionality reduction (DR) method called orthogonal self-guided similarity preserving projections (OSSPP), which seamlessly integrates the procedures of  ...  Table 1 shows the classification results on these data sets, in which #Tr and d denote the optimal number of training samples selected from each subject of the data set and the optimal dimensionality  ... 
doi:10.1109/icip.2015.7350817 dblp:conf/icip/FangXZLS15 fatcat:hryharoxkzeazb34mtis4tv7ja

Energy-constrained discriminant analysis

Scott Philips, Visar Berisha, Andreas Spanias
2009 2009 IEEE International Conference on Acoustics, Speech and Signal Processing  
Linear discriminant analysis (LDA) is a popular analysis technique used to project high-dimensional data into a lower-dimensional space while maximizing class separability.  ...  Dimensionality reduction algorithms have become an indispensable tool for working with high-dimensional data in classification.  ...  Linear discriminant analysis identifies a linear transform that projects high dimensional data into a low dimensional space subject to a classification constraint.  ... 
doi:10.1109/icassp.2009.4960325 dblp:conf/icassp/PhilipsBS09 fatcat:ppbetqmmgfgkxe7pf5mhmd25za

Dual Random Projection for Linear Support Vector Machine

Xi XI, Feng-qin ZHANG, Xiao-qing LI
2017 DEStech Transactions on Computer Science and Engineering  
The Random Projection (RP) method can solve the problem of dimensionality reduction of high-dimensional data quickly and effectively, reducing the computational cost of the related optimization problem  ...  Support Vector Machine (SVM) is a popular machine learning method in the field of data analysis.  ...  of high-dimensional data in order to reduce the computational cost of related optimization problems.  ... 
doi:10.12783/dtcse/smce2017/12422 fatcat:ff4hhyqfcfbp7c2y4usk7gq2ee
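The primal pipeline this entry starts from — randomly project the data, then train a linear SVM in the low-dimensional space — can be sketched with numpy. A simple Pegasos-style subgradient solver stands in for a full SVM library, the dual recovery step of the paper is not shown, and the data and parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data with low effective dimension: 400 samples in
# 500 ambient dimensions, generated from a 5-D latent space.
n, d, k = 400, 500, 20
latent = rng.normal(size=(n, 5))
B = rng.normal(size=(5, d))
X = latent @ B
v = rng.normal(size=5)
y = np.sign(latent @ v)            # labels are linear in the latent space

# Step 1: Gaussian random projection to k dimensions.
R = rng.normal(size=(d, k)) / np.sqrt(k)
Z = X @ R
Z /= Z.std()                        # standardize scale for the solver

# Step 2: linear SVM in the projected space via the Pegasos
# subgradient method (hinge loss + L2 regularization).
lam, w, w_avg = 0.01, np.zeros(k), np.zeros(k)
for t in range(1, 10001):
    i = rng.integers(n)
    eta = 1.0 / (lam * t)
    margin = y[i] * (Z[i] @ w)
    w = (1 - eta * lam) * w + (eta * y[i] * Z[i] if margin < 1.0 else 0.0)
    w_avg += (w - w_avg) / t        # running average of iterates

acc = np.mean(np.sign(Z @ w_avg) == y)
print(acc)
```

Because the labels depend only on a 5-D latent subspace, a 20-D random projection preserves nearly all of the class information, which is the regime in which RP-based SVM training pays off.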

Optimization Method for Crop Growth Characteristics Based on Improved Locality Preserving Projection

Jia Dongyao, Hu Po, Zou Shengxiong
2014 Journal of Applied Mathematics  
Firstly, a preliminary dimensionality reduction of the sample data is constructed by using two-dimensional principal component analysis (2DPCA) to retain the spatial information.  ...  An improved locality preserving projection algorithm is proposed to optimize the extraction of growth characteristics.  ...  After the projection Y = H^T X, the high-dimensional data set X can be described in the new characteristic space as Y = [y_1, y_2, ..., y_N].  ... 
doi:10.1155/2014/809597 fatcat:vedxrvs7q5dmdg65vsxvqiqe5e
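The 2DPCA preprocessing step mentioned in this entry operates directly on matrix-shaped samples, which is how it retains spatial information that vectorized PCA would discard. A minimal numpy sketch on synthetic "images" (sizes and variable names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical stack of 50 "image" matrices of size 20 x 30.
A = rng.normal(size=(50, 20, 30))

# 2DPCA: the image covariance is built from the centered matrices
# themselves, G = mean(C_k^T C_k), keeping each sample's 2-D structure.
mean_img = A.mean(axis=0)
C = A - mean_img
G = np.einsum('kij,kil->jl', C, C) / len(A)  # 30 x 30 image covariance

# Project each image onto the top-q eigenvectors of G.
q = 5
vals, vecs = np.linalg.eigh(G)
V = vecs[:, -q:]               # columns: leading eigenvectors
Y = A @ V                      # 50 projected 20 x 5 feature matrices
print(Y.shape)  # (50, 20, 5)
```

Each sample shrinks from 20 x 30 to 20 x 5 while the row structure is preserved, after which a locality preserving projection (as in the paper) can be applied to the reduced features.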

Bilinear Discriminant Analysis Hashing: A Supervised Hashing Approach for High-Dimensional Data [chapter]

Yanzhen Liu, Xiao Bai, Cheng Yan, Jun Zhou
2017 Lecture Notes in Computer Science  
On the other hand, existing supervised hashing methods cannot work well on high-dimensional datasets, as they consume too much time and memory to index high-dimensional data.  ...  Bilinear projection needs two small matrices rather than one big matrix to project data so that the coding time and memory consumption are drastically reduced.  ...  This work was supported by NSFC project No.61370123 and BNSF project No.4162037.  ... 
doi:10.1007/978-3-319-54193-8_19 fatcat:6tnajwmu7bh27mq34v3bhozw4i

Stochastic discriminant analysis for linear supervised dimension reduction

Mika Juuti, Francesco Corona, Juha Karhunen
2018 Neurocomputing  
We have made experiments with various types of data sets having low, medium, or high dimensions and quite different numbers of samples, and with both sparse and dense data sets.  ...  If there are several classes in the studied data set, the low-dimensional projections computed using our SDA method often provide higher classification accuracies than the compared methods.  ...  The baseline is the classification accuracy in the original high-dimensional data set. λ = 10^{-0.5} ≈ 0.32.  ... 
doi:10.1016/j.neucom.2018.02.064 fatcat:u4bbsquivrdyroboc4wa76wzgm

Randomly Projected Additive Gaussian Processes for Regression [article]

Ian A. Delbridge, David S. Bindel, Andrew Gordon Wilson
2019 arXiv   pre-print
inputs, over a wide range of data sets, even if we are projecting into a single dimension.  ...  Learning a low dimensional projection can help alleviate this curse of dimensionality, but introduces many trainable hyperparameters, which can be cumbersome, especially in the small data regime.  ...  Tests on very high-dimensional data sets Above in Figure 7 are the full set of plots corresponding to tests on the very high-dimensional data sets.  ... 
arXiv:1912.12834v1 fatcat:pya4hugxtfdqfom6d6gu6qa7hm