High-dimensional Similarity Learning via Dual-sparse Random Projection

Dezhong Yao, Peilin Zhao, Tuan-Anh Nguyen Pham, Gao Cong
2018 Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence  
We investigate how to adopt dual random projection for high-dimensional similarity learning.  ...  Those assumptions limit the application of dual random projection to similarity learning.  ...  In this paper, we propose an efficient dual random projection framework for the high-dimensional similarity learning task. Our solution avoids the O(d^3) computational cost.  ...
doi:10.24963/ijcai.2018/417 dblp:conf/ijcai/YaoZPC18 fatcat:z5shbaa4kfgpzoufwpcz3a2kte

Sparse Learning for Large-Scale and High-Dimensional Data: A Randomized Convex-Concave Optimization Approach [chapter]

Lijun Zhang, Tianbao Yang, Rong Jin, Zhi-Hua Zhou
2016 Lecture Notes in Computer Science  
Specifically, the proposed approach combines the strength of random projection with that of sparse learning: it utilizes random projection to reduce the dimensionality, and introduces ℓ1-norm regularization  ...  In this paper, we develop a randomized algorithm and theory for learning a sparse model from large-scale and high-dimensional data, which is usually formulated as an empirical risk minimization problem  ...  Our work is closely related to Dual Random Projection (DRP) [35, 36] and Dual-sparse Regularized Randomized Reduction (DSRR) [34], which also investigate random projection from the perspective of optimization  ...
doi:10.1007/978-3-319-46379-7_6 fatcat:7mnz2mhoxjbk7cjeqhkcixgfeq

Fine-Grained Visual Categorization via Multi-stage Metric Learning [article]

Qi Qian, Rong Jin, Shenghuo Zhu, Yuanqing Lin
2015 arXiv   pre-print
This paper proposes to explicitly address the above two issues via distance metric learning (DML).  ...  To this end, we propose a multi-stage metric learning framework that divides the large-scale, high-dimensional learning problem into a series of simple subproblems, achieving O(d) computational complexity.  ...  Computational Challenge: Dual Random Projection. Now we try to solve the high-dimensional subproblem with the dual random projection technique.  ...
arXiv:1402.0453v2 fatcat:yrlejbqgkvgqrkjkg4oxjxbraa

Recovering the Optimal Solution by Dual Random Projection [article]

Lijun Zhang, Mehrdad Mahdavi, Rong Jin, Tianbao Yang, Shenghuo Zhu
2014 arXiv   pre-print
to the original optimization problem in the high-dimensional space based on the solution learned from the subspace spanned by random projections.  ...  We present a simple algorithm, termed Dual Random Projection, that uses the dual solution of the low-dimensional optimization problem to recover the optimal solution to the original problem.  ...  Random projection is a simple yet powerful dimensionality reduction technique that projects the original high-dimensional data onto a low-dimensional subspace using a random matrix (Kaski  ...
arXiv:1211.3046v4 fatcat:yilsamwdwrhstk43r7zrmp7bbq
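
The recovery idea summarized in this entry can be made concrete in a few lines. The sketch below is a minimal illustration, assuming a ridge-regression objective where both the reduced dual problem and the recovery step have closed forms; the paper treats general loss functions, and all variable names and sizes here are illustrative assumptions.

```python
# Hedged sketch: dual random projection applied to ridge regression,
# where both the reduced problem and the recovery step have closed forms.
# The ridge setting and all sizes (n, d, k, lam) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d, k, lam = 200, 10_000, 256, 1.0          # n samples, d features, k projected dims

X = rng.standard_normal((n, d))                # high-dimensional design matrix
w_true = np.zeros(d); w_true[:20] = 1.0
y = X @ w_true + 0.01 * rng.standard_normal(n)

R = rng.standard_normal((d, k)) / np.sqrt(k)   # random projection matrix
Z = X @ R                                      # low-dimensional data (n x k)

# Dual variables of the reduced ridge problem: alpha = (Z Z^T + lam I)^{-1} y.
alpha = np.linalg.solve(Z @ Z.T + lam * np.eye(n), y)

# Recovery step: map the dual solution back through the *original* data,
# giving an approximation to the optimal high-dimensional weight vector.
w_hat = X.T @ alpha

# Baseline solved directly in the high-dimensional space, for comparison.
w_exact = X.T @ np.linalg.solve(X @ X.T + lam * np.eye(n), y)
print("relative recovery error:", np.linalg.norm(w_hat - w_exact) / np.linalg.norm(w_exact))
```

The point of the construction is that the dual variables are computed from the projected data Z, but the recovery step multiplies them back through the original matrix X, so the recovered solution lives in the full d-dimensional space.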

Fine-grained visual categorization via multi-stage metric learning

Qi Qian, Rong Jin, Shenghuo Zhu, Yuanqing Lin
2015 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
This paper proposes to explicitly address the above two issues via distance metric learning (DML).  ...  To this end, we propose a multi-stage metric learning framework that divides the large-scale, high-dimensional learning problem into a series of simple subproblems, achieving O(d) computational complexity.  ...  Computational Challenge: Dual Random Projection. Now we try to solve the high-dimensional subproblem with the dual random projection technique.  ...
doi:10.1109/cvpr.2015.7298995 dblp:conf/cvpr/QianJZL15 fatcat:vp5xdwp2bfcglaowe6u5kd434q

Multiple-view object recognition in band-limited distributed camera networks

Allen Y. Yang, Subhransu Maji, C. Mario Christoudias, Trevor Darrell, Jitendra Malik, S. Shankar Sastry
2009 2009 Third ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC)  
Such joint sparse patterns can be explicitly exploited to accurately encode the distributed signal via random projection, which is unsupervised and independent of the sensor modality.  ...  In particular, we show that between a network of cameras, high-dimensional SIFT histograms share a joint sparse pattern corresponding to a set of common features in 3-D.  ...  Since the ℓ1-min scheme provides a means to recover the original SIFT histogram in the high-dimensional space, when the dimension of the random projection becomes sufficiently high, the accuracy via PFP surpasses  ...
doi:10.1109/icdsc.2009.5289410 dblp:conf/icdsc/YangMCDMS09 fatcat:bix4xkfofjfd5ccd6pjdk4fhka
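
The encode-by-random-projection, recover-by-ℓ1-minimization pipeline this entry describes can be illustrated with a toy basis-pursuit example. The solver choice (an LP via scipy.optimize.linprog), the dimensions, and the synthetic sparse "histogram" below are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of the random-projection encoding / l1-minimization recovery
# pipeline described in the snippet above. Sizes and the LP-based basis-pursuit
# solver are illustrative choices.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
d, m, s = 1000, 200, 15                      # histogram dim, measurement dim, sparsity

x = np.zeros(d)                              # sparse SIFT-histogram-like signal
x[rng.choice(d, s, replace=False)] = rng.random(s)

A = rng.standard_normal((m, d)) / np.sqrt(m) # random projection (sensing) matrix
y = A @ x                                    # low-dimensional encoding sent over the network

# Basis pursuit: min ||x||_1 s.t. A x = y, written as an LP with x = u - v, u, v >= 0.
c = np.ones(2 * d)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * d), method="highs")
x_hat = res.x[:d] - res.x[d:]

print("recovery error:", np.linalg.norm(x_hat - x))
```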

A Dual Algorithm for Olfactory Computation in the Locust Brain

Sina Tootoonian, Máté Lengyel
2014 Neural Information Processing Systems  
We first propose its computational function as recovery of high-dimensional sparse olfactory signals from a small number of measurements.  ...  sparse recovery.  ...  in the high (N)-dimensional space of KC activities.  ...
dblp:conf/nips/TootoonianL14 fatcat:etzjxkaugbcdtc2faqhnzxfzxq

Compressed Sensing, Sparsity, and Dimensionality in Neuronal Information Processing and Data Analysis

Surya Ganguli, Haim Sompolinsky
2012 Annual Review of Neuroscience  
And second, how do brains themselves process information in their intrinsically high-dimensional patterns of neural activity as well as learn meaningful, generalizable models of the external world from  ...  We review recent mathematical advances that provide ways to combat dimensionality in specific situations. These advances shed light on two dual questions in neuroscience.  ...  A simple mechanism would be to project the dense activity patterns into a larger pool of neurons via random divergent projections and use high spiking thresholds to ensure sparsity of the target activity  ... 
doi:10.1146/annurev-neuro-062111-150410 pmid:22483042 fatcat:epgzf57y4zdejk6oszpdgl26te

Sparse, Dense, and Attentional Representations for Text Retrieval [article]

Yi Luan, Jacob Eisenstein, Kristina Toutanova, Michael Collins
2021 arXiv   pre-print
Building on these insights, we propose a simple neural model that combines the efficiency of dual encoders with some of the expressiveness of more costly attentional architectures, and explore sparse-dense hybrids to capitalize on the precision of sparse retrieval.  ...  Wieting and Kiela (2019) represent sentences as bags of random projections, finding that high-dimensional projections (k = 4096) perform nearly as well as trained encoding models.  ...
arXiv:2005.00181v3 fatcat:m67dpmdwond2vl6naoabeghvai

Sparse, Dense, and Attentional Representations for Text Retrieval

Yi Luan, Jacob Eisenstein, Kristina Toutanova, Michael Collins
2021 Transactions of the Association for Computational Linguistics  
Dual encoders perform retrieval by encoding documents and queries into dense low-dimensional vectors, scoring each document by its inner product with the query.  ...  Building on these insights, we propose a simple neural model that combines the efficiency of dual encoders with some of the expressiveness of more costly attentional architectures, and explore sparse-dense hybrids to capitalize on the precision of sparse retrieval.  ...  Wieting and Kiela (2019) represent sentences as bags of random projections, finding that high-dimensional projections (k = 4096) perform nearly as well as trained encoding models.  ...
doi:10.1162/tacl_a_00369 fatcat:pefq2xpyorg5ply236nmfa4xjq
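
For the dual-encoder scoring and the "bags of random projections" baseline mentioned in both versions of this paper, a minimal sketch is given below: sparse bag-of-words vectors are mapped through a fixed random matrix, and documents are ranked by inner product with the query embedding. The toy corpus, vocabulary handling, and embedding size are assumptions; real dual encoders use trained neural encoders.

```python
# Hedged sketch of inner-product retrieval over "bag of random projections"
# style embeddings: sparse bag-of-words vectors mapped through a fixed random
# matrix, then documents ranked by dot product with the query.
import numpy as np

rng = np.random.default_rng(2)
docs = ["sparse retrieval with inverted indexes",
        "dense dual encoders score by inner product",
        "random projections preserve inner products approximately"]
query = "dual encoders score retrieval by inner product"

vocab = {w: i for i, w in enumerate(sorted({w for t in docs + [query] for w in t.split()}))}
V, k = len(vocab), 64                         # vocabulary size, embedding dim

def bow(text):                                # sparse bag-of-words vector
    v = np.zeros(V)
    for w in text.split():
        v[vocab[w]] += 1.0
    return v

R = rng.standard_normal((V, k)) / np.sqrt(k)  # fixed (untrained) random projection

def encode(text):
    return bow(text) @ R                      # dense low-dimensional embedding

q = encode(query)
scores = [float(encode(d) @ q) for d in docs] # inner-product scoring, as in dual encoders
print(sorted(zip(scores, docs), reverse=True)[0])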

Gradient Boosted Decision Trees for High Dimensional Sparse Output

Si Si, Huan Zhang, S. Sathiya Keerthi, Dhruv Mahajan, Inderjit S. Dhillon, Cho-Jui Hsieh
2017 International Conference on Machine Learning  
In this paper, we study gradient boosted decision trees (GBDT) when the output space is high-dimensional and sparse.  ...  existing methods, while yielding similar performance.  ...  Both random projection and PCA are unsupervised learning approaches, in the sense that they do not use any label information; however, in our problem setting there is rich information in the high-dimensional  ...
dblp:conf/icml/SiZKMDH17 fatcat:c2gswffdbrccbcxypvt4c5glfe
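
The snippet contrasts GBDT for sparse outputs with unsupervised label-space reductions such as random projection and PCA. A hedged sketch of the random-projection variant of that baseline appears below: the high-dimensional sparse label matrix is compressed with a random matrix, boosted regressors are fit to the compressed targets, and predictions are decoded by correlating with the projection rows. Sizes, the number of trees, and the decoding rule are illustrative assumptions, not the paper's own method.

```python
# Hedged sketch of an unsupervised label-space reduction baseline:
# compress a high-dimensional sparse label matrix with a random projection,
# regress onto the compressed targets, decode by correlating with the rows
# of the projection matrix. All names and sizes are illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)
n, d, L, k = 300, 20, 500, 25                 # samples, features, labels, compressed dim

X = rng.standard_normal((n, d))
Y = (rng.random((n, L)) < 0.01).astype(float) # sparse binary label matrix

R = rng.standard_normal((L, k)) / np.sqrt(k)  # label-space random projection
Z = Y @ R                                     # compressed (dense, low-dim) targets

models = [GradientBoostingRegressor(n_estimators=50).fit(X, Z[:, j]) for j in range(k)]

def predict_labels(x, top=5):
    z_hat = np.array([m.predict(x[None, :])[0] for m in models])
    scores = R @ z_hat                        # decode: correlate with projection rows
    return np.argsort(-scores)[:top]          # indices of the most likely labels

print(predict_labels(X[0]))
```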

Working memory inspired hierarchical video decomposition with transformative representations [article]

Binjie Qin, Haohao Mao, Ruipeng Zhang, Yueqi Zhu, Song Ding, Xu Chen
2022 arXiv   pre-print
Then, patch recurrent convolutional LSTM networks with a backprojection module embody unstructured random representations of the control layer in working memory, recurrently projecting spatiotemporally  ...  Video decomposition is very important to extract moving foreground objects from complex backgrounds in computer vision, machine learning, and medical imaging, e.g., extracting moving contrast-filled vessels  ...  The low dimensionality stems from the high correlation existing among the X-ray attenuation coefficients, and self-similarity is common in natural images and means that they contain many similar patches  ... 
arXiv:2204.10105v3 fatcat:ifzpeay2qjfvbaznwruwc4dz5m

Real-Time Compressive Tracking [chapter]

Kaihua Zhang, Lei Zhang, Ming-Hsuan Yang
2012 Lecture Notes in Computer Science  
Our appearance model employs nonadaptive random projections that preserve the structure of the image feature space of objects.  ...  The tracking task is formulated as binary classification via a naive Bayes classifier with online update in the compressed domain.  ...  In this paper, we proposed a simple yet robust tracking algorithm with an appearance model based on non-adaptive random projections that preserve the structure of the original image space  ...
doi:10.1007/978-3-642-33712-3_62 fatcat:u77rbj2swvca7kuww26rjkthay
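
A compact sketch of the ingredients this entry names, a nonadaptive sparse random projection plus a naive Bayes classifier updated online in the compressed domain, is given below. Image-patch feature extraction is mocked with random vectors, and the measurement-matrix sparsity and learning rate are assumed values, so this illustrates the structure rather than reproducing the authors' tracker.

```python
# Hedged sketch: fixed sparse random measurement matrix compresses
# high-dimensional image features; a naive Bayes classifier with diagonal
# Gaussians is updated online in the compressed domain. Features are mocked.
import numpy as np

rng = np.random.default_rng(4)
d, k = 10_000, 50                               # raw feature dim, compressed dim

# Very sparse random projection (Achlioptas-style entries in {-1, 0, +1}).
R = rng.choice([-1.0, 0.0, 1.0], size=(k, d), p=[1/6, 2/3, 1/6]) * np.sqrt(3.0)

mu_pos, sig_pos = np.zeros(k), np.ones(k)       # per-dimension Gaussian parameters
mu_neg, sig_neg = np.zeros(k), np.ones(k)
lr = 0.85                                       # online update rate (assumed value)

def update(params, samples):
    mu, sig = params
    m, s = samples.mean(0), samples.std(0) + 1e-6
    return lr * mu + (1 - lr) * m, lr * sig + (1 - lr) * s

def log_ratio(v):                               # naive Bayes log-likelihood ratio
    def logpdf(x, mu, sig):
        return -0.5 * ((x - mu) / sig) ** 2 - np.log(sig)
    return np.sum(logpdf(v, mu_pos, sig_pos) - logpdf(v, mu_neg, sig_neg))

# One tracking step with mocked patches: compress, update online, then score.
pos = (R @ rng.standard_normal((d, 20))).T      # 20 positive patches, compressed
neg = (R @ rng.standard_normal((d, 40))).T      # 40 background patches, compressed
mu_pos, sig_pos = update((mu_pos, sig_pos), pos)
mu_neg, sig_neg = update((mu_neg, sig_neg), neg)
print("score of first candidate:", log_ratio(pos[0]))
```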

Random Projections for Classification: A Recovery Approach

Lijun Zhang, Mehrdad Mahdavi, Rong Jin, Tianbao Yang, Shenghuo Zhu
2014 IEEE Transactions on Information Theory  
solution to the original high-dimensional optimization problem based on the solution learned after random projection.  ...  We present a simple algorithm, termed dual random projection, which uses the dual solution of the low-dimensional optimization problem to recover the optimal solution to the original problem.  ...  Random projection is a simple yet powerful dimensionality reduction technique that projects the original high-dimensional data onto a low-dimensional subspace using a random matrix [2],  ...
doi:10.1109/tit.2014.2359204 fatcat:w532tu6zrvccfezbxcezqlhslu

Sparse Learning for Large-scale and High-dimensional Data: A Randomized Convex-concave Optimization Approach [article]

Lijun Zhang, Tianbao Yang, Rong Jin, Zhi-Hua Zhou
2016 arXiv   pre-print
Specifically, the proposed approach combines the strength of random projection with that of sparse learning: it utilizes random projection to reduce the dimensionality, and introduces ℓ_1-norm regularization  ...  In this paper, we develop a randomized algorithm and theory for learning a sparse model from large-scale and high-dimensional data, which is usually formulated as an empirical risk minimization problem  ...  Our work is closely related to Dual Random Projection (DRP) (Zhang et al., 2013, 2014) and Dual-sparse Regularized Randomized Reduction (DSRR), which also investigate random projection  ...
arXiv:1511.03766v2 fatcat:ri42qxephzaqzh4jsrrzgpfaly