Random hyperplane projection using derived dimensions

Konstantinos Georgoulas, Yannis Kotidis
2010 Proceedings of the Ninth ACM International Workshop on Data Engineering for Wireless and Mobile Access - MobiDE '10  
In our work we investigate a particular form of LSH, termed Random Hyperplane Projection (RHP). RHP is a data-agnostic model that works for arbitrary data sets.  ...  Our experimental evaluation using several real datasets demonstrates that our proposed scheme outperforms the existing RHP algorithm providing up to three times more accurate similarity computations using  ...  Handling Evolving Data Sets The proposed RHP(n, m) framework makes use of precomputed statistics in order to boost the accuracy of the standard random hyperplane projection scheme.  ... 
doi:10.1145/1850822.1850827 dblp:conf/mobide/GeorgoulasK10 fatcat:ican5c3dbzfynldaddlrduhcce
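
For readers skimming this entry, a minimal sketch of the standard random hyperplane projection (sign-of-a-random-dot-product) hashing that RHP builds on, in Python with NumPy; the derived-dimension refinement proposed in the paper is not reproduced here, and all function names are illustrative.

```python
import numpy as np

def rhp_signatures(X, m, rng=None):
    """Standard random hyperplane projection: m sign bits per vector.

    The angle between two vectors can be estimated from the fraction of
    bits on which their signatures disagree.
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    H = rng.standard_normal((d, m))       # m random hyperplane normals
    return (X @ H >= 0).astype(np.uint8)  # one sign bit per hyperplane

def estimated_cosine_similarity(sig_a, sig_b):
    # Fraction of disagreeing bits -> estimated angle -> cosine.
    hamming = np.mean(sig_a != sig_b)
    return np.cos(np.pi * hamming)

# Example: two correlated vectors.
rng = np.random.default_rng(0)
x = rng.standard_normal(64)
y = x + 0.3 * rng.standard_normal(64)
sigs = rhp_signatures(np.vstack([x, y]), m=2048, rng=1)
print(estimated_cosine_similarity(sigs[0], sigs[1]))
print(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))
```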

Approximate Nearest Subspace Search with Applications to Pattern Recognition

Ronen Basri, Tal Hassner, Lihi Zelnik-Manor
2007 2007 IEEE Conference on Computer Vision and Pattern Recognition  
Further speedup may be achieved by using random projections to lower the dimensionality of the problem.  ...  Linear and affine subspaces are commonly used to describe the appearance of objects under different lighting, viewpoint, articulation, and identity.  ...  Random projections are commonly used in ANN searches whenever d ≫ log n.  ... 
doi:10.1109/cvpr.2007.383201 dblp:conf/cvpr/BasriHZ07 fatcat:yaffuf6edvbzjjkxxwf7lskzmy
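
As a reminder of the generic tool the snippet mentions, here is a plain Gaussian Johnson-Lindenstrauss projection, useful when d ≫ log n; this is an illustrative sketch, not the subspace-to-point reduction developed in the paper, and the constant in the target dimension is arbitrary.

```python
import numpy as np

def jl_project(X, eps=0.25, rng=None):
    """Project n points from d down to k = O(log n / eps^2) dimensions.

    Pairwise Euclidean distances are preserved up to a (1 +/- eps) factor
    with high probability (Johnson-Lindenstrauss lemma).
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    k = int(np.ceil(8 * np.log(n) / eps**2))   # constant is illustrative
    R = rng.standard_normal((d, k)) / np.sqrt(k)
    return X @ R

X = np.random.default_rng(0).standard_normal((1000, 5000))
Y = jl_project(X, eps=0.25, rng=1)
print(X.shape, "->", Y.shape)
```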

Generic LSH Families for the Angular Distance Based on Johnson-Lindenstrauss Projections and Feature Hashing LSH [article]

Luis Argerich, Natalia Golmar
2017 arXiv   pre-print
Our tests using real datasets show that the proposed LSH functions work well for the Euclidean distance.  ...  We show that feature hashing is a valid J-L projection and propose two new LSH families based on feature hashing.  ...  This means FH is used to project from d to m dimensions and then a random rotation is used from m to d dimensions.  ... 
arXiv:1704.04684v1 fatcat:hmi6ggvsvvhjbb26vxhovxzawy
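
A minimal sketch of feature hashing used as a J-L-style projection, as the abstract describes; the bucket and sign hashes below use Python's built-in (per-process salted) hash and are purely illustrative, not the families proposed in the paper.

```python
import numpy as np

def feature_hash(x, m, seed=0):
    """Hash a dense d-dimensional vector down to m dimensions.

    Each input coordinate i is sent to bucket h(i) with a random sign s(i);
    inner products are preserved in expectation, which is what lets the
    hashed vectors act as a J-L-style projection.
    """
    y = np.zeros(m)
    for i, v in enumerate(x):
        bucket = hash((seed, i, "bucket")) % m
        sign = 1.0 if hash((seed, i, "sign")) % 2 == 0 else -1.0
        y[bucket] += sign * v
    return y

# The squared norm is preserved in expectation (approximately for one draw).
x = np.arange(1.0, 101.0)
print(np.dot(x, x), np.dot(feature_hash(x, 64), feature_hash(x, 64)))
```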

Expander-like Codes based on Finite Projective Geometry [article]

Swadesh Choudhary, Hrishikesh Sharma, B. S. Adiga, Sachin Patkar
2012 arXiv   pre-print
The code is based on a bipartite graph derived from the subsumption relations of finite projective geometry, and Reed-Solomon codes as component codes.  ...  By derivation of geometric bounds rather than eigenvalue bounds, it has been proved that for practical values of the code rate, the random error correction capability of our codes is much better than those  ...  This bipartite graph is derived from point-hyperplane incidence relations of projective spaces of higher dimensions than those suggested by [1] .  ... 
arXiv:1209.3460v1 fatcat:g7a4c5idhjaqhfsxqbx3l7xdty
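
To make the point-hyperplane incidence structure concrete, here is a small illustrative construction of the projective plane PG(2, p) for a prime p; the paper's codes use higher-dimensional projective spaces and Reed-Solomon component codes, which are not reproduced here.

```python
import numpy as np
from itertools import product

def normalize(v, p):
    """Canonical representative of a projective point: first nonzero entry = 1."""
    v = [x % p for x in v]
    for x in v:
        if x != 0:
            inv = pow(x, -1, p)   # modular inverse (Python 3.8+)
            return tuple((inv * y) % p for y in v)
    return None

def pg2_points(p):
    return sorted({normalize(v, p) for v in product(range(p), repeat=3) if any(v)})

def pg2_incidence(p):
    """0/1 point-line incidence matrix of PG(2, p), p prime.

    By duality, lines are also indexed by projective triples; point x lies
    on line a iff a . x = 0 (mod p). Each row and column sums to p + 1.
    """
    pts = pg2_points(p)          # also serves as the index set for lines
    n = len(pts)                 # n = p^2 + p + 1
    A = np.zeros((n, n), dtype=int)
    for i, a in enumerate(pts):          # lines
        for j, x in enumerate(pts):      # points
            if sum(ai * xi for ai, xi in zip(a, x)) % p == 0:
                A[i, j] = 1
    return A

A = pg2_incidence(3)
print(A.shape, A.sum(axis=1)[:5])   # (13, 13), each row sums to 4
```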

The Overlay of Minimization Diagrams in a Randomized Incremental Construction

Haim Kaplan, Edgar Ramos, Micha Sharir
2011 Discrete & Computational Geometry  
In a randomized incremental construction of the minimization diagram of a collection of n hyperplanes in R^d, the hyperplanes are inserted one by one, in a random order, and the minimization diagram is  ...  E_H and M_H can be constructed using a standard randomized incremental algorithm [9]. In this approach, we draw a random permutation of H, and insert the hyperplanes of H one by one in this order.  ...  The minimization diagram M_H of H is the projection of E_H onto the hyperplane x_d = 0.  ... 
doi:10.1007/s00454-010-9324-6 fatcat:oe63l7b3uvaevds3e4nszr6fli
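
The following is not the randomized incremental construction analyzed in the paper, only a brute-force illustration (for d = 2, i.e., lines in the plane) of what the minimization diagram M_H is: the pointwise minimum of the hyperplanes, read off along the hyperplane x_d = 0.

```python
import numpy as np

def minimization_diagram_1d(lines, xs):
    """Brute-force lower envelope of lines y = a*x + b over sample points xs.

    Returns, for every sample x, the index of the line attaining the
    pointwise minimum; consecutive runs of equal indices are the cells of
    the minimization diagram projected onto the x-axis (the hyperplane
    x_d = 0 in the paper's notation).
    """
    a = np.array([l[0] for l in lines])
    b = np.array([l[1] for l in lines])
    values = np.outer(xs, a) + b        # shape (len(xs), len(lines))
    return values.argmin(axis=1)

lines = [(1.0, 0.0), (-1.0, 2.0), (0.0, 0.5)]
xs = np.linspace(-2, 4, 13)
print(minimization_diagram_1d(lines, xs))
```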

Maximum Margin Discriminant Projections For Facial Expression Recognition

Symeon Nikitidis, I. Pitas, Anastasios Tefas
2013 Zenodo  
facial images using random projections.  ...  MMDP algorithm directly operates on the random features extracted using an orthogonal Gaussian random projection matrix and derives an optimal projection matrix such that the separating margin between  ... 
doi:10.5281/zenodo.43668 fatcat:a74jtyfpyfbefjexmbijinvzy4
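
A minimal sketch of how an orthogonal Gaussian random projection matrix, of the kind the snippet says MMDP operates on, can be drawn via QR factorization; the MMDP optimization itself is not shown, and the dimensions below are illustrative.

```python
import numpy as np

def orthogonal_gaussian_projection(d, k, rng=None):
    """Draw a d x k projection matrix with orthonormal columns.

    A Gaussian matrix is orthonormalized with a reduced QR factorization,
    so the k projected coordinates correspond to orthogonal directions.
    """
    rng = np.random.default_rng(rng)
    G = rng.standard_normal((d, k))
    Q, _ = np.linalg.qr(G)       # Q has orthonormal columns, shape (d, k)
    return Q

R = orthogonal_gaussian_projection(d=4096, k=128, rng=0)
X = np.random.default_rng(1).standard_normal((10, 4096))  # e.g. vectorized images
print((X @ R).shape)                                       # (10, 128)
print(np.allclose(R.T @ R, np.eye(128)))                   # columns are orthonormal
```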

Solving Linear SVMs with Multiple 1D Projections

Johannes Schneider, Jasmina Bogojeska, Michail Vlachos
2014 Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management - CIKM '14  
Our solution adapts methodologies from random projections, exponential search, and coordinate descent.  ...  We present a new methodology for solving linear Support Vector Machines (SVMs) that capitalizes on multiple 1D projections.  ...  Using Random Projections: In our approach we use multiple 1D random projections.  ... 
doi:10.1145/2661829.2661994 dblp:conf/cikm/SchneiderBV14 fatcat:s3or76ldzvhdzdwt4jwh3khmfy
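
A toy sketch (not the paper's solver, which combines the 1D projections with exponential search and coordinate descent) of the basic idea: project labeled points onto several random 1D directions and keep the best direction/threshold pair.

```python
import numpy as np

def best_1d_projection(X, y, n_projections=50, rng=None):
    """Among random 1D projections, pick the direction + threshold that
    separates the two classes (labels in {-1, +1}) best on training data.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    best = (0.0, None, None)                      # (accuracy, w, threshold)
    for _ in range(n_projections):
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)
        z = X @ w
        for t in z:                               # candidate thresholds
            acc = np.mean(np.sign(z - t + 1e-12) == y)
            acc = max(acc, 1.0 - acc)             # allow flipped orientation
            if acc > best[0]:
                best = (acc, w, t)
    return best

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 5)), rng.normal(2, 1, (50, 5))])
y = np.array([-1] * 50 + [1] * 50)
acc, w, t = best_1d_projection(X, y, rng=1)
print(round(acc, 3))
```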

Learning Kernel Perceptrons on Noisy Data Using Random Projections [chapter]

Guillaume Stempfel, Liva Ralaivola
2007 Lecture Notes in Computer Science  
Our proposed approach relies on the combination of the technique of random or deterministic projections with a classification noise tolerant perceptron learning algorithm that assumes distributions defined  ...  Provided a sufficient separation margin characterizes the problem, this strategy makes it possible to envision the learning from a noisy distribution in any separable Hilbert space, regardless of its dimension  ...  KPCA (almost) always requires a smaller dimension of projection than KGS and random projection.  ... 
doi:10.1007/978-3-540-75225-7_27 fatcat:jlrodqj4i5gzxjjmwnfw5qdo2y
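
A compact sketch of the two plain ingredients, a Gaussian random projection followed by a standard perceptron; the classification-noise-tolerant perceptron variant studied in the chapter is not reproduced.

```python
import numpy as np

def perceptron(X, y, epochs=100):
    """Classic perceptron on labels in {-1, +1}; returns weights and bias."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:
                w += yi * xi
                b += yi
                mistakes += 1
        if mistakes == 0:
            break
    return w, b

# Project to a lower dimension first, then learn in the projected space.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 500)), rng.normal(1, 1, (100, 500))])
y = np.array([-1] * 100 + [1] * 100)
R = rng.standard_normal((500, 40)) / np.sqrt(40)   # Gaussian random projection
w, b = perceptron(X @ R, y)
print(np.mean(np.sign(X @ R @ w + b) == y))        # training accuracy
```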

Polytopes, Lattices, and Spherical Codes for the Nearest Neighbor Problem

Thijs Laarhoven, Emanuela Merelli, Anuj Dawar, Artur Czumaj
2020 International Colloquium on Automata, Languages and Programming  
We study locality-sensitive hash methods for the nearest neighbor problem for the angular distance, focusing on the approach of first projecting down onto a random low-dimensional subspace, and then partitioning  ...  We provide lower bounds based on spherical caps, and we predict that in higher dimensions, larger spherical codes exist which outperform orthoplices in terms of the query exponent, and we argue why using  ...  than using random Gaussian projection matrices.  ... 
doi:10.4230/lipics.icalp.2020.76 dblp:conf/icalp/Laarhoven20 fatcat:pk465duvebaoldcu4nikztjjki
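
A minimal cross-polytope (orthoplex) hash of the kind the paper analyzes: project onto a random low-dimensional subspace and assign each point to the nearest orthoplex vertex, i.e., the signed coordinate of largest magnitude. The spherical-code constructions and bounds from the paper are not reproduced.

```python
import numpy as np

def orthoplex_hash(X, k, rng=None):
    """Cross-polytope LSH for the angular distance.

    Each point is projected onto a random k-dimensional subspace and
    assigned to the nearest vertex of the orthoplex {+/- e_1, ..., +/- e_k},
    i.e. the signed coordinate of largest magnitude after projection.
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    A = rng.standard_normal((d, k))           # random projection matrix
    Z = X @ A
    idx = np.abs(Z).argmax(axis=1)
    sign = np.sign(Z[np.arange(len(Z)), idx])
    return sign.astype(int) * (idx + 1)       # hash values in {+/-1, ..., +/-k}

X = np.random.default_rng(0).standard_normal((5, 128))
print(orthoplex_hash(X, k=8, rng=1))
```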

A Randomized Algorithm for Large Scale Support Vector Learning

Krishnan Kumar, Chiru Bhattacharyya, Ramesh Hariharan
2007 Neural Information Processing Systems  
The key contribution of the paper is to show that, by using ideas from random projections, the minimal number of support vectors required to solve almost separable classification problems, such that the solution  ...  This paper investigates the application of randomized algorithms for large scale SVM learning.  ...  d, using ideas from random projections.  ... 
dblp:conf/nips/KumarBH07 fatcat:adg5pryrejgalek63txgrnw5hu
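
A hedged illustration of the observation the snippet points to, using scikit-learn's SVC as a stand-in solver (an assumption for convenience, not the algorithm from the paper): train on randomly projected data and inspect how many support vectors the solution needs.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Almost-separable two-class data in a high dimension.
X = np.vstack([rng.normal(-1.5, 1, (200, 1000)), rng.normal(1.5, 1, (200, 1000))])
y = np.array([-1] * 200 + [1] * 200)

# Gaussian random projection down to k dimensions before training.
k = 50
R = rng.standard_normal((1000, k)) / np.sqrt(k)

clf = SVC(kernel="linear", C=1.0).fit(X @ R, y)
print("support vectors after projection:", len(clf.support_))
```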

APPROXIMATING CENTER POINTS WITH ITERATIVE RADON POINTS

KENNETH L. CLARKSON, DAVID EPPSTEIN, GARY L. MILLER, CARL STURTIVANT, SHANG-HUA TENG
1996 International journal of computational geometry and applications  
Our algorithm has been used in mesh partitioning methods and can be used in the construction of high breakdown estimators for multivariate datasets in statistics.  ...  Introduction A center point of a set P of n points in R^d is a point c of R^d such that every hyperplane passing through c partitions P into two subsets each of size at most nd/(d + 1) [9, 27].  ...  This algorithm is analyzed first in one dimension, in Section 4, then in general in Section 5. Section 6 discusses the use of random sampling to eliminate dependence on the number of input points.  ... 
doi:10.1142/s021819599600023x fatcat:tp4mz4vz2bf6rf2oujhinqxnfi
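
A small sketch of the building block: the Radon point of d + 2 points, plus a crude iterated-Radon tournament. The grouping and the analysis here are simplified relative to the paper's algorithm and are only illustrative.

```python
import numpy as np

def radon_point(P):
    """Radon point of d + 2 points in R^d.

    Solve for a nontrivial affine dependence (the weights sum to 0 and the
    weighted points sum to 0); the positive-weight and negative-weight
    subsets have intersecting convex hulls, and the common point is returned.
    """
    P = np.asarray(P, dtype=float)           # shape (d + 2, d)
    M = np.vstack([P.T, np.ones(len(P))])    # shape (d + 1, d + 2)
    _, _, Vt = np.linalg.svd(M)
    alpha = Vt[-1]                           # null vector of M
    pos = alpha > 0
    return (alpha[pos] @ P[pos]) / alpha[pos].sum()

def iterated_radon_center(P, rounds=3, rng=None):
    """Crude center-point approximation by iterated Radon points.

    Each round replaces random (d + 2)-point groups by their Radon points.
    """
    rng = np.random.default_rng(rng)
    P = np.asarray(P, dtype=float)
    d = P.shape[1]
    for _ in range(rounds):
        idx = rng.permutation(len(P))
        groups = [idx[i:i + d + 2] for i in range(0, len(idx) - d - 1, d + 2)]
        P = np.array([radon_point(P[g]) for g in groups])
        if len(P) < d + 2:
            break
    return P.mean(axis=0)

pts = np.random.default_rng(0).standard_normal((200, 2))
print(iterated_radon_center(pts, rounds=3, rng=1))
```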

The L1-norm best-fit hyperplane problem

J.P. Brooks, J.H. Dulá
2013 Applied Mathematics Letters  
We formalize an algorithm for solving the L_1-norm best-fit hyperplane problem derived using first principles and geometric insights about L_1 projection and L_1 regression.  ...  This analysis of the L_1-norm best-fit hyperplane problem makes the procedure accessible to applications in areas such as location theory, computer vision, and multivariate statistics.  ...  It is derived and motivated directly from fundamental geometric insights about L_1 projections.  ... 
doi:10.1016/j.aml.2012.03.031 pmid:23024460 pmcid:PMC3459998 fatcat:dna2yuvvlfgondam7bgny4sngy
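
A worked illustration of the L_1 geometry the abstract refers to: the L_1 projection of a point onto a hyperplane can be reached by changing a single coordinate (the one with the largest normal component), and the L_1 distance is the residual divided by the max-norm of the normal. This is a standard fact shown for illustration, not the paper's fitting algorithm.

```python
import numpy as np

def l1_project_to_hyperplane(p, a, b):
    """L1 (taxicab) projection of point p onto the hyperplane {x : a.x = b}.

    A nearest point in the L1 norm is reached by changing only the
    coordinate j with the largest |a_j|; the L1 distance equals
    |a.p - b| / max_j |a_j|.
    """
    p = np.asarray(p, dtype=float)
    a = np.asarray(a, dtype=float)
    j = np.argmax(np.abs(a))
    x = p.copy()
    x[j] -= (a @ p - b) / a[j]
    return x

p = np.array([2.0, 3.0, -1.0])
a = np.array([1.0, -4.0, 2.0])
x = l1_project_to_hyperplane(p, a, b=5.0)
# Point on the hyperplane, L1 distance moved, and the closed-form distance.
print(x, a @ x, np.abs(x - p).sum(), abs(a @ p - 5.0) / np.abs(a).max())
```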

Decomposing arrangements of hyperplanes: VC-dimension, combinatorial dimension, and point location [article]

Esther Ezra, Sariel Har-Peled, Haim Kaplan, Micha Sharir
2017 arXiv   pre-print
Our main application is to point location in an arrangement of $n$ hyperplanes in $\Re^d$, in which we show that the query cost in Meiser's algorithm can be improved if one uses vertical decomposition  ...  We discuss the tradeoff between query cost and storage (in both approaches, the one using bottom-vertex triangulation and the one using vertical decomposition).  ...  Instead, we use the smaller combinatorial dimension, via the Clarkson-Shor analysis technique. See the table in Figure 5.5 for a summary of the various bounds, as derived earlier in this paper.  ... 
arXiv:1712.02913v1 fatcat:rfd5esufrbftfnpxf4l4s22bb4
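
Not Meiser's algorithm nor the vertical-decomposition variant studied in the paper, just a brute-force reminder of what a point-location query in a hyperplane arrangement returns: the sign vector (face) of the query point.

```python
import numpy as np

def locate(q, A, b, tol=1e-12):
    """Brute-force point location in an arrangement of hyperplanes a_i.x = b_i.

    Returns the sign vector of q (+1 / 0 / -1 per hyperplane), which
    identifies the face of the arrangement containing q. Meiser-style
    algorithms answer the same query with far fewer hyperplane tests by
    recursing on a decomposition of the arrangement.
    """
    r = A @ q - b
    return np.where(r > tol, 1, np.where(r < -tol, -1, 0))

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))      # 6 hyperplanes in R^3
b = rng.standard_normal(6)
print(locate(rng.standard_normal(3), A, b))
```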

Maximum Margin Projection Subspace Learning for Visual Data Analysis

Symeon Nikitidis, Anastasios Tefas, Ioannis Pitas
2014 IEEE Transactions on Image Processing  
The proposed method is an iterative alternate optimization algorithm that computes the maximum margin projections exploiting the separating hyperplanes obtained from training a support vector machine classifier  ...  The proposed method, called maximum margin projection pursuit, aims to identify a low dimensional projection subspace, where samples form classes that are better discriminated, i.e., are separated with  ...  The 48,000-dimensional feature vectors resulting from each facial image were used to train MMPP and obtain the projection matrix R of dimensions 100 × 48,000.  ... 
doi:10.1109/tip.2014.2348868 pmid:25148664 fatcat:vtmo3xxupjbjtdxivxhuwazp3y
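
A much-simplified sketch of the idea in the snippet, using scikit-learn's LinearSVC as a stand-in (an assumption, not the paper's solver): stack separating-hyperplane normals into a projection matrix R. The alternating maximum-margin optimization of MMPP is not reproduced.

```python
import numpy as np
from sklearn.svm import LinearSVC

def svm_projection_matrix(X, y, C=1.0):
    """Stack one-vs-rest SVM hyperplane normals into a projection matrix.

    Only illustrates "projections from separating hyperplanes"; MMPP
    alternates between updating the projection and retraining the SVM.
    """
    clf = LinearSVC(C=C, dual=False).fit(X, y)
    R = clf.coef_                     # shape (n_classes, d) for multi-class
    return R / np.linalg.norm(R, axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 1.0, (40, 200)) for m in (-2, 0, 2)])
y = np.repeat([0, 1, 2], 40)
R = svm_projection_matrix(X, y)
print(R.shape, (X @ R.T).shape)       # (3, 200) and (120, 3)
```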

Intrinsic Volumes of Polyhedral Cones: A Combinatorial Perspective

Dennis Amelunxen, Martin Lotz
2017 Discrete & Computational Geometry  
Direct derivations of the general Steiner formula, the conic analogues of the Brianchon-Gram-Euler and the Gauss-Bonnet relations, and the principal kinematic formula are given.  ...  In addition, a connection between the characteristic polynomial of a hyperplane arrangement and the intrinsic volumes of the regions of the arrangement, due to Klivans and Swartz, is generalized and some  ...  The work in this paper was partially supported by a grant from the Research Grants Council of the Hong Kong Special Administrative Region, China (Project No. CityU 21203315).  ... 
doi:10.1007/s00454-017-9904-9 fatcat:ye2v4s4qhrg3bjbka7seh63txy
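
A small worked example, standard material rather than anything taken from the paper, of the quantities involved: the conic intrinsic volumes of the nonnegative orthant and the conic Gauss-Bonnet relation they satisfy.

```latex
% Conic intrinsic volumes of the nonnegative orthant C = R_{\ge 0}^d:
% a standard Gaussian vector projects onto a k-dimensional face of the
% orthant exactly when k of its coordinates are positive, hence
\[
  v_k\bigl(\mathbb{R}_{\ge 0}^{\,d}\bigr) \;=\; \binom{d}{k}\, 2^{-d},
  \qquad k = 0, 1, \dots, d .
\]
% The conic Gauss--Bonnet relation for a cone that is not a linear subspace,
\[
  \sum_{k\ \mathrm{even}} v_k(C) \;=\; \sum_{k\ \mathrm{odd}} v_k(C) \;=\; \tfrac{1}{2},
\]
% follows here from $\sum_{k\ \mathrm{even}} \binom{d}{k} = 2^{d-1}$.
```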