21,999 Hits in 3.8 sec

Learning Hash Functions Using Column Generation [article]

Xi Li and Guosheng Lin and Chunhua Shen and Anton van den Hengel and Anthony Dick
2013 arXiv   pre-print
The learning procedure is implemented using column generation and hence is named CGHash. At each iteration of the column generation procedure, the best hash function is selected.  ...  In this work, we propose a column generation based method for learning data-dependent hash functions on the basis of proximity comparison information.  ...  We show that column generation can be used to iteratively find the optimal hash functions.  ... 
arXiv:1303.0339v1 fatcat:wgeivpdqfbhzlalg5ugzv4rmqm
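As a rough illustration of the iterative selection described in this abstract, the sketch below adds one hash function ("column") per round and re-weights the triplet constraints. It is a schematic stand-in only: the candidate pool of random hyperplanes, the helper names, and the exponential re-weighting are assumptions for illustration, whereas CGHash itself obtains the new column and the weights from an LP dual.

import numpy as np

def cg_hash_sketch(X, triplets, n_bits=16, pool_size=200, seed=0):
    # X: (n, d) data matrix; triplets: list of (anchor, positive, negative) index tuples.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    u = np.ones(len(triplets)) / len(triplets)      # weights on the triplet constraints
    hyperplanes = []
    for _ in range(n_bits):
        pool = rng.normal(size=(pool_size, d))      # candidate hash functions ("columns")
        B = np.sign(X @ pool.T)                     # candidate bits for every point
        # weighted score: the new bit should separate anchor/negative more than anchor/positive
        scores = np.zeros(pool_size)
        for w, (a, p, ng) in zip(u, triplets):
            scores += w * (np.abs(B[a] - B[ng]) - np.abs(B[a] - B[p]))
        hyperplanes.append(pool[np.argmax(scores)])
        # re-weight triplets by how badly the current code violates them (exponential loss)
        H = np.sign(X @ np.array(hyperplanes).T)
        margins = np.array([np.sum(np.abs(H[a] - H[ng]) - np.abs(H[a] - H[p]))
                            for a, p, ng in triplets])
        u = np.exp(-margins)
        u /= u.sum()
    return np.array(hyperplanes)                    # rows are the learnt hash hyperplanes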

Structured Learning of Binary Codes with Column Generation [article]

Guosheng Lin, Fayao Liu, Chunhua Shen, Jianxin Wu, Heng Tao Shen
2016 arXiv   pre-print
Our method iteratively learns the best hash functions during the column generation procedure.  ...  Hashing has proven a valuable tool for large-scale information retrieval. We propose a column generation based binary code learning framework for data-dependent hash function learning.  ...  Here we exploit the column generation technique for hash function learning.  ...
arXiv:1602.06654v1 fatcat:n5e3xtu4p5g23jplrm4jmrs54i

Optimizing Ranking Measures for Compact Binary Code Learning [article]

Guosheng Lin, Chunhua Shen, Jianxin Wu
2014 arXiv   pre-print
To solve the StructHash optimization problem, we use a combination of column generation and cutting-plane techniques.  ...  Despite much success, existing hashing methods optimize over simple objectives such as the reconstruction error or graph Laplacian related loss functions, instead of the performance evaluation criteria  ...  We are also inspired by the recent column generation based hashing method, column generation hashing (CGH) [12], which iteratively learns hash functions using column generation.  ...
arXiv:1407.1151v1 fatcat:4r6jgnmz7jabnaz77gsjvtxrui

Optimizing Ranking Measures for Compact Binary Code Learning [chapter]

Guosheng Lin, Chunhua Shen, Jianxin Wu
2014 Lecture Notes in Computer Science  
To solve the StructHash optimization problem, we use a combination of column generation and cutting-plane techniques.  ...  Despite much success, existing hashing methods optimize over simple objectives such as the reconstruction error or graph Laplacian related loss functions, instead of the performance evaluation criteria  ...  Extending the approach to more sophisticated hash functions, such as kernel functions or decision trees, is left as future work.  ...
doi:10.1007/978-3-319-10578-9_40 fatcat:w64ikcjarjfo5micpazhu35wee

Maximum Variance Hashing via Column Generation

Lei Luo, Chao Zhang, Yongrui Qin, Chunyuan Zhang
2013 Mathematical Problems in Engineering  
To solve the derived optimization problem, we propose a column generation algorithm, which directly learns the binary-valued hash functions.  ...  Recently, a number of data-dependent methods have been developed, reflecting the great potential of learning for hashing.  ...  Output: The learnt hash functions H, their weights w and the binary codes Y. Algorithm 1: MVH-CG: Column generation for maximum variance hashing.  ... 
doi:10.1155/2013/379718 fatcat:ztdc6jjmfnbtjpy2ftrw3cprky
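A toy rendering of the maximum-variance idea only, not Algorithm 1 (MVH-CG) from the paper: greedily keep, for each bit, the candidate hyperplane whose binary output has the largest empirical variance. The random candidate pool and the absence of any bit-decorrelation step are simplifying assumptions; the actual method handles both through its column generation formulation.

import numpy as np

def max_variance_bits_sketch(X, n_bits=8, pool_size=500, seed=0):
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)                         # centre the data
    chosen = []
    for _ in range(n_bits):
        pool = rng.normal(size=(pool_size, X.shape[1]))
        bits = np.sign(Xc @ pool.T)                 # (n, pool_size) candidate bits
        variances = bits.var(axis=0)                # empirical variance of each candidate bit
        chosen.append(pool[np.argmax(variances)])   # keep the highest-variance bit
    W = np.array(chosen)
    return np.sign(Xc @ W.T), W                     # binary codes Y and hyperplanes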

Image retrieval with query-adaptive hashing

Dong Liu, Shuicheng Yan, Rong-Rong Ji, Xian-Sheng Hua, Hong-Jiang Zhang
2013 ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP)  
The existing semantic-preserving hashing methods leverage the labeled data to learn a fixed set of semantic-aware hash functions.  ...  At query time, we further use the sparse representation procedure to select the most appropriate hash function subset that is informative to the semantic information conveyed by the query.  ...  The work in Shakhnarovich et al. [2007] used a stacked Restricted Boltzmann Machine (RBM) to learn the hash functions in a supervised manner, and was shown to generate compact binary codes with semantic information  ...
doi:10.1145/2422956.2422958 fatcat:h3tid3s2bjeglcdwdpyalrzubi
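A schematic take on the query-time selection step described above: represent the query sparsely over a dictionary whose columns describe the candidate hash functions, then keep only the functions that receive non-zero weight. The dictionary D (a hypothetical per-function "semantic signature"), the tiny ISTA solver, and the threshold are all illustrative assumptions rather than the paper's exact sparse-representation formulation.

import numpy as np

def ista_lasso(D, q, lam=0.1, n_iter=200):
    # Minimise 0.5 * ||q - D a||^2 + lam * ||a||_1 by iterative soft-thresholding.
    L = np.linalg.norm(D, 2) ** 2                   # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - q)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return a

def select_hash_functions(D, query_feature, lam=0.1):
    # D: (d, m) matrix whose m columns describe the m candidate hash functions.
    weights = ista_lasso(D, query_feature, lam)
    return np.nonzero(np.abs(weights) > 1e-8)[0], weights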

Hadamard Codebook Based Deep Hashing [article]

Shen Chen, Liujuan Cao, Mingbao Lin, Yan Wang, Xiaoshuai Sun, Chenglin Wu, Jingfei Qiu, Rongrong Ji
2019 arXiv   pre-print
of the hash functions learning.  ...  Besides, the proposed HCDH further exploits the supervised labels by constructing a classifier on top of the outputs of hash functions.  ...  Hence, by eliminating the first row or the first column, the Hadamard matrix can be used as an efficient codebook for learning hash codes (referred to as the Hadamard codebook).  ...
arXiv:1910.09182v1 fatcat:cmfuvwbmjzetpnmiyuvcnolscy
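The codebook construction mentioned in the last excerpt is easy to reproduce: rows of a Sylvester Hadamard matrix are mutually orthogonal ±1 vectors, so after dropping the constant first column they serve as well-separated target codes, one per class. A minimal sketch, assuming a simple first-come class-to-row assignment (the paper's own assignment and training loss are not reproduced here):

import numpy as np
from scipy.linalg import hadamard

def hadamard_codebook(n_classes, code_length):
    # round code_length + 1 up to a power of two, as required by the construction
    size = 1 << int(np.ceil(np.log2(code_length + 1)))
    H = hadamard(size)                  # entries in {-1, +1}, rows mutually orthogonal
    codebook = H[:, 1:code_length + 1]  # drop the constant first column
    assert n_classes <= size, "need at least as many Hadamard rows as classes"
    return codebook[:n_classes]         # one target code per class

# Usage sketch: target codes for a 10-class problem with 32-bit codes
targets = hadamard_codebook(10, 32)     # shape (10, 32), usable as regression targets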

Supervised Binary Hash Code Learning with Jensen Shannon Divergence

Lixin Fan
2013 2013 IEEE International Conference on Computer Vision  
This paper proposes to learn binary hash codes within a statistical learning framework, in which an upper bound of the probability of Bayes decision errors is derived for different forms of hash functions  ...  Consequently, minimizing such an upper bound leads to consistent performance improvements of existing hash code learning algorithms, regardless of whether the original algorithms are unsupervised or supervised  ...  This definition of B-subsets is generic and accommodates different families of hash functions such as linear transforms, kernelized, or more complex hash functions used e.g. in [5, 26, 12].  ...
doi:10.1109/iccv.2013.325 dblp:conf/iccv/Fan13 fatcat:egy57taxbfby5hvqznnbxsiuha

Deep Cross-Modal Hashing with Hashing Functions and Unified Hash Codes Jointly Learning [article]

Rong-Cheng Tu, Xian-Ling Mao, Bing Ma, Yong Hu, Tan Yan, Wei Wei and Heyan Huang
2019 arXiv   pre-print
With the iterative optimization algorithm, the learned unified hash codes can be used to guide the hashing function learning procedure; meanwhile, the learned hashing functions can feed back to guide the  ...  Generally, compared with shallow cross-modal hashing methods, deep cross-modal hashing methods can achieve more satisfactory performance by integrating feature learning and hash code optimization into  ...  Moreover, in order to make the learned hash codes of instances in the database and the hash codes of query data-points generated by the learned hashing functions preserve the semantic similarity, one  ...
arXiv:1907.12490v1 fatcat:yikemhnovjcfrlxkg5w5mrq2zm
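The alternation this abstract alludes to can be pictured with linear stand-ins for the two deep networks (purely illustrative; the paper trains deep models under a different objective): fix the unified codes and fit each modality's hashing function to them, then refresh the codes from both modalities' outputs.

import numpy as np

def joint_codes_sketch(X_img, X_txt, n_bits=16, n_rounds=10, seed=0):
    # X_img: (n, d_img), X_txt: (n, d_txt) paired features for the same n instances.
    rng = np.random.default_rng(seed)
    B = np.sign(rng.normal(size=(X_img.shape[0], n_bits)))   # unified hash codes
    for _ in range(n_rounds):
        # fix B, fit one linear "hashing function" per modality by least squares
        W_img, _, _, _ = np.linalg.lstsq(X_img, B, rcond=None)
        W_txt, _, _, _ = np.linalg.lstsq(X_txt, B, rcond=None)
        # fix the functions, refresh the unified codes from both modalities' outputs
        B = np.sign(X_img @ W_img + X_txt @ W_txt)
        B[B == 0] = 1
    return B, W_img, W_txt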

Augmented hashing for semi-supervised scenarios

Zalán Bodó, Lehel Csató
2014 The European Symposium on Artificial Neural Networks  
In this paper we propose a generic procedure to extend unsupervised codeword generators using error-correcting codes and semi-supervised classifiers.  ...  Embedding the points into the Hamming space is an important question of the hashing process. Analogously to machine learning, there exist unsupervised, supervised and semi-supervised hashing methods.  ...  Linear spectral hashing and Laplacian regularized least squares: this section briefly presents linear spectral hashing for hash codeword generation and two linear semi-supervised learning methods.  ...
dblp:conf/esann/BodoC14 fatcat:6wd7uhsqrbesnmxjhnooaokn6m

Asymmetric Deep Supervised Hashing [article]

Qing-Yuan Jiang, Wu-Jun Li
2017 arXiv   pre-print
More specifically, ADSH learns a deep hash function only for query points, while the hash codes for database points are directly learned.  ...  However, most existing deep supervised hashing methods adopt a symmetric strategy to learn one deep hash function for both query points and database (retrieval) points.  ...  which use the learned hash function to generate hash codes for database points  ... 
arXiv:1707.08325v1 fatcat:l7yfbacxsjdrrdxgad4xkx37h4
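A toy, linear rendition of the asymmetry described above. The real ADSH trains a deep network for the query side and uses a per-bit closed-form update for the database codes; everything below (the ridge-regression query function, the heuristic sign updates, the parameter names) is a simplified stand-in meant only to show that database codes are learned directly while only queries get a hash function.

import numpy as np

def adsh_style_sketch(X_query, S, n_db, n_bits=12, n_rounds=15, seed=0):
    # X_query: (m, d) query features; S: (m, n_db) supervised similarity in {-1, +1}.
    rng = np.random.default_rng(seed)
    V = np.sign(rng.normal(size=(n_db, n_bits)))              # database codes, learned directly
    W = np.zeros((X_query.shape[1], n_bits))
    for _ in range(n_rounds):
        # query codes should align with S V; fit the query-side function by ridge regression
        U_target = np.sign(S @ V)
        U_target[U_target == 0] = 1
        W = np.linalg.solve(X_query.T @ X_query + 1e-3 * np.eye(X_query.shape[1]),
                            X_query.T @ U_target)
        U = np.tanh(X_query @ W)                              # relaxed query codes
        # with U fixed, a heuristic update pushes V toward agreement with S and U
        V = np.sign(S.T @ U)
        V[V == 0] = 1
    return W, V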

Scalable Discrete Supervised Hash Learning with Asymmetric Matrix Factorization [article]

Shifeng Zhang, Jianmin Li, Jinma Guo, Bo Zhang
2016 arXiv   pre-print
However, existing limitations make it difficult for present algorithms to deal with large-scale datasets: (1) discrete constraints are involved in the learning of the hash function; (2) pairwise or triplet  ...  The proposed framework also provides a flexible paradigm that incorporates arbitrary hash functions, including deep neural networks and kernel methods.  ...  Discrete Hashing Methods: the goal of hash learning is to learn certain hash functions from given training data, and the hash codes are generated by the learned hash functions.  ...
arXiv:1609.08740v1 fatcat:dpbs4eayefarxpqwjxkaxiiaf4

Sparse hashing for fast multimedia search

Xiaofeng Zhu, Zi Huang, Hong Cheng, Jiangtao Cui, Heng Tao Shen
2013 ACM Transactions on Information Systems  
To efficiently and effectively encode unseen data, SH learns hash functions by taking a priori knowledge into account, such as the implicit group effect of the features in training data, and the correlations between the original space and the learned Hamming space.  ...  SH SVM implements Algorithm 1 and uses an SVM for learning hash functions.  ...
doi:10.1145/2457465.2457469 fatcat:sawxwotnyzg2xeh7pu74gy4z6q
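The "SVM for learning hash functions" variant mentioned in the excerpt can be pictured as one binary classifier per bit: given binary target codes for the training points, fit a linear SVM per bit and hash new data with the sign of its decision value. A small sketch with scikit-learn; the target codes here are just thresholded PCA projections, an assumption for illustration rather than the paper's Algorithm 1.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import LinearSVC

def train_svm_hash(X, n_bits=8):
    # stand-in target codes: signs of the top PCA projections of the training data
    Z = PCA(n_components=n_bits).fit_transform(X)
    targets = (Z > 0).astype(int)
    return [LinearSVC(C=1.0, dual=False).fit(X, targets[:, b]) for b in range(n_bits)]

def hash_with_svms(models, X_new):
    # one bit per SVM: the sign of its decision function
    return np.stack([(m.decision_function(X_new) > 0).astype(np.uint8) for m in models],
                    axis=1)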

Asymmetric Deep Semantic Quantization for Image Retrieval [article]

Zhan Yang, Osolo Ian Raymond, WuQing Sun, Jun Long
2019 arXiv   pre-print
The two ImgNets are used to generate discriminative compact hash codes.  ...  By using deep learning based techniques, hashing can outperform non-learning based hashing in many applications.  ...  In the future, we will use two asymmetric networks with different structures to generate high-quality hash codes. Figure 1: The basic architecture of supervised learning based hashing.  ...
arXiv:1903.12493v1 fatcat:lgeeqweezjeabhfp35ibhetfte

Hashing with Generalized Nyström Approximation

Jeong-Min Yun, Saehoon Kim, Seungjin Choi
2012 2012 IEEE 12th International Conference on Data Mining  
In this paper we address the use of the generalized Nyström method, where a subset of rows and columns is used to approximately compute leading singular vectors of the data matrix, in order to improve the  ...  In particular, we validate the useful behavior of generalized Nyström approximation with uniform sampling, in the case of a recently developed hashing method based on principal component analysis (PCA) followed  ...  Semi-supervised learning, which uses both plenty of unlabeled examples and a small number of labeled examples, has also been applied to hashing [5], [6], where the hash function is mainly learned from labeled  ...
doi:10.1109/icdm.2012.22 dblp:conf/icdm/YunKC12 fatcat:ewio6peoyzfwfawlezptadboa4
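The row-and-column sampling idea reads roughly as follows in code (uniform sampling, pure NumPy; the function name and the final binarisation step are illustrative assumptions rather than the paper's exact pipeline): approximate the data matrix from a few sampled rows and columns, recover approximate leading singular vectors from the small factors, and threshold the projections as in PCA hashing.

import numpy as np

def nystrom_pca_hash_sketch(X, n_bits=16, n_samples=64, seed=0):
    # X: (n, d) centred data matrix; uniform sampling of rows and columns.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    rows = rng.choice(n, size=min(n_samples, n), replace=False)
    cols = rng.choice(d, size=min(n_samples, d), replace=False)
    C = X[:, cols]                       # sampled columns, (n, c)
    R = X[rows, :]                       # sampled rows, (r, d)
    W = X[np.ix_(rows, cols)]            # intersection block, (r, c)
    L = C @ np.linalg.pinv(W)            # X is approximated by L @ R (generalized Nyström)
    Q, T = np.linalg.qr(L)               # orthonormalise the tall factor
    _, _, Vt = np.linalg.svd(T @ R, full_matrices=False)
    V_k = Vt[:n_bits]                    # approximate top right singular vectors of X
    return (X @ V_k.T > 0).astype(np.uint8)   # PCA-hashing-style binary codes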
Showing results 1 — 15 out of 21,999 results