Online Adaptive Supervised Hashing for Large-Scale Cross-Modal Retrieval

Ruoqi Su, Di Wang, Zhen Huang, Yuan Liu, Yaqiang An
IEEE Access, 2020
In recent years, with the continuous growth of multimedia data on the Internet, multimodal hashing has attracted increasing attention for its efficiency in large-scale cross-modal retrieval. However, most existing multimodal hashing methods are batch-based and cannot handle ever-growing streaming data. Online multimodal hashing adopts an online learning strategy to learn hash models incrementally, enabling it to process large-scale streaming data. Still, existing supervised online multimodal hashing methods suffer from several limitations: (1) most methods cannot update the hash codes of old data when the hash model changes, which degrades retrieval accuracy; (2) the discrete optimization schemes most methods use to learn binary hash codes are either inefficient or ineffective. To address these limitations, this paper proposes an efficient supervised online multimodal hashing method termed Online Adaptive Supervised Hashing (OASH). OASH regresses both class labels and training data to binary hash codes, learning discrete hash codes and hash functions with an efficient online optimization scheme. It learns hash functions incrementally from newly arriving data and updates hash codes with the latest hash model. By adaptively updating hash functions and hash codes with new data, rather than revisiting old data, it achieves substantial gains in accuracy and computational cost. Extensive experiments on three benchmark data sets validate the superiority of OASH over state-of-the-art methods in both accuracy and efficiency.

INDEX TERMS Multimodal hashing, cross-modal retrieval, online learning.
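The online-learning idea the abstract describes — accumulating statistics from streaming chunks, refitting a regression-based hash projection, and re-deriving binary codes from the latest model without revisiting old feature vectors — can be illustrated with a toy sketch. This is not the OASH algorithm itself; it is a minimal single-modality ridge-regression hashing example, and the names (`label_proj`, `update`, `hash_codes`) and the random label embedding are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits, dim, n_classes, lam = 16, 32, 5, 1.0

# Hypothetical fixed label embedding: maps one-hot labels to real-valued
# regression targets for the binary codes (an assumption for this sketch).
label_proj = rng.standard_normal((n_classes, n_bits))

# Accumulated sufficient statistics; no old feature vectors are stored.
A = lam * np.eye(dim)        # running X^T X + lam * I
C = np.zeros((dim, n_bits))  # running X^T (Y @ label_proj)

def update(X_t, Y_t):
    """Fold a new streaming chunk into the statistics, refit the projection."""
    global A, C
    A += X_t.T @ X_t
    C += X_t.T @ (Y_t @ label_proj)
    return np.linalg.solve(A, C)  # ridge-regression hash projection W

def hash_codes(X, W):
    """Binary codes in {-1, +1} via the sign of the linear projection."""
    return np.where(X @ W >= 0, 1, -1)

# Stream two chunks; W is refreshed after each arrival, so codes of any
# data (old or new) can be recomputed with the latest model.
X1 = rng.standard_normal((100, dim))
Y1 = np.eye(n_classes)[rng.integers(0, n_classes, 100)]
X2 = rng.standard_normal((80, dim))
Y2 = np.eye(n_classes)[rng.integers(0, n_classes, 80)]

W = update(X1, Y1)
W = update(X2, Y2)
codes = hash_codes(X2, W)
```

Because only the matrices `A` and `C` grow with the stream, each refit costs the same regardless of how much data has arrived, which is the efficiency argument online hashing methods rely on; the two-chunk online solution here coincides exactly with the batch solution over all data.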
doi:10.1109/ACCESS.2020.3037968