1,290 Hits in 3.1 sec

Unsupervised similarity learning through Cartesian product of ranking references

Lucas Pascotti Valem, Daniel Carlos Guimarães Pedronette, Jurandy Almeida
2017 Pattern Recognition Letters  
A novel method, called Cartesian Product of Ranking References (CPRR), is proposed with this objective in this paper.  ...  In this scenario, similarity learning approaches capable of improving the effectiveness of retrieval in an unsupervised way are indispensable.  ...  In this paper, we present a novel unsupervised similarity learning method for improving the effectiveness of multimedia retrieval tasks, named the Cartesian product of ranking references (CPRR).  ... 
doi:10.1016/j.patrec.2017.10.013 fatcat:q2gbx7ofgvatrjyxriwtfz7t4y
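The snippet above only names the idea, so the sketch below is a hypothetical simplification of it: accumulate co-occurrence evidence over the Cartesian product of each item's top-k reference set and re-rank by the accumulated scores. It is not the authors' exact CPRR formulation; `rank_lists`, `k`, and the counting rule are illustrative assumptions.

```python
import numpy as np

def rerank_by_reference_cooccurrence(rank_lists, k=5):
    """Hypothetical sketch: accumulate similarity evidence from the
    Cartesian product of each item's top-k reference set, then re-rank.
    Illustrates the general idea only; see the CPRR paper for the
    actual formulation."""
    n = len(rank_lists)
    score = np.zeros((n, n))
    for i in range(n):
        refs = rank_lists[i][:k]          # top-k reference set of item i
        for a in refs:                    # Cartesian product refs x refs
            for b in refs:
                score[a, b] += 1.0        # co-occurrence evidence
    # new rank list for each item: sort all items by accumulated score
    return [list(np.argsort(-score[i])) for i in range(n)]
```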

Probabilistic Inference of Biological Networks via Data Integration

Mark F. Rogers, Colin Campbell, Yiming Ying
2015 BioMed Research International  
We use pairwise kernels to predict novel links, along with multiple kernel learning to integrate distinct sources of data into a decision function.  ...  Here we consider supervised interactive network inference in which a reference set of known network links and nonlinks is used to train a classifier for predicting new links.  ...  Conflict of Interests The authors declare that there is no conflict of interests regarding the publication of this paper.  ... 
doi:10.1155/2015/707453 pmid:25874225 pmcid:PMC4385617 fatcat:cqkor2bl5rcjxfqnsabbqd4nxm
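The abstract mentions pairwise kernels for predicting links between node pairs. Below is a minimal sketch of one standard symmetric pairwise kernel, K_pair((a,b),(c,d)) = K(a,c)K(b,d) + K(a,d)K(b,c); whether the paper uses exactly this variant, and how it is combined through multiple kernel learning, is not stated in the snippet, so treat it as an illustrative assumption.

```python
import numpy as np

def symmetric_pairwise_kernel(K, pairs1, pairs2):
    """Common symmetric pairwise kernel for link prediction:
    K_pair((a,b),(c,d)) = K[a,c]*K[b,d] + K[a,d]*K[b,c].
    K is a base kernel matrix over individual nodes; pairs are index tuples.
    This exact variant is an assumption, not taken from the cited paper."""
    out = np.zeros((len(pairs1), len(pairs2)))
    for i, (a, b) in enumerate(pairs1):
        for j, (c, d) in enumerate(pairs2):
            out[i, j] = K[a, c] * K[b, d] + K[a, d] * K[b, c]
    return out
```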

The structure of integral dimensions: Contrasting topological and Cartesian representations

Matt Jones, Robert L. Goldstone
2013 Journal of Experimental Psychology: Human Perception and Performance  
The Cartesian and topological models are tested in a series of experiments using the perceptual-learning phenomenon of dimension differentiation, whereby discrimination training with integral-dimension  ...  Under the present task design, the 2 models make contrasting predictions regarding the analytic representation that will be learned. Results consistently support the Cartesian model.  ...  product of the individual dimensions.  ... 
doi:10.1037/a0029059 pmid:22799263 fatcat:7cyltku4wvfezoivafa244j24m

A Survey on Learning to Hash [article]

Jingdong Wang, Ting Zhang, Jingkuan Song, Nicu Sebe, Heng Tao Shen
2017 arXiv   pre-print
In this paper, we present a comprehensive survey of the learning to hash algorithms, categorize them according to the manners of preserving the similarities into: pairwise similarity preserving, multiwise  ...  Learning to hash is one of the major solutions to this problem and has been widely studied recently.  ...  ACKNOWLEDGEMENTS This work was partially supported by the National Nature Science Foundation of China No. 61632007.  ... 
arXiv:1606.00185v2 fatcat:j5mnu7lfmvby5pfkg5pffk2nae

Multiple SOFMs Working Cooperatively In a Vote-based Ranking System For Network Intrusion Detection

Charlie Obimbo, Haochen Zhou, Ryan Wilson
2011 Procedia Computer Science  
This paper introduces a vote-based ranking system for intrusion detection based on SOFM.  ...  Recent examples of victims include the repeated hacking of the Sony PS3, which left 24.6 million customer accounts vulnerable, and the hacking of websites of both US and Canadian government  ...  For instance, a raw data vector goes through the 10 SOFMs, gets the highest rank on probe, and the connection is then classified as a probe attack.  ... 
doi:10.1016/j.procs.2011.08.041 fatcat:fcimpxb3i5afpfakbqvxw2chkq
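The closing fragment sketches the decision rule: an input connection passes through the 10 trained SOFMs, each map contributes to a ranking of candidate attack classes, and the top-ranked class becomes the label. A hypothetical sketch of such vote-based ranking follows; the class list, the per-map ranking rule `rank_fn`, and the Borda-style vote weights are assumptions, not the paper's exact scheme.

```python
import numpy as np

CLASSES = ["normal", "dos", "probe", "r2l", "u2r"]  # KDD-style categories (assumed)

def vote_based_label(vector, sofms, rank_fn):
    """Hypothetical sketch of vote-based ranking: each trained SOFM ranks the
    candidate classes for an input connection, ranks are summed as votes, and
    the top-ranked class is the decision. `sofms` and `rank_fn` stand in for
    the trained maps and whatever per-map ranking rule the paper uses."""
    votes = np.zeros(len(CLASSES))
    for som in sofms:                    # e.g. 10 maps working cooperatively
        ranking = rank_fn(som, vector)   # ordered list of class names
        for rank, cls in enumerate(ranking):
            votes[CLASSES.index(cls)] += len(CLASSES) - rank
    return CLASSES[int(np.argmax(votes))]  # e.g. "probe" -> probe attack
```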

Identification of functionally related enzymes by learning-to-rank methods [article]

Michiel Stock, Thomas Fober, Eyke Hüllermeier, Serghei Glinca, Gerhard Klebe, Tapio Pahikkala, Antti Airola, Bernard De Baets, Willem Waegeman
2014 arXiv   pre-print
In this work we show that rankings of that kind can be substantially improved by applying kernel-based learning algorithms.  ...  For a given query, the search operation results in a ranking of the enzymes in the database, from very similar to dissimilar enzymes, while information about the biological function of annotated database  ...  T.P. and A.A. are both supported for this work by the Academy of Finland (grant 134020 and 128061, respectively).  ... 
arXiv:1405.4394v1 fatcat:4a6imze6rvciblicls7uu4nqqy

Deep Image Clustering with Tensor Kernels and Unsupervised Companion Objectives [article]

Daniel J. Trosten, Michael C. Kampffmeyer, Robert Jenssen
2020 arXiv   pre-print
These unsupervised companion objectives are constructed based on a proposed generalization of the Cauchy-Schwarz (CS) divergence, from vectors to tensors of arbitrary rank.  ...  The cluster structure is enforced through the idea of unsupervised companion objectives, where separate loss functions are attached to layers in the network.  ...  INTRODUCTION Deep clustering is a subfield of deep learning [1] which considers the design of unsupervised loss functions, in order to train deep learning models for clustering.  ... 
arXiv:2001.07026v2 fatcat:ceoyqqy27jbc3cmqpk6ls6nkgm
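For reference, the standard Cauchy-Schwarz divergence between two densities p and q is recalled below; the paper's contribution is its generalization from vectors to tensors of arbitrary rank, which is not reproduced here.

```latex
% Cauchy-Schwarz (CS) divergence between densities p and q
D_{\mathrm{CS}}(p, q) = -\log
  \frac{\int p(x)\, q(x)\, dx}
       {\sqrt{\int p(x)^{2}\, dx \;\int q(x)^{2}\, dx}}
```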

Transform Learning for Magnetic Resonance Image Reconstruction: From Model-based Learning to Building Neural Networks [article]

Bihan Wen, Saiprasad Ravishankar, Luke Pfister, Yoram Bresler
2019 arXiv   pre-print
We discuss the connections between transform learning and convolutional or filterbank models and corresponding multi-layer extensions, as well as connections to unsupervised and supervised deep learning  ...  This paper provides a review of key works in MRI reconstruction from limited data, with focus on the recent class of TL-based reconstruction methods.  ...  the M − 1 patches most similar to the reference patch P_i x and forms a matrix, whose columns are the reference patch and its matched partners (ordered by degree of match).  ... 
arXiv:1903.11431v1 fatcat:lol4w3mdjjby3pgub6tjqitwxq
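The last fragment describes a block-matching step: for each reference patch P_i x, collect the M − 1 most similar patches and stack them as the columns of a matrix, ordered by degree of match. A minimal sketch follows; the choice of Euclidean distance and the array layout are assumptions, since the snippet does not specify them.

```python
import numpy as np

def block_match(patches, i, M):
    """Sketch of the block-matching step in the snippet: find the M-1 patches
    most similar to reference patch i (Euclidean distance is an assumption)
    and stack them as columns, reference first, ordered by degree of match.
    `patches` is an (n_patches, patch_dim) array."""
    d = np.linalg.norm(patches - patches[i], axis=1)
    order = np.argsort(d)            # reference itself comes first (d = 0)
    cols = patches[order[:M]]        # reference plus its M-1 best matches
    return cols.T                    # matrix with patches as columns
```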

Deep Supervised Quantization by Self-Organizing Map

Min Wang, Wengang Zhou, Qi Tian, Junfu Pu, Houqiang Li
2017 Proceedings of the 2017 ACM on Multimedia Conference - MM '17  
The experiments on several public standard datasets prove the superiority of our approach over the existing ANN search methods.  ...  With the supervised quantization loss, we minimize the differences on the maps between similar image pairs, and maximize the differences on the maps between dissimilar image pairs.  ...  As an efficient and scalable ANN search method, Product Quantization [6] decomposes the space into a Cartesian product of low-dimensional subspaces and quantizes each subspace separately.  ... 
doi:10.1145/3123266.3123415 dblp:conf/mm/WangZTPL17 fatcat:4kjtvt7e7bb5ravxckg6jpphgq
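Since the snippet summarizes Product Quantization (decompose the space into a Cartesian product of low-dimensional subspaces and quantize each separately), a minimal sketch of that baseline is included below. The subspace count m, codebook size k, and the use of scikit-learn's KMeans are illustrative choices, not details from the paper. Encoding a vector then costs m nearest-codeword assignments, one per subspace.

```python
import numpy as np
from sklearn.cluster import KMeans

def train_pq(X, m=4, k=256):
    """Minimal product-quantization sketch: split the space into m subspaces
    (a Cartesian product) and learn a separate k-means codebook per subspace.
    Parameters m and k are illustrative."""
    d = X.shape[1] // m
    return [KMeans(n_clusters=k, n_init=4).fit(X[:, j * d:(j + 1) * d])
            for j in range(m)]

def pq_encode(codebooks, x):
    """Encode a vector as m sub-codeword indices, one per subspace."""
    d = len(x) // len(codebooks)
    return [int(cb.predict(x[j * d:(j + 1) * d][None])[0])
            for j, cb in enumerate(codebooks)]
```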

Unsupervised Model Selection for Variational Disentangled Representation Learning [article]

Sunny Duan, Loic Matthey, Andre Saraiva, Nicholas Watters, Christopher P. Burgess, Alexander Lerchner, Irina Higgins
2020 arXiv   pre-print
We show that our approach performs comparably to the existing supervised alternatives across 5,400 models from six state-of-the-art unsupervised disentangled representation learning model classes.  ...  Our approach, Unsupervised Disentanglement Ranking (UDR), leverages the recent theoretical results that explain why variational autoencoders disentangle (Rolinek et al., 2019), to quantify the quality of  ...  The generative process for this dataset is fully deterministic, resulting in 737,280 total images produced from the Cartesian product of the generative factors. 3D Shapes: A more complex dataset for evaluating  ... 
arXiv:1905.12614v4 fatcat:v4sele6e7rd6jc3r7zsmlf6txq
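The 737,280 figure in the snippet is consistent with the dSprites-style factor grid (3 shapes × 6 scales × 40 orientations × 32 × 32 positions); the short check below multiplies the factor sizes and enumerates the Cartesian product. The factor names are assumptions based on that dataset, not taken from the paper.

```python
from itertools import product

# dSprites-style generative factors (assumed names); one image per element
# of the Cartesian product of the factor value sets.
factor_sizes = {"shape": 3, "scale": 6, "orientation": 40, "pos_x": 32, "pos_y": 32}

n_images = 1
for size in factor_sizes.values():
    n_images *= size
print(n_images)  # 737280

# lazily enumerate all factor combinations, one tuple per image
combinations = product(*(range(s) for s in factor_sizes.values()))
```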

ivis Dimensionality Reduction Framework for Biomacromolecular Simulations [article]

Hao Tian, Peng Tao
2020 arXiv   pre-print
Moreover, the ivis framework is capable of providing a new perspective for deciphering residue-level protein allostery through the feature weights in the neural network.  ...  Compared with other methods, ivis is shown to be superior in constructing a Markov state model (MSM), preserving information of both local and global distances, and maintaining similarity between high dimension  ...  Acknowledgement Research reported in this paper was supported by the National Institute of General  ... 
arXiv:2004.10718v2 fatcat:qhid67dx4rgnxisff7dlftctqe

SUBIC: A supervised, structured binary code for image search [article]

Himalaya Jain, Joaquin Zepeda, Patrick Pérez, Rémi Gribonval
2017 arXiv   pre-print
Yet, unlike binary hashing schemes, these unsupervised methods have not yet benefited from the supervision, end-to-end learning and novel architectures ushered in by the deep learning revolution.  ...  Structured vector quantizers based on product quantization and its variants are usually employed to achieve such compression while minimizing the loss of accuracy.  ...  Accordingly, the database images are ranked using the similarity score (z*)^T b(j).  ... 
arXiv:1708.02932v1 fatcat:hzaxdpj3r5cdlpx5aoc3j26b34
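The final fragment mentions ranking database images by a similarity score involving the query output z* and a database code b(j). Below is a hedged sketch of such asymmetric scoring, where the query's continuous block-softmax output is compared to block-wise one-hot database codes via an inner product; whether this matches the paper's exact score is an assumption based on the snippet.

```python
import numpy as np

def subic_style_scores(z_star, db_codes):
    """Hedged sketch of asymmetric scoring with block-structured binary codes:
    z_star is the query's continuous block-softmax output (blocks concatenated)
    and each row of db_codes is a block-wise one-hot database code, so the
    similarity reduces to an inner product. This interpretation of the score
    in the snippet is an assumption."""
    return db_codes @ z_star            # one score per database image

# ranking example: order = np.argsort(-subic_style_scores(z_star, db_codes))
```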

Unsupervised Neural Quantization for Compressed-Domain Similarity Search [article]

Stanislav Morozov, Artem Babenko
2019 arXiv   pre-print
We tackle the problem of unsupervised visual descriptors compression, which is a key ingredient of large-scale image retrieval systems.  ...  While the deep learning machinery has benefited literally all computer vision pipelines, the existing state-of-the-art compression methods employ shallow architectures, and we aim to close this gap by  ...  The modification of PQ corresponding to such a pre-processing transformation is referred to below as Optimized Product Quantization (OPQ). Non-orthogonal quantizations.  ... 
arXiv:1908.03883v1 fatcat:3aklfr6eiba5jndscoqwd6khjy
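The snippet describes Optimized Product Quantization as ordinary PQ preceded by a learned pre-processing transformation; the short sketch below only shows where that rotation would sit relative to the PQ sketch given earlier in this listing. Learning the orthogonal matrix R (typically by alternating optimization) is omitted and is not detailed in the snippet.

```python
import numpy as np

def opq_preprocess(X, R):
    """Apply a learned orthogonal transform R before the subspace split, so
    that PQ's Cartesian decomposition better fits the data distribution.
    How R is learned is not covered here."""
    return X @ R.T  # rotate, then run ordinary product quantization
```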

Place recognition survey: An update on deep learning approaches [article]

Tiago Barros, Ricardo Pereira, Luís Garrote, Cristiano Premebida, Urbano J. Nunes
2021 arXiv   pre-print
Some lessons learned from this survey include: the importance of NetVLAD for supervised end-to-end learning; the advantages of unsupervised approaches in place recognition, namely for cross-domain applications  ...  learning (DL) frameworks.  ...  [table fragment: supervised approaches (triplet ranking, adversarial learning) vs. unsupervised approaches (MK-MMD [161], autoencoder) for single and cross-domain VPR on Mapillary and Beeldbank [162]]  ... 
arXiv:2106.10458v2 fatcat:hbw47qq2mjhsjhfb5t5vw4wfce

Supervised Quantization for Similarity Search

Xiaojuan Wang, Ting Zhang, Guo-Jun Qi, Jinhui Tang, Jingdong Wang
2016 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
The experiments on several standard datasets show the superiority of our approach over the state-of-the art supervised hashing and unsupervised quantization algorithms.  ...  In this paper, we address the problem of searching for semantically similar images from a large database. We present a compact coding approach, supervised quantization.  ...  Acknowledgements This work was partially supported by the National Basic Research Program of China (973 Program) under Grant 2014CB347600.  ... 
doi:10.1109/cvpr.2016.222 dblp:conf/cvpr/WangZQTW16 fatcat:j6jk3ellsvhhhes6i6i4rr73pa
Showing results 1 — 15 out of 1,290 results