Distance Metric Learning Revisited
[chapter]
2012
Lecture Notes in Computer Science
Learning an appropriate distance metric from data is usually superior to using the default Euclidean distance. ...
In this paper, we revisit the original model proposed by Xing et al. [25] and propose a general formulation of learning a Mahalanobis distance from data. ...
Given some partial information of constraints, the goal of metric learning is to learn a distance metric which reports small distances for similar examples and large distances for dissimilar examples. ...
doi:10.1007/978-3-642-33460-3_24
fatcat:huafoqpvsvfabmogi7nppuon2a
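To make the Mahalanobis-distance setting above concrete, here is a minimal sketch of evaluating such a metric; the matrix M below is an illustrative placeholder, not one learned by the formulation in the paper.

```python
import numpy as np

def mahalanobis(x, y, M):
    """d_M(x, y) = sqrt((x - y)^T M (x - y)) for a positive semi-definite M."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

x = np.array([1.0, 2.0])
y = np.array([2.0, 0.0])
M = np.array([[2.0, 0.0],
              [0.0, 0.5]])      # illustrative PSD matrix; M = I recovers the plain Euclidean distance
print(mahalanobis(x, y, M))     # 2.0
```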
Indirect Point Cloud Registration: Aligning Distance Fields using a Pseudo Third Point Set
[article]
2022
arXiv
pre-print
Inspired by the high precision of deep learning global feature registration, we propose to combine this with distance fields. ...
We generalize the algorithm to a non-Deep Learning setting while retaining the accuracy. ...
Our non-learning method (IFR) achieved performance competitive with the new state-of-the-art PointNetLK-Revisited [14]. ...
arXiv:2205.15954v1
fatcat:tsk3yidkabdmpkzctn5uqzmaeq
Revisiting Metric Learning for Few-Shot Image Classification
[article]
2020
arXiv
pre-print
Specifically, we revisit the classical triplet network from deep metric learning, and extend it into a deep K-tuplet network for few-shot learning, utilizing the relationship among the input samples to ...
Once trained, our network is able to extract discriminative features for unseen novel categories and can be seamlessly incorporated with a non-linear distance metric function to facilitate the few-shot ...
In this work, we revisit metric learning and investigate the potential of triplet-like feature embedding learning for few-shot classification. ...
arXiv:1907.03123v2
fatcat:swfxz3qjnbczdlkxzhe3e7d35q
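The snippet above builds on triplet-style embedding losses; below is a minimal sketch of the standard triplet margin loss that such K-tuplet extensions generalize. The toy embeddings and margin are illustrative, not the paper's architecture.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """max(0, ||a - p||^2 - ||a - n||^2 + margin) for one triplet."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(0.0, d_pos - d_neg + margin)

a = np.array([0.1, 0.9])
p = np.array([0.2, 0.8])   # same class as the anchor
n = np.array([0.9, 0.1])   # different class
print(triplet_loss(a, p, n))   # 0.0 here: the positive is already much closer than the negative
```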
Revisiting Training Strategies and Generalization Performance in Deep Metric Learning
[article]
2020
arXiv
pre-print
Deep Metric Learning (DML) is arguably one of the most influential lines of research for learning visual similarities with many proposed approaches every year. ...
To provide a consistent reference point, we revisit the most widely used DML objective functions and conduct a study of the crucial parameter choices as well as the commonly neglected mini-batch sampling ...
Supplementary: Revisiting Training Strategies and Generalization Performance in Deep Metric Learning
arXiv:2002.08473v9
fatcat:5pfzzxzrevh7xdeglhfa4t3vje
Geometric Mean Metric Learning
[article]
2016
arXiv
pre-print
We revisit the task of learning a Euclidean metric from data. We approach this problem from first principles and formulate it as a surprisingly simple optimization problem. ...
We revisit the task of learning a Euclidean metric. ...
CONCLUSION AND FUTURE WORK We revisited the task of learning a Euclidean metric from weakly supervised data given as pairs of similar and dissimilar points. ...
arXiv:1607.05002v1
fatcat:5bfvf4a4crbtni7kejcczxfd7u
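For readers unfamiliar with geometric-mean style closed forms, here is a hedged sketch of obtaining a Mahalanobis matrix from similar/dissimilar pairs as the matrix geometric mean of the two pair-scatter matrices. Consult the paper for its exact objective and regularized variants; the helper names, toy pairs, and epsilon regularizer here are assumptions.

```python
import numpy as np
from scipy.linalg import sqrtm, inv

def pair_scatter(pairs):
    """Sum of outer products (x - y)(x - y)^T over (x, y) pairs."""
    return sum(np.outer(x - y, x - y) for x, y in pairs)

def geometric_mean_metric(similar_pairs, dissimilar_pairs, eps=1e-6):
    """Mahalanobis matrix as the geometric mean of S^{-1} and D (eps keeps S, D invertible)."""
    dim = similar_pairs[0][0].shape[0]
    S = pair_scatter(similar_pairs) + eps * np.eye(dim)
    D = pair_scatter(dissimilar_pairs) + eps * np.eye(dim)
    S_half = np.real(sqrtm(S))
    # geometric mean of S^{-1} and D: S^{-1/2} (S^{1/2} D S^{1/2})^{1/2} S^{-1/2}
    inner = np.real(sqrtm(S_half @ D @ S_half))
    return inv(S_half) @ inner @ inv(S_half)

rng = np.random.default_rng(0)
sim = [(rng.normal(size=3), rng.normal(size=3)) for _ in range(20)]
dis = [(rng.normal(size=3), rng.normal(size=3)) for _ in range(20)]
A = geometric_mean_metric(sim, dis)
print(np.linalg.eigvalsh(A))    # positive eigenvalues: A defines a valid Mahalanobis metric
```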
Improving GP classifier generalization using a cluster separation metric
2006
Proceedings of the 8th annual conference on Genetic and evolutionary computation - GECCO '06
Here, we revisit the design of fitness functions for genetic programming by explicitly considering the contribution of the wrapper and cost function. ...
Genetic Programming offers freedom in the definition of the cost function that is unparalleled among supervised learning algorithms. However, this freedom goes largely unexploited in previous work. ...
Table 1: Wrapper-Distance Metrics. Columns: Label, Wrapper, Error Metric. ...
doi:10.1145/1143997.1144159
dblp:conf/gecco/GeorgeH06
fatcat:33g47wshgvcezefhdmulo5hhbq
Page 198 of Behavior Research Methods Vol. 30, Issue 2
[page]
1998
Behavior Research Methods
This is an arbitrary but frequency-normalized Euclidean distance metric. ...
The process of frequency normalization and computing the distance metric are intertwined. The distance metric in HAL is referred to as Riverside context units. ...
Integrating metric and semantic maps for vision-only automated parking
2015
2015 IEEE International Conference on Robotics and Automation (ICRA)
The strengths of our method are threefold: the framework allows for the unsupervised evolution of both maps as the environment is revisited by the robot; it uses vision-only sensors, making it appropriate ...
However, it is not clear from the state of the art how to update the semantic layer as the metric map evolves. ...
We envisage a system by which our metric and semantic maps improve as our robots revisit previously-explored areas, as shown in Fig. 2 . ...
doi:10.1109/icra.2015.7139484
dblp:conf/icra/GrimmettBPPFPN15
fatcat:iybalja7gbdyrj7evddgvxijl4
Embedding Deep Metric for Person Re-identification: A Study Against Large Variations
[article]
2016
arXiv
pre-print
In addition, we improve the learning by a metric weight constraint, so that the learned metric has a better generalization ability. ...
On the other hand, manifold learning methods suggest using the Euclidean distance in the local range, combined with the graphical relationship between samples, to approximate the geodesic distance ...
Weight Constraint for Metric Learning. A metric commonly used by deep learning methods is the Euclidean distance [6, 30, 27]. ...
arXiv:1611.00137v1
fatcat:bks6hcgzhvfhlj77xvo76h7yem
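The second snippet above refers to approximating geodesic distances from local Euclidean distances plus a graph over samples; below is an illustrative, Isomap-style sketch of that general idea, not tied to the paper's re-identification network. The toy data and neighborhood size are assumptions.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

X = np.random.default_rng(0).normal(size=(100, 3))          # toy samples (assumed)
knn = kneighbors_graph(X, n_neighbors=5, mode='distance')   # Euclidean distances only between nearby samples
geodesic = shortest_path(knn, method='D', directed=False)   # shortest-path lengths approximate geodesic distances
print(geodesic.shape)                                        # (100, 100)
```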
Exploring Locally Rigid Discriminative Patches for Learning Relative Attributes
2015
Procedings of the British Machine Vision Conference 2015
Another of its variants, "LocalPair+ML", uses a learned distance metric while computing the analogous pairs. ...
In LocalPair+ML, a learned distance metric is used to give more importance to those feature dimensions that are more representative of a particular attribute while computing the analogous pairs. ...
doi:10.5244/c.29.170
dblp:conf/bmvc/VermaJ15
fatcat:axbraoz3tzagbacmhlpuezqke4
A Critical Investigation of Deep Reinforcement Learning for Navigation
[article]
2019
arXiv
pre-print
We choose evaluation metrics that explicitly measure the algorithm's ability to gather and exploit map-information. ...
Deep reinforcement learning (DRL) algorithms, alternatively, approach the problem of navigation in an end-to-end fashion. ...
In addition to reporting the Latency-1:>1 metric, we introduce the Distance-inefficiency metric, the ratio of distance covered by the agent as compared to the approximate shortest path length to the goal ...
arXiv:1802.02274v2
fatcat:5kkyp62xtbdbfpxfdzmboblldm
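The Distance-inefficiency metric described above is a simple ratio; a hedged sketch follows, with hypothetical trajectory inputs and a straight line standing in for the approximate shortest path.

```python
import numpy as np

def path_length(points):
    """Total length of a polyline given as a list of waypoints."""
    pts = np.asarray(points, dtype=float)
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

def distance_inefficiency(agent_trajectory, shortest_path_points):
    """Distance covered by the agent divided by the (approximate) shortest path length."""
    return path_length(agent_trajectory) / path_length(shortest_path_points)

traj = [(0, 0), (1, 1), (2, 1), (3, 3)]       # wandering agent (assumed waypoints)
shortest = [(0, 0), (3, 3)]                   # straight-line approximation of the shortest path
print(distance_inefficiency(traj, shortest))  # > 1 means extra distance travelled
```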
robustica: customizable robust independent component analysis
[article]
2021
bioRxiv
pre-print
Results: We present robustica, a Python-based package to compute robust independent components with a fully customizable clustering algorithm and distance metric. ...
Here, we exploited its customizability to revisit and optimize robust ICA systematically. ...
Here, we developed robustica, the first Python package to carry out robust ICA with a fully customizable clustering metric and algorithm, based on the powerful library scikit-learn [14]. ...
doi:10.1101/2021.12.10.471891
fatcat:6tht27s2bvfrhhrfko224nvove
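The robust-ICA recipe sketched in the abstract (repeat ICA, then cluster the pooled components) can be illustrated directly with scikit-learn. Note this is not the robustica package API, just the underlying idea with assumed toy data and parameter choices.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                 # toy data: 200 samples x 50 features (assumed)
n_components, n_runs = 5, 10

# Run ICA several times with different random restarts and pool the components.
components = []
for seed in range(n_runs):
    ica = FastICA(n_components=n_components, random_state=seed, max_iter=500)
    ica.fit(X)
    components.append(ica.components_)
components = np.vstack(components)             # (n_runs * n_components, 50)

# Resolve ICA's sign ambiguity: flip each component so its largest-magnitude entry is positive.
signs = np.sign(components[np.arange(len(components)), np.argmax(np.abs(components), axis=1)])
components *= signs[:, None]

# Cluster the pooled components and average each cluster into a "robust" component.
labels = AgglomerativeClustering(n_clusters=n_components).fit_predict(components)
robust = np.array([components[labels == k].mean(axis=0) for k in range(n_components)])
print(robust.shape)                            # (5, 50)
```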
Individual Fairness Revisited: Transferring Techniques from Adversarial Robustness
[article]
2020
arXiv
pre-print
Our contributions are twofold: First, we introduce the definition of a minimal metric and characterize the behavior of models in terms of minimal metrics. ...
We turn the definition of individual fairness on its head—rather than ascertaining the fairness of a model given a predetermined metric, we find a metric for a given model that satisfies individual fairness ...
This notion of tightness is captured by the minimality of a distance metric, defined in Definition 4. Definition 4 (Minimal distance metric). Let M be a set of distance metrics in X. ...
arXiv:2002.07738v4
fatcat:pefu7iphjrbczmxoenprvzophq
Locality analysis
2014
Performance Evaluation Review
While researchers in machine learning are developing new techniques to analyze vast quantities of sometimes unstructured data, there is another, not-so-new, form of big data analysis that has been quietly ...
A series of short revisit times (analogous to reuse distance) might indicate that a customer is planning a purchase. ...
higher-order metrics. ...
doi:10.1145/2627534.2627565
fatcat:42zt4pfeuzd6fptyv3a63lj2vi
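The analogy above rests on reuse distance; here is a minimal sketch of computing it for a small access trace (the number of distinct items touched between consecutive accesses to the same item). This quadratic version is purely illustrative and far less efficient than the tree-based algorithms used in real locality analysis.

```python
def reuse_distances(trace):
    """Reuse distance per access: distinct items seen since the last access to the same item."""
    distances = []
    last_seen = {}
    for i, item in enumerate(trace):
        if item in last_seen:
            window = trace[last_seen[item] + 1:i]
            distances.append(len(set(window)))   # distinct items in between
        else:
            distances.append(None)               # first access: infinite reuse distance
        last_seen[item] = i
    return distances

print(reuse_distances(list("abcab")))  # [None, None, None, 2, 2]
```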
Evidence of trapline foraging in honeybees
2016
Journal of Experimental Biology
Recent studies on bumblebees have shown how solitary foragers can learn traplines, minimising travel costs between multiple replenishing feeding locations. ...
A 10-fold increase of between-flower distances considerably intensified this routing behaviour, with bees establishing more stable and more efficient routes at larger spatial scales. ...
Manipulative experiments in bumblebees foraging on artificial flowers show how individual foragers can learn stable, repeatable traplines, minimising travel distances between feeding locations (Lihoreau ...
doi:10.1242/jeb.143214
pmid:27307487
fatcat:3sktggjyzrcandwmjvbbfv3xfm
Showing results 1 — 15 out of 27,179 results