Coreset-Based Adaptive Tracking [article]

Abhimanyu Dubey, Nikhil Naik, Dan Raviv, Rahul Sukthankar, Ramesh Raskar
2015 arXiv   pre-print
We learn an adaptive object appearance model from the coreset tree in constant time and logarithmic space and use it for object tracking by detection.  ...  This coreset-based learning approach can be applied for both real-time learning of small, varied data and fast learning of big data.  ...  Our work contributes to this literature with a novel coreset representation for model-based tracking.  ... 
arXiv:1511.06147v1 fatcat:zujkqnzvnjgytgarjfdanibhoa
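The snippet above describes maintaining a coreset tree so the appearance model can be updated in constant time per frame using only logarithmic space. Below is a minimal sketch of the generic merge-and-reduce coreset-tree bookkeeping that such constructions rely on; the `reduce` step here is a uniform-subsampling placeholder, not the paper's construction, and all names are illustrative.

```python
import random

def reduce(points, m):
    """Placeholder coreset step: keep at most m points.
    A real construction would use sensitivity/importance sampling."""
    return list(points) if len(points) <= m else random.sample(points, m)

class CoresetTree:
    """Merge-and-reduce streaming coreset tree: O(log n) occupied levels,
    amortized O(1) work per inserted batch."""
    def __init__(self, leaf_size, coreset_size):
        self.leaf_size = leaf_size
        self.m = coreset_size
        self.buffer = []
        self.levels = []          # levels[i] holds at most one coreset

    def insert(self, point):
        self.buffer.append(point)
        if len(self.buffer) == self.leaf_size:
            self._carry(reduce(self.buffer, self.m))
            self.buffer = []

    def _carry(self, coreset):
        # Binary-counter style carry: merge with an occupied level, reduce, move up.
        i = 0
        while i < len(self.levels) and self.levels[i] is not None:
            coreset = reduce(self.levels[i] + coreset, self.m)
            self.levels[i] = None
            i += 1
        if i == len(self.levels):
            self.levels.append(None)
        self.levels[i] = coreset

    def coreset(self):
        parts = [c for c in self.levels if c is not None] + [self.buffer]
        return reduce(sum(parts, []), self.m)
```

Each inserted batch touches at most O(log n) levels, which is where the constant-time update and logarithmic-space behavior quoted above come from.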

Improving Pedestrian Prediction Models with Self-Supervised Continual Learning [article]

Luzia Knoedler, Chadi Salmi, Hai Zhu, Bruno Brito, Javier Alonso-Mora
2022 arXiv   pre-print
Autonomous mobile robots require accurate human motion predictions to safely and efficiently navigate among pedestrians, whose behavior may adapt to environmental changes.  ...  In particular, we exploit online streams of pedestrian data, commonly available from the robot's detection and tracking pipeline, to refine the prediction model and its performance in unseen scenarios.  ...  Then, the model is adapted using D_k and a set containing examples of previous tasks, referred to as the coreset D_coreset. The Coreset Rehearsal strategy is applied to mitigate forgetting.  ... 
arXiv:2202.07606v1 fatcat:2b2koisvdngotnjumlu4rzwh5i
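As a concrete illustration of the Coreset Rehearsal step quoted above (adapting on the new data D_k together with a stored coreset of past examples), here is a hedged PyTorch-style sketch; `model`, `optimizer`, `loss_fn`, and the dataset variables are assumed placeholders of (input, target) tensor pairs, and the coreset refresh is a plain uniform subsample rather than the paper's selection rule.

```python
import random
import torch

def adapt_with_rehearsal(model, optimizer, loss_fn, D_k, D_coreset,
                         steps=100, batch_size=32, coreset_size=64):
    """Adapt the predictor on the new task data D_k mixed with a rehearsal
    coreset D_coreset of earlier examples to mitigate forgetting."""
    data = list(D_k) + list(D_coreset)
    for _ in range(steps):
        batch = random.sample(data, min(batch_size, len(data)))
        x = torch.stack([xy[0] for xy in batch])
        y = torch.stack([xy[1] for xy in batch])
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    # Refresh the stored coreset (here: a uniform subsample of the new data).
    new_coreset = random.sample(list(D_k), min(coreset_size, len(D_k)))
    return model, new_coreset
```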

Coresets for visual summarization with applications to loop closure

Mikhail Volkov, Guy Rosman, Dan Feldman, John W. Fisher, Daniela Rus
2015 2015 IEEE International Conference on Robotics and Automation (ICRA)  
We demonstrate how to utilize the resulting structure for efficient loop-closure by a novel sampling approach that is adaptive to the structure of the video.  ...  Using a highly efficient compression coreset method, we formulate a new method for hierarchical retrieval of frames from large video streams collected online by a moving robot.  ...  We note that unlike coresets for clustering such as the one used in [21], parallel computation requires us to keep track of the time associated with the datapoints sent to each processor.  ... 
doi:10.1109/icra.2015.7139704 dblp:conf/icra/VolkovRFFR15 fatcat:5vgaovm62fanln54clq2ootws4

Visualization of Big Spatial Data using Coresets for Kernel Density Estimates [article]

Yan Zheng, Yi Ou, Alexander Lex, Jeff M. Phillips
2017 arXiv   pre-print
We also introduce a method to ensure that thresholding of low values does not omit any regions above the desired threshold when working with sampled data.  ...  One approach is to make that threshold adaptive.  ...  Analysts can specify the error threshold ε, based on which the system automatically generates a coreset or a random sample.  ... 
arXiv:1709.04453v1 fatcat:b7o2vlfv2vapplog6g4tou4ct4
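The snippet above mentions generating either a coreset or a random sample from a user-specified error threshold ε. A minimal sketch of the random-sample branch, using the standard 1/ε² style sample-size bound for kernel density estimates; the constant and the failure probability δ are illustrative assumptions, not the paper's exact bound.

```python
import math
import random

def kde_sample_coreset(points, eps, delta=0.05):
    """Uniform random sample whose size scales as (1/eps^2) * log(1/delta);
    for kernel density estimates such a sample approximates the KDE within
    roughly eps error with probability 1 - delta (constants illustrative)."""
    size = min(len(points), math.ceil((1.0 / eps ** 2) * math.log(1.0 / delta)))
    return random.sample(points, size)
```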

K-robots clustering of moving sensors using coresets

Dan Feldman, Stephanie Gil, Ross A. Knepper, Brian Julian, Daniela Rus
2013 2013 IEEE International Conference on Robotics and Automation  
We prove that both the size of our coreset and its update time are polynomial in k log(n)/ε.  ...  Our primary contribution is an algorithm to compute and maintain a small representative set, called a kinematic coreset, of the n moving clients.  ...  Our approach is to determine the kinematic coreset, or sparse representative set, of clients and control the servers to track this coreset.  ... 
doi:10.1109/icra.2013.6630677 dblp:conf/icra/FeldmanGKJR13 fatcat:u7sjv7lf5vdenooklcd4zb3k5m
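The approach quoted above is to summarize the n moving clients by a small representative set and steer the servers to track it. The sketch below is only a rough analogue: plain k-means centers stand in for the paper's kinematic coreset, and the proportional move toward the nearest representative is an assumed, simplistic controller.

```python
import numpy as np
from sklearn.cluster import KMeans

def control_step(client_positions, server_positions, k, gain=0.5):
    """One illustrative control step: summarize the clients by k representative
    points (k-means centers as a stand-in for a kinematic coreset), then move
    each server a fraction `gain` of the way toward its nearest representative."""
    clients = np.asarray(client_positions, dtype=float)
    servers = np.asarray(server_positions, dtype=float)
    reps = KMeans(n_clusters=k, n_init=10).fit(clients).cluster_centers_
    new_positions = []
    for s in servers:
        target = reps[np.argmin(np.linalg.norm(reps - s, axis=1))]
        new_positions.append(s + gain * (target - s))
    return np.array(new_positions), reps
```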

Bayesian Coreset Construction via Greedy Iterative Geodesic Ascent [article]

Trevor Campbell, Tamara Broderick
2018 arXiv   pre-print
To address this shortcoming, we develop greedy iterative geodesic ascent (GIGA), a novel algorithm for Bayesian coreset construction that scales the coreset log-likelihood optimally.  ...  We demonstrate that these algorithms scale the coreset log-likelihood suboptimally, resulting in underestimated posterior uncertainty.  ...  There are two options for practical implementation: either (A) we keep track of only w_t, recomputing and normalizing (w_t) at each iteration to obtain w_{t+1}; or (B) we keep track of and store both  ... 
arXiv:1802.01737v2 fatcat:myl4ewuszrfdrgr76gxf4zoa7a
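The quoted implementation note contrasts two bookkeeping options for the coreset weights: (A) store only w_t and renormalize on demand, or (B) store both the raw and the normalized vectors. A toy sketch of that trade-off only (this is generic weight bookkeeping, not the GIGA geodesic-ascent update itself; names are illustrative).

```python
import numpy as np

def normalize(w):
    n = np.linalg.norm(w)
    return w / n if n > 0 else w

class WeightsOptionA:
    """Option (A): keep only the raw weights; renormalize whenever needed."""
    def __init__(self, dim):
        self.w = np.zeros(dim)
    def update(self, idx, step):
        self.w[idx] += step
    def normalized(self):
        return normalize(self.w)          # recomputed on every call

class WeightsOptionB:
    """Option (B): keep both raw and normalized weights, trading memory
    and per-update work for cheap reads."""
    def __init__(self, dim):
        self.w = np.zeros(dim)
        self.w_hat = np.zeros(dim)
    def update(self, idx, step):
        self.w[idx] += step
        self.w_hat = normalize(self.w)    # kept in sync after each update
    def normalized(self):
        return self.w_hat
```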

Approximate Clustering on Distributed Data Streams

Qi Zhang, Jinze Liu, Wei Wang
2008 2008 IEEE 24th International Conference on Data Engineering  
monitoring, location-tracking services, etc.  ...  These two coreset-based streaming k-median algorithms achieve a better space bound than Guha's algorithm [12] mentioned earlier.  ... 
doi:10.1109/icde.2008.4497522 dblp:conf/icde/ZhangLW08 fatcat:uc4sefkn2jg4fim7lktluirtk4

Sharing Models or Coresets: A Study based on Membership Inference Attack [article]

Hanlin Lu, Changchang Liu, Ting He, Shiqiang Wang, Kevin S. Chan
2020 arXiv   pre-print
: collecting and aggregating local models (federated learning) and collecting and training over representative data summaries (coreset).  ...  Distributed machine learning generally aims at training a global model based on distributed data without collecting all the data to a centralized location, where two different approaches have been proposed  ...  Each approach has been independently studied, such as adaptive federated learning, communication-efficient coreset construction (Balcan et al., 2013), and the robustness of coreset in supporting multiple  ... 
arXiv:2007.02977v1 fatcat:qsovg45ejzgczlinjl5spb6mfe

Fast Streaming k-Means Clustering with Coreset Caching

Yu Zhang, Kanat Tangwongsan, Srikanta Tirthapura
2020 IEEE Transactions on Knowledge and Data Engineering  
coreset caching.  ...  We also present an algorithm called OnlineCC that integrates the coreset caching idea with a simple sequential streaming k-means algorithm.  ...  This caching design helps the clustering system adapt in the face of both bursty queries and occasional queries.  ... 
doi:10.1109/tkde.2020.3018744 fatcat:25fk5gpc7fbrdpazhzw2kxjv2a
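The coreset-caching idea quoted above can be pictured as follows: remember the coreset returned at the previous query and, at the next query, only summarize the points that arrived in between. A minimal sketch with a uniform-subsampling placeholder standing in for the actual k-means coreset construction.

```python
import random

def reduce(points, m):
    """Placeholder coreset step (uniform subsample); a real system would
    use a k-means coreset construction here."""
    return list(points) if len(points) <= m else random.sample(points, m)

class CoresetCache:
    """Cache the coreset answered at the last query so the next query only
    has to summarize points that arrived since then."""
    def __init__(self, coreset_size):
        self.m = coreset_size
        self.cached = []       # coreset as of the previous query
        self.recent = []       # raw points seen since the previous query

    def insert(self, point):
        self.recent.append(point)

    def query(self):
        # Summarize only the new points, then merge with the cached coreset.
        new_part = reduce(self.recent, self.m)
        self.cached = reduce(self.cached + new_part, self.m)
        self.recent = []
        return self.cached
```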

Coresets for Minimum Enclosing Balls over Sliding Windows [article]

Yanhao Wang, Yuchen Li, Kian-Lee Tan
2019 arXiv   pre-print
Coresets are important tools to generate concise summaries of massive datasets for approximate analysis.  ...  The proposed algorithms also support coresets for MEB in a reproducing kernel Hilbert space (RKHS).  ...  The errors of coreset-based algorithms are computed based on the definition of coresets: we compute the minimal λ such that W_t ⊂ λ · MEB(S_t), and use ε = (λ · r*(S_t) − r*(W_t)) / r*(W_t) for the relative  ... 
arXiv:1905.03718v1 fatcat:o6vu37et7vdazdbgxorfurn5q4
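The relative-error definition quoted in the snippet, ε = (λ · r*(S_t) − r*(W_t)) / r*(W_t), can be evaluated directly once the coreset MEB and the exact window radius are known. A small sketch, assuming an exact MEB solver supplies r*(W_t); names are illustrative.

```python
import math

def relative_meb_error(window_points, coreset_center, coreset_radius, exact_radius):
    """Relative error of a coreset-based MEB:
    lambda is the minimal scaling such that every window point lies in
    lambda * MEB(S_t); the error is (lambda * r*(S_t) - r*(W_t)) / r*(W_t),
    where exact_radius = r*(W_t) comes from an exact MEB solver."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    lam = max(dist(p, coreset_center) for p in window_points) / coreset_radius
    return (lam * coreset_radius - exact_radius) / exact_radius
```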

Robust Coreset Construction for Distributed Machine Learning [article]

Hanlin Lu, Ming-Ju Li, Ting He, Shiqiang Wang, Vijaykrishnan Narayanan, Kevin S Chan
2020 arXiv   pre-print
Although viewed as a proxy of the original dataset, each coreset is only designed to approximate the cost function of a specific machine learning problem, and thus different coresets are often required  ...  Motivated by empirical evidence that suitably-weighted k-clustering centers provide a robust coreset, we harden the observation by establishing theoretical conditions under which the coreset provides a  ...  Adaptation for coreset construction: First, we skip step (3) (i.e., computation of Q_D) and directly use D = ∪_{j=1}^{n} D_j as the coreset.  ... 
arXiv:1904.05961v3 fatcat:35z27iab75fcnhs6e36en4phl4
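The "suitably-weighted k-clustering centers" observation quoted above has a very direct realization: run k-means and weight each center by the size of its cluster. A minimal sketch using scikit-learn; whether this weighting satisfies the paper's robustness conditions is problem-dependent.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_coreset(X, k):
    """Robust-coreset candidate: the k-means centers of X, each weighted by
    the number of points assigned to it."""
    km = KMeans(n_clusters=k, n_init=10).fit(X)
    centers = km.cluster_centers_
    weights = np.bincount(km.labels_, minlength=k).astype(float)
    return centers, weights
```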

Active Learning for Bayesian 3D Hand Pose Estimation [article]

Razvan Caramalau, Binod Bhattarai, Tae-Kyun Kim
2021 arXiv   pre-print
Thus, the learnt numerically-stable logarithmic term tracks the noise present in the data.  ...  Pool-based Active Learning Strategy.  ... 
arXiv:2010.00694v2 fatcat:vu2zl5ityvgjdj3xtolhnzutoa

Training Gaussian Mixture Models at Scale via Coresets [article]

Mario Lucic and Matthew Faulkner and Andreas Krause and Dan Feldman
2018 arXiv   pre-print
Empirical evaluation on several real-world datasets suggests that our coreset-based approach enables significant reduction in training time with negligible approximation error.  ...  A coreset is a weighted subset of the data, which guarantees that models fitting the coreset also provide a good fit for the original data set.  ...  The coreset construction algorithm is based on a simple importance sampling scheme and has linear running time in n.  ... 
arXiv:1703.08110v2 fatcat:n4ohvdwk4ngwbeha6pljdnzvti
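The snippet describes a simple importance-sampling construction with linear running time. The sketch below shows a generic sensitivity-style scheme, with importance proportional to squared distance from a few random seed centers and weights 1/(m·p_i); these proxy sensitivities are an assumption for illustration and are not the paper's GMM-specific bounds.

```python
import numpy as np

def importance_sampling_coreset(X, m, n_seeds=10, rng=None):
    """Generic importance-sampling coreset: points far from a rough set of
    seed centers get higher sampling probability, and each sampled point
    receives weight 1 / (m * p_i) so weighted sums are unbiased in expectation."""
    rng = rng or np.random.default_rng(0)
    X = np.asarray(X, dtype=float)
    n = len(X)
    seeds = X[rng.choice(n, size=min(n_seeds, n), replace=False)]
    d2 = np.min(((X[:, None, :] - seeds[None, :, :]) ** 2).sum(-1), axis=1)
    s = d2 + d2.mean()                 # smoothed sensitivity proxy
    p = s / s.sum()
    idx = rng.choice(n, size=m, replace=True, p=p)
    weights = 1.0 / (m * p[idx])
    return X[idx], weights
```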

Small quantum computers and large classical data sets [article]

Aram W. Harrow
2020 arXiv   pre-print
These algorithms use data reduction techniques to construct a weighted subset of X called a coreset that yields approximately the same loss for each model.  ...  The coreset can be constructed by the classical computer alone, or via an interactive protocol in which the outputs of the quantum computer are used to help decide which elements of X to use.  ...  Adaptive coresets from zero-sum games.  ... 
arXiv:2004.00026v1 fatcat:inc6u2cdizcbxbajqrnjq35hxm

Sequential Graph Convolutional Network for Active Learning

Razvan Caramalau, Binod Bhattarai, Tae-Kyun Kim
2021 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)  
To this end, we utilise the graph node embeddings and their confidence scores and adapt sampling techniques such as CoreSet and uncertainty-based methods to query the nodes.  ...  We propose a novel pool-based Active Learning framework constructed on a sequential Graph Convolution Network (GCN).  ...  UncertainGCN is based on the standard AL method uncertainty sampling [31] which tracks the confidence scores of the designed graph nodes.  ... 
doi:10.1109/cvpr46437.2021.00946 fatcat:r65cu6zx4bgsphoz3k7knefnl4
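CoreSet-style querying, as mentioned in the snippet, is usually implemented as k-center greedy selection on the embeddings: repeatedly pick the unlabeled point farthest from everything already covered. A minimal sketch on generic embedding vectors; the function and variable names are illustrative.

```python
import numpy as np

def coreset_select(embeddings, labeled_idx, budget):
    """k-center greedy (CoreSet) query: repeatedly pick the point whose
    distance to the nearest already-covered point is largest."""
    emb = np.asarray(embeddings, dtype=float)
    if len(labeled_idx) > 0:
        min_dist = np.min(
            np.linalg.norm(emb[:, None, :] - emb[list(labeled_idx)][None, :, :],
                           axis=-1),
            axis=1)
    else:
        min_dist = np.full(len(emb), np.inf)  # no labels yet: first pick is arbitrary
    picked = []
    for _ in range(budget):
        i = int(np.argmax(min_dist))
        picked.append(i)
        min_dist = np.minimum(min_dist, np.linalg.norm(emb - emb[i], axis=1))
    return picked
```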
Showing results 1 — 15 out of 201 results