1,692 Hits in 2.3 sec

Robust Probabilistic Modeling with Bayesian Data Reweighting [article]

Yixin Wang, Alp Kucukelbir, David M. Blei
2018 arXiv   pre-print
Probabilistic models analyze data by relying on a set of assumptions. Data that exhibit deviations from these assumptions can undermine inference and prediction quality.  ...  Robust models offer protection against mismatch between a model's assumptions and reality. We propose a way to systematically detect and mitigate mismatch of a large class of probabilistic models.  ...  Reweighted probabilistic models (RPM) offer a new approach to robust modeling.  ... 
arXiv:1606.03860v3 fatcat:v5iqd67k3bhhpdimz3wqf52jxu
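A minimal sketch of the reweighting idea described above: each datum's log-likelihood is multiplied by a latent weight with a Beta prior, so poorly fitting points are automatically down-weighted. The toy Gaussian location model, the Beta(5, 1) prior, and the grid-based coordinate updates are illustrative assumptions, not the paper's construction.

```python
# Toy sketch: MAP coordinate ascent for a reweighted Gaussian location model.
# Each datum's log-likelihood is multiplied by a latent weight w_i in (0, 1)
# with a Beta(a, b) prior; outliers get down-weighted automatically.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 0.5, 5)])  # 5 outliers

a, b = 5.0, 1.0                      # Beta prior: favors weights near 1
mu = x.mean()
grid = np.linspace(1e-3, 1 - 1e-3, 200)   # 1-D grid for the weight updates
w = np.ones_like(x)

for _ in range(50):
    # Update the location given the weights (weighted estimate).
    mu = np.sum(w * x) / np.sum(w)
    # Update each weight given the location: maximize
    # w * log N(x_i | mu, 1) + (a-1) log w + (b-1) log(1-w) over the grid.
    loglik = -0.5 * (x - mu) ** 2 - 0.5 * np.log(2 * np.pi)
    obj = (grid[None, :] * loglik[:, None]
           + (a - 1) * np.log(grid)[None, :]
           + (b - 1) * np.log(1 - grid)[None, :])
    w = grid[obj.argmax(axis=1)]

print("robust estimate:", round(mu, 3))              # close to 0 despite outliers
print("mean weight of outliers:", round(w[-5:].mean(), 3))
```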

Instance-Dependent PU Learning by Bayesian Optimal Relabeling [article]

Fengxiang He, Tongliang Liu, Geoffrey I Webb, Dacheng Tao
2020 arXiv   pre-print
In this paper, we assume that a positive example with a higher P(Y = 1|X) is more likely to be labelled and propose a probabilistic-gap based PU learning algorithm.  ...  Specifically, by treating the unlabelled data as noisy negative examples, we could automatically label a group of positive and negative examples whose labels are identical to the ones assigned by a Bayesian  ...  robust.  ... 
arXiv:1808.02180v2 fatcat:yppwhqbuyveqbpq6qomrasbb7q
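A toy sketch of the relabeling idea: treat unlabelled points as noisy negatives, fit a probabilistic classifier, and assign labels to the unlabelled points with the most extreme predicted probabilities. The logistic-regression base learner and the quantile cut-offs are illustrative; the paper's Bayesian-optimal relabeling rule is not reproduced here.

```python
# Toy sketch of PU relabeling: treat unlabelled data as noisy negatives,
# fit a probabilistic classifier, then relabel the most confident
# unlabelled points and retrain on the relabelled subset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000
X = rng.normal(size=(n, 2))
y_true = (X[:, 0] + X[:, 1] > 0).astype(int)            # hidden ground truth
labeled_pos = (y_true == 1) & (rng.random(n) < 0.3)     # only some positives labelled

# Step 1: naive classifier treating all unlabelled points as negatives.
clf = LogisticRegression().fit(X, labeled_pos.astype(int))
p_hat = clf.predict_proba(X)[:, 1]

# Step 2: relabel confident unlabelled points; drop the ambiguous rest.
unlabeled = ~labeled_pos
hi = np.quantile(p_hat[unlabeled], 0.80)    # illustrative cut-offs, not the
lo = np.quantile(p_hat[unlabeled], 0.40)    # paper's Bayesian-optimal rule
new_pos = unlabeled & (p_hat >= hi)
new_neg = unlabeled & (p_hat <= lo)
keep = labeled_pos | new_pos | new_neg
y_relabel = (labeled_pos | new_pos).astype(int)

clf2 = LogisticRegression().fit(X[keep], y_relabel[keep])
print("accuracy on all points:", (clf2.predict(X) == y_true).mean())
```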

Sample Debiasing in the Themis Open World Database System (Extended Version) [article]

Laurel Orr, Magda Balazinska, Dan Suciu
2020 arXiv   pre-print
We leverage a priori population aggregate information to develop and combine two different approaches for automatic debiasing: sample reweighting and Bayesian network probabilistic modeling.  ...  We build a prototype of Themis and demonstrate that Themis achieves higher query accuracy than the default AQP approach, an alternative sample reweighting technique, and a variety of Bayesian network models  ...  Themis's hybrid approach merges sample reweighting with population probabilistic modeling to achieve a 70 percent improvement in the median error when compared to uniform reweighting for heavy hitter queries  ... 
arXiv:2002.09799v2 fatcat:2pyct6ektfdkjlbsmdngeuzasi
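A minimal sketch of aggregate-driven sample reweighting (raking / iterative proportional fitting): per-row weights are adjusted until the weighted sample matches known population counts on two attributes. The table and the aggregates below are invented for illustration, and Themis's hybrid of reweighting with Bayesian network modeling is more involved than this.

```python
# Toy sketch of sample reweighting from aggregates: iterative proportional
# fitting (raking) so the weighted sample matches known population counts
# on two categorical attributes.
import numpy as np
import pandas as pd

sample = pd.DataFrame({
    "region": ["N", "N", "S", "S", "S", "N", "S", "N"],
    "device": ["mobile", "desktop", "mobile", "mobile",
               "desktop", "mobile", "desktop", "desktop"],
})
pop_region = {"N": 600, "S": 400}          # hypothetical population counts
pop_device = {"mobile": 700, "desktop": 300}

w = np.full(len(sample), sum(pop_region.values()) / len(sample))
for _ in range(50):                         # alternate marginal corrections
    for col, target in (("region", pop_region), ("device", pop_device)):
        for value, count in target.items():
            mask = (sample[col] == value).to_numpy()
            w[mask] *= count / w[mask].sum()

weighted = sample.assign(weight=w)
print(weighted.groupby("region")["weight"].sum())   # ~ pop_region
print(weighted.groupby("device")["weight"].sum())   # ~ pop_device
```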

Matrix and Tensor Factorization Methods for Natural Language Processing

Guillaume Bouchard, Jason Naradowsky, Sebastian Riedel, Tim Rocktäschel, Andreas Vlachos
2015 Tutorials  
We show how to interpret low-rank models as probabilistic models (Bishop, 1999) and how we can extend SVD algorithms that can factorize non-standard matrices (i.e. with non-Gaussian noise and missing data) using gradient descent, reweighted SVD or Frank-Wolfe algorithms.  ... 
doi:10.3115/v1/p15-5005 dblp:conf/acl/BouchardNRRV15 fatcat:q2zksc5hqfe5ffgbpv477trdu4
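A small sketch of the kind of factorization the tutorial covers: a low-rank model fit by gradient descent on the squared error over the observed entries only, so missing data are handled naturally. The dimensions, learning rate, and regularization strength are arbitrary choices, not the tutorial's code.

```python
# Minimal sketch: low-rank factorization of a matrix with missing entries,
# fit by gradient descent on the squared error over observed entries only.
import numpy as np

rng = np.random.default_rng(2)
m, n, k = 30, 20, 3
U_true, V_true = rng.normal(size=(m, k)), rng.normal(size=(n, k))
X = U_true @ V_true.T
mask = rng.random((m, n)) < 0.6            # 60% of entries observed

U = rng.normal(scale=0.1, size=(m, k))
V = rng.normal(scale=0.1, size=(n, k))
lr = 0.01
for step in range(2000):
    R = mask * (U @ V.T - X)               # residual on observed entries only
    gU = R @ V + 1e-3 * U                  # gradients with a small L2 penalty
    gV = R.T @ U + 1e-3 * V
    U -= lr * gU
    V -= lr * gV

err = np.abs((U @ V.T - X)[~mask]).mean()  # error on the held-out entries
print("mean abs error on unobserved entries:", round(err, 3))
```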

Page 7830 of Mathematical Reviews Vol. , Issue 99k [page]

1999 Mathematical Reviews  
Christophe Croux and Peter Filzmoser, A robust biplot representation of two-way tables (355-361); Paula Brito, Symbolic clustering of probabilistic data (385-390); Francisco de A.  ...  Dose and W. von der Linden, Model comparison with energy confinement data from large fusion experiments (137-145); V. Dose, R.  ... 

Robust Gaussian Process Regression Based on Iterative Trimming [article]

Zhao-Zhou Li, Lu Li, Zhengyi Shao
2020 arXiv   pre-print
We propose a new robust GP regression algorithm that iteratively trims a portion of the data points with the largest deviation from the predicted mean.  ...  The model prediction of the Gaussian process (GP) regression can be significantly biased when the data are contaminated by outliers.  ...  As the basis of Bayesian optimization, GP can serve as a probabilistic surrogate model for problems that demand sample efficiency.  ... 
arXiv:2011.11057v1 fatcat:qsgog6u3gvbivfyldkzmsnwhxi
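A sketch of the iterative trimming loop described above: fit a GP on the currently kept points, then drop the fraction of points with the largest absolute residuals from the predictive mean and refit. The RBF-plus-white-noise kernel, the 15% trimming fraction, and the use of scikit-learn's GaussianProcessRegressor are illustrative choices rather than the authors' implementation.

```python
# Sketch of iterative trimming for robust GP regression: repeatedly fit the
# GP on the kept points, then drop the points with the largest residuals.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X = np.sort(rng.uniform(0, 10, 80))[:, None]
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 80)
y[rng.choice(80, 8, replace=False)] += rng.normal(0, 3.0, 8)   # inject outliers

keep = np.ones(80, dtype=bool)
trim_frac = 0.15                           # illustrative trimming fraction
for _ in range(5):
    gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(0.1)).fit(X[keep], y[keep])
    resid = np.abs(y - gp.predict(X))
    cutoff = np.quantile(resid, 1 - trim_frac)
    keep = resid <= cutoff                 # keep all but the worst residuals

print("points kept:", keep.sum(), "of", len(y))
```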

Dynamic filtering of sparse signals using reweighted ℓ1

Adam S. Charles, Christopher J. Rozell
2013 2013 IEEE International Conference on Acoustics, Speech and Signal Processing  
., L1 minimization) utilize strong structural models for a single signal, but do not admit obvious ways to incorporate dynamic models for data streams.  ...  The resulting algorithm achieves very good performance, and appears to be particularly robust to errors in the dynamic signal model.  ...  As described in [13], the RWL1 approach described above can be viewed as a Bayesian inference problem for a hierarchical probabilistic model.  ... 
doi:10.1109/icassp.2013.6638908 dblp:conf/icassp/CharlesR13 fatcat:3m65gbcmm5blrcgf46j2bx3cva
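A static (single-signal) sketch of the reweighted ℓ1 idea: solve a weighted LASSO, set each weight to 1/(|x_i| + ε), and repeat, which concentrates the penalty on coefficients that look inactive. The weighted problem is solved here with scikit-learn's Lasso via a column-rescaling trick; the paper's dynamic extension to streaming signals is not shown.

```python
# Sketch of reweighted L1 (RWL1) sparse recovery: alternate between a
# weighted LASSO solve and the weight update w_i = 1 / (|x_i| + eps).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n, p, k = 60, 128, 6
A = rng.normal(size=(n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.normal(0, 2, k)
y = A @ x_true + rng.normal(0, 0.01, n)

w = np.ones(p)
eps, lam = 0.1, 0.01
for _ in range(5):
    A_w = A / w                                    # column rescaling trick:
    z = Lasso(alpha=lam, fit_intercept=False,      # min ||y - A_w z||^2 + lam ||z||_1
              max_iter=10000).fit(A_w, y).coef_
    x_hat = z / w                                  # map back to original scale
    w = 1.0 / (np.abs(x_hat) + eps)                # reweight toward sparsity

print("nonzeros recovered:", int(np.sum(np.abs(x_hat) > 1e-3)), "true:", k)
```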

Learning to Reweight Imaginary Transitions for Model-Based Reinforcement Learning [article]

Wenzhen Huang, Qiyue Yin, Junge Zhang, Kaiqi Huang
2021 arXiv   pre-print
Model-based reinforcement learning (RL) is more sample efficient than model-free RL by using imaginary trajectories generated by the learned dynamics model.  ...  Visualization of our changing weights further validates the necessity of utilizing the reweighting scheme.  ...  Previously, model-based RL with linear or Bayesian models has obtained excellent performance on the simple low dimensional control problems (Abbeel, Quigley, and Ng, 2006; Deisenroth and Rasmussen, 2011  ... 
arXiv:2104.04174v1 fatcat:b2t6q7fwczfhhhtynb73gmh5ia

The Generalized LASSO

V. Roth
2004 IEEE Transactions on Neural Networks  
For regression functionals which can be modeled as iteratively reweighted least-squares (IRLS) problems, we present a highly efficient algorithm with guaranteed global convergence.  ...  This defines a unique framework for sparse regression models in the very rich class of IRLS models, including various types of robust regression models and logistic regression.  ...  SPARSE BAYESIAN KERNEL REGRESSION Applying a Bayesian method to the regression problem requires us to specify a set of probabilistic models of the data, see, e.g., [20].  ... 
doi:10.1109/tnn.2003.809398 pmid:15387244 fatcat:65x4jj3hkfaohc5x3gxibz3dxu
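A generic sketch of the IRLS scheme the abstract refers to, here for Huber-type robust regression: alternate between computing per-point weights from the current residuals and solving the resulting weighted least-squares problem. This is textbook IRLS, not Roth's generalized LASSO algorithm, and the Huber threshold is an arbitrary choice.

```python
# Sketch of iteratively reweighted least squares (IRLS) for Huber-type
# robust regression.
import numpy as np

rng = np.random.default_rng(5)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(0, 0.3, n)
y[:10] += 15.0                                 # gross outliers

delta = 1.0                                    # Huber threshold (illustrative)
beta = np.linalg.lstsq(X, y, rcond=None)[0]    # ordinary least-squares start
for _ in range(20):
    r = y - X @ beta
    w = np.where(np.abs(r) <= delta, 1.0, delta / np.abs(r))   # Huber weights
    W = w[:, None]
    beta = np.linalg.solve(X.T @ (W * X), X.T @ (w * y))       # weighted LS

print("robust estimate:", np.round(beta, 2))   # near [1, 2] despite outliers
```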

Online Learning of a Probabilistic and Adaptive Scene Representation [article]

Zike Yan, Xin Wang, Hongbin Zha
2021 arXiv   pre-print
In this paper, we represent the scene with a Bayesian nonparametric mixture model, seamlessly describing per-point occupancy status with a continuous probability density function.  ...  The consistent probabilistic formulation assures a generative model that is adaptive to different sensor characteristics, and the model complexity can be dynamically adjusted on-the-fly according to different  ...  Probabilistic 3D Data Fusion in Real-time 3D sequential data are usually redundant and noisy.  ... 
arXiv:2103.16832v1 fatcat:qx3ncxz7rnhqbidvxqqvoqswoq

High Dimensional Process Monitoring Using Robust Sparse Probabilistic Principal Component Analysis [article]

Mohammad Nabhan, Yajun Mei, Jianjun Shi
2019 arXiv   pre-print
The developed monitoring technique uses robust sparse probabilistic PCA to reduce the dimensionality of the data stream while retaining interpretability.  ...  The proposed methodology utilizes Bayesian variational inference to obtain the estimates of a probabilistic representation of PCA.  ...  This is achieved by using a probabilistic model with Laplacian priors to extract robust sparse principal components.  ... 
arXiv:1904.09514v1 fatcat:bxehg6iyjnbjbeocggnpzmvs24
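For orientation, a sketch of plain probabilistic PCA via its closed-form maximum-likelihood solution (Tipping and Bishop): the loading matrix is built from the top eigenvectors of the sample covariance, and the noise variance is the mean of the discarded eigenvalues. The paper's robust sparse variant with Laplacian priors and variational inference goes well beyond this baseline.

```python
# Sketch of plain probabilistic PCA via the closed-form ML solution:
# W = V_k (L_k - sigma^2 I)^{1/2}, sigma^2 = mean of discarded eigenvalues.
import numpy as np

rng = np.random.default_rng(6)
n, d, k = 500, 10, 2
W_true = rng.normal(size=(d, k))
Z = rng.normal(size=(n, k))
X = Z @ W_true.T + rng.normal(0, 0.5, size=(n, d))

Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / n
evals, evecs = np.linalg.eigh(S)               # ascending eigenvalues
evals, evecs = evals[::-1], evecs[:, ::-1]     # reorder to descending

sigma2 = evals[k:].mean()                      # ML estimate of noise variance
W = evecs[:, :k] @ np.diag(np.sqrt(evals[:k] - sigma2))

print("estimated noise variance:", round(sigma2, 3))   # near 0.25
print("principal subspace shape:", W.shape)
```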

Relevance Feedback in the Bayesian Network Retrieval Model: An Approach Based on Term Instantiation [chapter]

Luis M. de Campos, Juan M. Fernández-Luna, Juan F. Huete
2001 Lecture Notes in Computer Science  
In this paper we are going to introduce a relevance feedback method for the Bayesian Network Retrieval Model, founded on propagating partial evidences in the underlying Bayesian network.  ...  of the probabilistic IR model, because they offer important advantages to deal with the intrinsic uncertainty with which IR is pervaded [5, 14].  ...  Also based on these probabilistic tools, the Bayesian Network Retrieval model (BNR) was introduced in [2] as an alternative to the existing methods based on Bayesian networks [7, 10, 14].  ... 
doi:10.1007/3-540-44816-0_2 fatcat:qmzbbisn5zbong7gstv2iyasxe

Automated Learning of Interpretable Models with Quantified Uncertainty [article]

G.F. Bomarito and P.E. Leser and N.C.M. Strauss and K.M. Garbrecht and J.D. Hochhalter
2022 arXiv   pre-print
Model parameter uncertainty is automatically quantified, enabling probabilistic predictions with each equation produced by the GPSR algorithm.  ...  A new Bayesian framework for genetic-programming-based symbolic regression (GPSR) is introduced that uses model evidence (i.e., marginal likelihood) to formulate replacement probability during the selection  ...  probabilistic models with quantified uncertainty.  ... 
arXiv:2205.01626v1 fatcat:ggxhsptf4zc23ayvvvgomladwi

Learning to Evolve Structural Ensembles of Unfolded and Disordered Proteins Using Experimental Solution Data [article]

Oufan Zhang, Mojtaba Haghighatlari, Jie Li, Joao Miguel Correia Teixeira, Ashley Namini, Zi-Hao Liu, Julie D Forman-Kay, Teresa Head-Gordon
2022 arXiv   pre-print
In addition, we couple the GRNN with a Bayesian model, X-EISD, in a reinforcement learning step that biases the probability distributions of torsions to take advantage of experimental data types such as  ...  We show that updating the generative model parameters according to the reward feedback on the basis of the agreement between structures and data improves upon existing approaches that simply reweight static  ...  the unbiased generative models are robust.  ... 
arXiv:2206.12667v3 fatcat:mba45f5pirhlfpnvaaacom5rr4

Practical probabilistic programming with monads

Adam Ścibior, Zoubin Ghahramani, Andrew D. Gordon
2015 Proceedings of the 8th ACM SIGPLAN Symposium on Haskell - Haskell 2015  
The machine learning community has recently shown a lot of interest in practical probabilistic programming systems that target the problem of Bayesian inference.  ...  We show that it is possible to use the monad abstraction for constructing probabilistic models, while still offering good performance of inference in challenging models.  ...  Discussions with Johannes Borgström, Ugo Dal Lago, Marcin Szymczak, and Ohad Kammar were helpful. David Tolpin and Brooks Paige helped with setting up Anglican and Probabilistic C, respectively.  ... 
doi:10.1145/2804302.2804317 dblp:conf/haskell/ScibiorGG15 fatcat:zpbxn22bmfg4xoqionwqyqy7d4
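A toy Python analogue of the monad abstraction the paper builds on: a discrete distribution type with unit (return) and bind, plus conditioning by renormalization, which is enough to write and exactly enumerate small models. The class and method names are invented for illustration; the paper's Haskell library is far more general and supports approximate inference.

```python
# Toy probability monad in Python: distributions as dicts, monadic unit/bind,
# and exact inference by enumeration for small discrete models.
from collections import defaultdict

class Dist:
    def __init__(self, probs):                 # probs: dict value -> probability
        self.probs = probs

    @staticmethod
    def unit(x):                               # monadic return
        return Dist({x: 1.0})

    def bind(self, f):                         # monadic bind: sequence models
        out = defaultdict(float)
        for x, px in self.probs.items():
            for y, py in f(x).probs.items():
                out[y] += px * py
        return Dist(dict(out))

    def condition(self, pred):                 # renormalize on an observation
        kept = {x: p for x, p in self.probs.items() if pred(x)}
        z = sum(kept.values())
        return Dist({x: p / z for x, p in kept.items()})

def bernoulli(p):
    return Dist({True: p, False: 1 - p})

# Two coin flips; observe that at least one came up heads.
model = bernoulli(0.5).bind(
    lambda a: bernoulli(0.5).bind(
        lambda b: Dist.unit((a, b))))
posterior = model.condition(lambda ab: ab[0] or ab[1])
print(posterior.probs)   # each remaining outcome has probability 1/3
```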
Showing results 1 — 15 out of 1,692 results