Noisy-Input Entropy Search for Efficient Robust Bayesian Optimization [article]

Lukas P. Fröhlich, Edgar D. Klenske, Julia Vinogradska, Christian Daniel, Melanie N. Zeilinger
2020 arXiv   pre-print
In this paper, we propose Noisy-Input Entropy Search (NES), a novel information-theoretic acquisition function that is designed to find robust optima for problems with both input and measurement noise.  ...  We consider the problem of robust optimization within the well-established Bayesian optimization (BO) framework.  ...  Noisy-Input Entropy Search: In this section, we elaborate on the main contribution of this paper.  ... 
arXiv:2002.02820v1 fatcat:ix4mwuga2bffdov4e5shvxy33q
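
A minimal sketch of the robust objective this entry targets: the expectation of the surrogate under Gaussian input perturbations, estimated by Monte Carlo. This is not the NES acquisition itself (which is information-theoretic); the function names and the scikit-learn surrogate are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def robust_posterior_mean(gp, x, input_noise_std, n_mc=256, seed=0):
    # Monte Carlo estimate of E_eps[mu(x + eps)] with eps ~ N(0, input_noise_std^2 I):
    # the GP posterior mean averaged over input perturbations (a robust value of x).
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, input_noise_std, size=(n_mc, x.shape[-1]))
    return gp.predict(x + eps).mean()

# Toy usage: fit a GP to noisy 1-D data and query the robust value at x = 0.
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(20, 1))
y = np.sinc(X).ravel() + 0.05 * rng.normal(size=20)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=0.05 ** 2).fit(X, y)
print(robust_posterior_mean(gp, np.array([[0.0]]), input_noise_std=0.3))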

Recent Advances in Bayesian Optimization [article]

Xilu Wang, Yaochu Jin, Sebastian Schmitt, Markus Olhofer
2022 arXiv   pre-print
Bayesian optimization has emerged at the forefront of expensive black-box optimization due to its data efficiency.  ...  Recent years have witnessed a proliferation of studies on the development of new Bayesian optimization algorithms and their applications.  ...  More recently, such a robust Bayesian optimization setting has been studied by Fröhlich et al. [55], where a noisy-input entropy search (NES) based on MES is proposed.  ... 
arXiv:2206.03301v1 fatcat:d4mlbxwdjvad5jbzmsskp44dxq

Joint Entropy Search For Maximally-Informed Bayesian Optimization [article]

Carl Hvarfner and Frank Hutter and Luigi Nardi
2022 arXiv   pre-print
Entropy Search and Predictive Entropy Search both consider the entropy over the optimum in the input space, while the recent Max-value Entropy Search considers the entropy over the optimal value in the  ...  Information-theoretic Bayesian optimization techniques have become popular for optimizing expensive-to-evaluate black-box functions due to their non-myopic qualities.  ...  Joint Entropy Search: We now present Joint Entropy Search (JES), a novel information-theoretic approach for Bayesian optimization.  ... 
arXiv:2206.04771v2 fatcat:y2fx2b3b5vhhdd2x43vdecbyeu
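
JES conditions on joint samples of the optimizer and its value. A minimal sketch of one common way to obtain such samples, assuming a scikit-learn GP surrogate and a finite candidate grid; this is the sampling step only, not the JES acquisition.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def sample_joint_optima(gp, candidates, n_samples=16, random_state=0):
    # Draw posterior sample paths on the candidate set and record, for each path,
    # where it is maximal and how large the maximum is: joint (x*, f*) samples.
    paths = gp.sample_y(candidates, n_samples=n_samples, random_state=random_state)
    idx = paths.argmax(axis=0)
    return candidates[idx], paths.max(axis=0)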

An Entropy Search Portfolio for Bayesian Optimization [article]

Bobak Shahriari and Ziyu Wang and Matthew W. Hoffman and Alexandre Bouchard-Côté and Nando de Freitas
2015 arXiv   pre-print
Bayesian optimization is a sample-efficient method for black-box global optimization.  ...  To address this issue, we introduce the Entropy Search Portfolio (ESP): a novel approach to portfolio construction which is motivated by information theoretic considerations.  ...  Instead, in the next section we will describe Algorithm 1 (Bayesian Optimization with Portfolios). Input: a noisy black-box function, initial data D_0 = {}, a portfolio A = {α_k}; 1: for t = 1, ..., T do  ... 
arXiv:1406.4625v4 fatcat:ipmqddyiu5bo5f25lydpdrhhyi
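
The algorithm text above is the portfolio loop: each base acquisition nominates a candidate and a meta-criterion decides which nomination to evaluate. The sketch below uses a two-member portfolio (UCB and EI) and, as a simplification, scores nominees under a single posterior sample in place of ESP's entropy-based selector; the function names, the UCB coefficient, and the scikit-learn surrogate are illustrative assumptions.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def portfolio_step(gp, grid, y_best, rng):
    # Each acquisition in the portfolio nominates one candidate index on the grid.
    mu, sd = gp.predict(grid, return_std=True)
    sd = np.maximum(sd, 1e-9)
    z = (mu - y_best) / sd
    acquisitions = {
        "ucb": mu + 2.0 * sd,
        "ei": (mu - y_best) * norm.cdf(z) + sd * norm.pdf(z),
    }
    nominees = {name: int(np.argmax(values)) for name, values in acquisitions.items()}
    # Stand-in selector: rank nominees under one posterior sample path
    # (ESP instead ranks them by information gain about the optimum).
    path = gp.sample_y(grid, random_state=int(rng.integers(1 << 31))).ravel()
    chosen = max(nominees, key=lambda name: path[nominees[name]])
    return grid[nominees[chosen]]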

Doubly Bayesian Optimization [article]

Alexander Lavin
2019 arXiv   pre-print
Probabilistic programming systems enable users to encode model structure and naturally reason about uncertainties, which can be leveraged towards improved Bayesian optimization (BO) methods.  ...  Here we present a probabilistic program embedding of BO that is capable of addressing main issues such as problematic domains (noisy, non-smooth, high-dimensional) and the neglected inner-optimization.  ...  Figure 4: Reproducing the unscented Bayesian optimization (UBO) method for noisy optimization.  ... 
arXiv:1812.04562v4 fatcat:jaatj5drsbe5njwpl3fhs7tjeq
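
The Figure 4 caption refers to unscented Bayesian optimization (UBO), which propagates input noise through the acquisition with an unscented transform instead of Monte Carlo. A minimal sketch of that transform with the standard 2d+1 sigma points; the weighting used in the original UBO method may differ.

import numpy as np

def unscented_expectation(f, x, input_cov, kappa=1.0):
    # Approximate E[f(x + eps)], eps ~ N(0, input_cov), with 2d + 1 sigma points.
    d = x.shape[0]
    scale = d + kappa
    sqrt_cov = np.linalg.cholesky(scale * input_cov)
    points = [x] + [x + sqrt_cov[:, i] for i in range(d)] + [x - sqrt_cov[:, i] for i in range(d)]
    weights = [kappa / scale] + [0.5 / scale] * (2 * d)
    return sum(w * f(p) for w, p in zip(weights, points))

# Toy usage: propagate input noise through a simple deterministic objective.
print(unscented_expectation(lambda p: float(np.sin(p).sum()), np.zeros(2), 0.1 * np.eye(2)))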

Computation noise promotes cognitive resilience to adverse conditions during decision-making [article]

Charles Findling, Valentin Wyart
2020 bioRxiv   pre-print
these computations for maximizing rewards.  ...  In contrast to artificial agents with exact computations, noisy agents exhibit hallmarks of Bayesian inference acquired in a 'zero-shot' fashion - without prior experience with conditions that require  ...  Input noise (e.g., sensory noise) is widely seen as a hard constraint that neural networks have to cope with using efficient mechanisms [21]: 'population' coding to average out input noise [22], or 'efficient  ... 
doi:10.1101/2020.06.10.145300 fatcat:abhslgtm35abnnu64qnvrx636m

Regret-Aware Black-Box Optimization with Natural Gradients, Trust-Regions and Entropy Control [article]

Maximilian Hüttenrauch, Gerhard Neumann
2022 arXiv   pre-print
In contrast, stochastic optimizers that are motivated by policy gradients, such as the Model-based Relative Entropy Stochastic Search (MORE) algorithm, directly optimize the expected fitness function without  ...  Lastly, noisy fitness function evaluations may result in solutions that are highly sub-optimal on expectation.  ...  First, it should be data- and time-efficient to allow for quick execution of the algorithm. Additionally, it should be robust towards outliers and samples from very low entropy regimes.  ... 
arXiv:2206.06090v1 fatcat:awvdtyldjnedncp4kviefvl5o4

Efficient Bayesian Optimization using Multiscale Graph Correlation [article]

Takuya Kanazawa
2021 arXiv   pre-print
Bayesian optimization is a powerful tool to optimize a black-box function, the evaluation of which is time-consuming or costly.  ...  search and GP-UCB.  ...  The two major advantages of Bayesian optimization are sample efficiency and robustness under noisy observations.  ... 
arXiv:2103.09434v1 fatcat:6goamdjgwngwhczhp7gqjl7sju

Local Bayesian Optimization of Motor Skills

Riad Akrour, Dmitry Sorokin, Jan Peters, Gerhard Neumann
2017 International Conference on Machine Learning  
Bayesian optimization is renowned for its sample efficiency but its application to higher dimensional tasks is impeded by its focus on global optimization.  ...  To scale to higher dimensional problems, we leverage the sample efficiency of Bayesian optimization in a local context.  ...  When the evaluations are noisy, the minimum evaluation is not a robust performance metric nor an appropriate criterion for the algorithm to select the returned optimizer.  ... 
dblp:conf/icml/AkrourS0N17 fatcat:vwemwcqspbdavncri6iap7n54e
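
The point about noisy evaluations is commonly addressed by recommending the incumbent under the surrogate's posterior mean rather than the best single noisy observation. A minimal sketch of that generic remedy (not necessarily the exact rule used in this paper), assuming a scikit-learn GP and minimization:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def recommend_incumbent(gp, X_evaluated):
    # Return the evaluated point with the best posterior mean instead of the point
    # with the best (noisy) observed value; use argmax for maximization problems.
    return X_evaluated[int(np.argmin(gp.predict(X_evaluated)))]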

Data-Efficient Autotuning With Bayesian Optimization: An Industrial Control Study

Matthias Neumann-Brosig, Alonso Marco, Dieter Schwarzmann, Sebastian Trimpe
2019 IEEE Transactions on Control Systems Technology  
Bayesian optimization is proposed for automatic learning of optimal controller parameters from experimental data.  ...  In order to learn fast, the Bayesian optimization algorithm selects the next parameters to evaluate in a systematic way, for example, by maximizing information gain about the optimum.  ...  ACKNOWLEDGMENT: The authors would like to thank Manus Thiel (IAV GmbH) for his help while setting up the experiments.  ... 
doi:10.1109/tcst.2018.2886159 fatcat:ikn3eio2rzhxpgoax4hzjyjuka

Robust decompositions of quantum states [article]

Jonathan E. Moussa
2020 arXiv   pre-print
important in our present era of noisy intermediate-scale quantum devices for framing near-term progress towards quantum supremacy.  ...  We establish one such equivalence using a noisy quantum circuit model that can be simulated efficiently on classical computers.  ...  functions with noisy inputs [25] and learning parity from an oracle with noisy outputs [26].  ... 
arXiv:2003.04171v1 fatcat:xbmzstvp2rbsfmla6x3ggt4xma

Constrained Bayesian Optimization with Noisy Experiments [article]

Benjamin Letham, Brian Karrer, Guilherme Ottoni, Eytan Bakshy
2018 arXiv   pre-print
Bayesian optimization is a promising technique for efficiently optimizing multiple continuous parameters, but existing approaches degrade in performance when the noise level is high, limiting its applicability  ...  We derive an expression for expected improvement under greedy batch optimization with noisy observations and noisy constraints, and develop a quasi-Monte Carlo approximation that allows it to be efficiently  ...  ., 2009), entropy search (Hennig and Schuler, 2012), and predictive entropy search (PES) (Hernández-Lobato et al., 2014).  ... 
arXiv:1706.07094v2 fatcat:smtet7z7wnbqbngle7pv6qkjrq
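
The quasi-Monte Carlo approximation mentioned above handles the fact that, with noisy observations, the incumbent best is itself uncertain. The sketch below is a plain Monte Carlo version without constraints, assuming a scikit-learn GP surrogate and maximization: improvement is measured against a sampled incumbent rather than the best noisy observation.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def noisy_ei(gp, x_cand, X_obs, n_samples=512, random_state=0):
    # Sample the GP jointly at the observed points and the candidate, then average
    # the improvement of the candidate over the sampled incumbent best.
    joint = np.vstack([X_obs, x_cand[None, :]])
    samples = gp.sample_y(joint, n_samples=n_samples, random_state=random_state)
    f_obs, f_cand = samples[:-1, :], samples[-1, :]
    return float(np.mean(np.maximum(f_cand - f_obs.max(axis=0), 0.0)))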

Max-value Entropy Search for Efficient Bayesian Optimization [article]

Zi Wang, Stefanie Jegelka
2018 arXiv   pre-print
Entropy Search (ES) and Predictive Entropy Search (PES) are popular and empirically successful Bayesian Optimization techniques.  ...  In particular, MES is much more robust to the number of samples used for computing the entropy, and hence more efficient for higher dimensional problems.  ...  Tomás Lozano-Pérez for discussions on active learning and Dr.  ... 
arXiv:1703.01968v3 fatcat:vx6gwj4bh5ecbjamsc6smc7hqi
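
For reference, MES has a closed form per sampled maximum value y*: with gamma = (y* - mu(x)) / sigma(x), the acquisition averages gamma*phi(gamma)/(2*Phi(gamma)) - log Phi(gamma) over the y* samples. A minimal sketch assuming a scikit-learn GP, with y* drawn as maxima of posterior sample paths on a grid (the paper also gives a cheaper Gumbel-based sampler).

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def mes_acquisition(gp, X_query, grid, n_max_samples=16, random_state=0):
    # Samples of the global maximum value y*, taken as maxima of posterior draws on a grid.
    y_star = gp.sample_y(grid, n_samples=n_max_samples, random_state=random_state).max(axis=0)
    mu, sd = gp.predict(X_query, return_std=True)
    sd = np.maximum(sd, 1e-9)
    gamma = (y_star[None, :] - mu[:, None]) / sd[:, None]
    cdf = np.clip(norm.cdf(gamma), 1e-12, 1.0)
    # Average the per-sample entropy-reduction term over the y* samples.
    return np.mean(gamma * norm.pdf(gamma) / (2.0 * cdf) - np.log(cdf), axis=1)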

Incorporating Expert Prior Knowledge into Experimental Design via Posterior Sampling [article]

Cheng Li, Sunil Gupta, Santu Rana, Vu Nguyen, Antonio Robles-Kelly, Svetha Venkatesh
2020 arXiv   pre-print
In this paper, we adopt the technique of Bayesian optimization for experimental design since Bayesian optimization has established itself as an efficient tool for optimizing expensive black-box functions  ...  An efficient Bayesian optimization approach has been proposed via posterior sampling on the posterior distribution of the global optimum.  ...  Bayesian optimization (BO) is well known to be an efficient method for optimizing expensive black-box functions [5] and shows competitive performance in broad applications, such as material search [  ... 
arXiv:2002.11256v1 fatcat:lcp7quchn5g6hb2jb5jva3kwea
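
At its core, the "posterior sampling on the posterior distribution of the global optimum" mentioned above is a Thompson-sampling-style step. A minimal sketch of that generic step, assuming a scikit-learn GP surrogate and a finite candidate set; the paper's weighting by an expert prior over the optimum location is not reproduced here.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def posterior_sampling_step(gp, candidates, random_state=0):
    # Draw one sample path of the surrogate on the candidate set and query the
    # point where that path is maximal (a sample from the optimum's posterior).
    path = gp.sample_y(candidates, n_samples=1, random_state=random_state).ravel()
    return candidates[int(np.argmax(path))]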

Online Label Aggregation: A Variational Bayesian Approach [article]

Chi Hong, Amirmasoud Ghiassi, Yichi Zhou, Robert Birke, Lydia Y. Chen
2020 arXiv   pre-print
In this paper, we propose a novel online label aggregation framework, BiLA, which employs a variational Bayesian inference method and designs a novel stochastic optimization scheme for incremental training  ...  Noisy labeled data is more a norm than a rarity for crowd-sourced content. It is effective to distill noise and infer correct labels through aggregation results from crowd workers.  ...  The core components of BiLA are a variational Bayesian inference model and a stochastic optimizer for incrementally training on an online data subset.  ... 
arXiv:1807.07291v2 fatcat:o3tnrc7vxvbg5eyh4mrgze4n5i
Showing results 1 — 15 out of 5,983 results