Continuous Population-Based Incremental Learning with Mixture Probability Modeling for Dynamic Optimization Problems
[chapter]
2014
Lecture Notes in Computer Science
This paper proposes a multimodal extension of PBILc based on Gaussian mixture models for solving dynamic optimization problems. ...
The results obtained in the experiments demonstrated the efficiency of the approach in solving dynamic problems with a number of competing peaks. ...
This paper proposes a new algorithm based on PBILc [11], Multimodal Continuous Population-Based Incremental Learning (MPBILc), for approaching DOPs. ...
doi:10.1007/978-3-319-10840-7_55
fatcat:c2ypmjgiarft7fjhlgvtkhekbm
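The abstract above only names the idea of extending continuous PBIL with a Gaussian mixture so that several competing peaks can be tracked at once. As rough orientation only, a minimal sketch of what such a mixture-based continuous PBIL loop could look like follows; the single-best-sample mean update, the fixed variance decay, and all parameter values are assumptions for illustration and are not taken from the paper (MPBILc itself is not reproduced here).

    # Minimal illustrative sketch: one Gaussian per tracked peak, each mean
    # nudged toward the best sample drawn from that component (PBILc-style).
    # NOT the paper's MPBILc; component management and schedules are assumed.
    import numpy as np

    def sphere(x):                        # toy objective (minimization)
        return np.sum(x ** 2)

    def mixture_pbil(f, dim=2, n_components=3, samples_per_comp=10,
                     alpha=0.1, generations=100, seed=0):
        rng = np.random.default_rng(seed)
        means = rng.uniform(-5, 5, size=(n_components, dim))
        stds = np.full((n_components, dim), 2.0)
        for _ in range(generations):
            for k in range(n_components):
                pop = rng.normal(means[k], stds[k], size=(samples_per_comp, dim))
                best = pop[np.argmin([f(x) for x in pop])]
                means[k] = (1 - alpha) * means[k] + alpha * best   # mean shift toward best
                stds[k] *= 0.99                                    # slow variance decay
        return means

    print(mixture_pbil(sphere))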
Population-Based Continuous Optimization, Probabilistic Modelling and Mean Shift
2005
Evolutionary Computation
This paper investigates a formal basis for continuous, population-based optimization in terms of a stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and ...
This leads to an update rule that is related to and compared with previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering ...
This paper explores a theoretical foundation for continuous, population-based optimization using probabilistic modelling. ...
doi:10.1162/1063656053583478
pmid:15901425
fatcat:qfrjl2sw7rfqlgddfzk24jq7tq
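The snippet is cut off before naming the second density in the divergence, so only the textbook form of the objective it refers to is restated here; the concrete target density and parameterization used in the paper are not recoverable from this listing and are left unspecified.

    D_{\mathrm{KL}}\big(p_\theta \,\|\, q\big) = \int p_\theta(x)\,\log\frac{p_\theta(x)}{q(x)}\,dx,
    \qquad
    \theta_{t+1} = \theta_t - \eta\,\nabla_\theta\, D_{\mathrm{KL}}\big(p_\theta \,\|\, q\big).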
A review on probabilistic graphical models in evolutionary computation
2012
Journal of Heuristics
Evolutionary computation is one such discipline that has employed probabilistic graphical models to improve the search for optimal solutions in complex problems. ...
Specifically, we give a survey of probabilistic model-building evolutionary algorithms, called estimation of distribution algorithms, and compare different methods for probabilistic modeling in these ...
Univariate: PBIL (Population-Based Incremental Learning), cGA (Compact Genetic Algorithm), UMDA (Univariate Marginal Distribution Algorithm), PBILc (Continuous PBIL), UMDAc (Continuous UMDA). Bivariate: HEDA, MIMIC ...
doi:10.1007/s10732-012-9208-4
fatcat:54ipbzsryfbt5nqmaczgurb2he
An introduction and survey of estimation of distribution algorithms
2011
Swarm and Evolutionary Computation
This explicit use of probabilistic models in optimization offers some significant advantages over other types of metaheuristics. ...
One incremental univariate EDA is the population-based incremental learning (PBIL) (Baluja, 1994) algorithm, which works on binary strings. ...
doi:10.1016/j.swevo.2011.08.003
fatcat:dwuwfqma4zc5pesijpinqrpgdy
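Since the snippet explicitly names Baluja's PBIL on binary strings, a short runnable sketch of that classic update is given here for reference; the OneMax objective, population size, and learning rate are illustrative choices, not values from the survey.

    # Classic binary PBIL (Baluja, 1994): keep a probability vector over bit
    # positions and nudge it toward the best sampled bit string each generation.
    import numpy as np

    def onemax(bits):
        return bits.sum()

    def pbil(f, n_bits=20, pop_size=30, lr=0.1, generations=100, seed=0):
        rng = np.random.default_rng(seed)
        p = np.full(n_bits, 0.5)                       # unbiased starting model
        for _ in range(generations):
            pop = (rng.random((pop_size, n_bits)) < p).astype(int)
            best = pop[np.argmax([f(ind) for ind in pop])]
            p = (1 - lr) * p + lr * best               # incremental model update
        return p

    print(pbil(onemax).round(2))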
Robust graph SLAM in dynamic environments with moving landmarks
2015
2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
We evaluate the performance of existing robust SLAM algorithms as baselines, and validate the improvement of our new framework against datasets of dynamic environments with moving landmarks. ...
Existing Simultaneous Localization and Mapping (SLAM) algorithms face open challenges for navigation in complex dynamic environments due to presumptions of static environments or exceeding computational ...
Incremental Optimization: We introduce the theoretical basis for an incremental implementation of the EM algorithm for the problem. ...
doi:10.1109/iros.2015.7353723
dblp:conf/iros/XiangRNJ15
fatcat:jna7r2b2jrbafogktig26ezhuq
On-line regression algorithms for learning mechanical models of robots: A survey
2011
Robotics and Autonomous Systems
To deal with these difficulties, endowing the controllers of the robots with the capability to learn a model of their kinematics and dynamics under changing circumstances is becoming mandatory. ...
With the emergence of more challenging contexts for robotics, the mechanical design of robots is becoming more and more complex. ...
Gaussian Mixture Models A mixture model is a probabilistic model for representing the presence of sub-populations within an overall population, without requiring that an observed data-set should identify ...
doi:10.1016/j.robot.2011.07.006
fatcat:mi46xi5l7bep5ijfzbdgiv7dqa
Scalable Transfer Evolutionary Optimization: Coping with Big Task Instances
[article]
2020
arXiv
pre-print
In today's digital world, we are confronted with an explosion of data and models produced and manipulated by numerous large-scale IoT/cloud-based applications. ...
We have conducted an extensive series of experiments across a set of practically motivated discrete and continuous optimization examples comprising a large number of source problem instances, of which ...
Basics of Model-Based Transfer: The probabilistic model-based expression of a typical optimization problem T with a maximization function $f(\mathbf{x})$ (we adopt $-f(\mathbf{x})$ if the underlying optimization problem ...
arXiv:2012.01830v1
fatcat:2u4icstr7rd7jiqe77vpxv7rqm
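The expression in the snippet breaks off after introducing the maximization function $f(\mathbf{x})$. A common way such probabilistic reformulations are written in the model-based optimization literature is to maximize the expected fitness under a search distribution, as below; whether this matches the paper's exact formulation is an assumption, since the rest of the sentence is not preserved in this listing.

    \max_{p(\mathbf{x})} \int_{\mathcal{X}} f(\mathbf{x})\, p(\mathbf{x})\, d\mathbf{x},
    \quad \text{with } -f(\mathbf{x}) \text{ substituted when the underlying problem is a minimization.}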
A Dual-Purpose Memory Approach for Dynamic Particle Swarm Optimization of Recurrent Problems
[chapter]
2015
Studies in Computational Intelligence
In this paper, a memory-based Dynamic Particle Swarm Optimization (DPSO) approach which relies on a dual-purpose memory for fast optimization of streams of recurrent problems is proposed. ...
The dual-purpose memory is based on a Gaussian Mixture Model (GMM) of candidate solutions estimated in the optimization space which provides a compact representation of previously-found PSO solutions. ...
A novel memory-based Dynamic PSO (DPSO) technique has been proposed for fast optimization of recurring dynamic problems, where a two-level memory of selected solutions and Gaussian Mixture Models (GMM) ...
doi:10.1007/978-3-319-26450-9_14
fatcat:b2fpqhwfnza3fiestq2egv4uti
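As a loose illustration of the kind of GMM-based memory the abstract describes (this is not the paper's dual-purpose memory; the component count, the synthetic archive, and the warm-start strategy below are assumptions), an archive of good solutions can be compressed into a mixture and later sampled to seed a new swarm when a similar problem recurs:

    # Compress an archive of good solutions into a GMM, then sample it to
    # warm-start particles for a recurring problem. Purely illustrative.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    archive = np.random.default_rng(0).normal(size=(200, 5))  # stand-in for stored solutions
    memory = GaussianMixture(n_components=3, random_state=0).fit(archive)
    seeds, _ = memory.sample(30)                               # 30 warm-start positions
    print(seeds.shape)                                         # (30, 5)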
Fast intelligent watermarking of heterogeneous image streams through mixture modeling of PSO populations
2013
Applied Soft Computing
In this paper, we propose a dynamic particle swarm optimization (DPSO) technique which relies on a memory of Gaussian mixture models (GMMs) of solutions in the optimization space. ...
This memory of GMMs allows an accurate representation of the topology of a stream of optimization problems. ...
A mixture model consists of a linear combination of a limited (finite) number of models, $p(\mathbf{x} \mid \Theta) = \sum_{j=1}^{K} \alpha_j \, p(\mathbf{x} \mid \theta_j)$ (2), where $p(\mathbf{x} \mid \Theta)$ is the probability density function (pdf) of a continuous random vector ...
doi:10.1016/j.asoc.2012.08.040
fatcat:22ozrxp74vd6dpuvpyu33bszam
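As a direct numeric instance of equation (2) as reconstructed above, the mixture density can be evaluated by weighting each component pdf; the Gaussian components and the specific weights, means, and variances below are arbitrary illustrative values, not parameters from the paper.

    # Evaluate p(x | Theta) = sum_j alpha_j * p(x | theta_j) with Gaussian components.
    import numpy as np
    from scipy.stats import norm

    alphas = np.array([0.5, 0.3, 0.2])     # mixing weights, sum to 1
    means  = np.array([-2.0, 0.0, 3.0])
    stds   = np.array([1.0, 0.5, 2.0])

    def mixture_pdf(x):
        return np.sum(alphas * norm.pdf(x, loc=means, scale=stds))

    print(mixture_pdf(0.0))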
Continual Prototype Evolution: Learning Online from Non-Stationary Data Streams
[article]
2021
arXiv
pre-print
As an additional contribution, we generalize the existing paradigms in continual learning to incorporate data incremental learning from data streams by formalizing a two-agent learner-evaluator framework ...
As a first, we introduce a system addressing both problems, where prototypes evolve continually in a shared latent space, enabling learning and prediction at any point in time. ...
Table 1 compares data incremental learning with online learning and the three main continual learning paradigms. ...
arXiv:2009.00919v4
fatcat:xcdrovmq7rgilf3hlin7j5tnqu
Model-Based Evolutionary Algorithms
2015
Proceedings of the Companion Publication of the 2015 on Genetic and Evolutionary Computation Conference - GECCO Companion '15
Acknowledgements ◮ Selected images were re-used from the 2012 GECCO tutorial "Probabilistic Model-building Genetic Algorithms" by Martin Pelikan. ...
Dynamically learned tree model superior to mirror structured models and to static tree model. Question: is there an optimal, predetermined linkage model that outperforms the learned (tree) model? ...
with neighboring variable relations ◮ Model is a matrix with probabilities of edges ◮ Matrix needs to be adjusted while sampling ◮ For problems with neighboring relations works better than random keys ...
doi:10.1145/2739482.2756584
dblp:conf/gecco/ThierensB15
fatcat:m3vngle7ong4ngopximusovgoa
Model-based evolutionary algorithms
2014
Proceedings of the 2014 conference companion on Genetic and evolutionary computation companion - GECCO Comp '14
Acknowledgements ◮ Selected images were re-used from the 2012 GECCO tutorial "Probabilistic Model-building Genetic Algorithms" by Martin Pelikan. ...
Dynamically learned tree model superior to mirror structured models and to static tree model. Question: is there an optimal, predetermined linkage model that outperforms the learned (tree) model? ...
with neighboring variable relations ◮ Model is a matrix with probabilities of edges ◮ Matrix needs to be adjusted while sampling ◮ For problems with neighboring relations works better than random keys ...
doi:10.1145/2598394.2605344
dblp:conf/gecco/ThierensB14
fatcat:ocjcmqpwfbb5ln6hadekywb2hy
Model-based evolutionary algorithms
2013
Proceeding of the fifteenth annual conference companion on Genetic and evolutionary computation conference companion - GECCO '13 Companion
Acknowledgements ◮ Selected images were re-used from the 2012 GECCO tutorial "Probabilistic Model-building Genetic Algorithms" by Martin Pelikan. ...
Dynamically learned tree model superior to mirror structured models and to static tree model. Question: is there an optimal, predetermined linkage model that outperforms the learned (tree) model? ...
with neighboring variable relations ◮ Model is a matrix with probabilities of edges ◮ Matrix needs to be adjusted while sampling ◮ For problems with neighboring relations works better than random keys ...
doi:10.1145/2464576.2480801
dblp:conf/gecco/ThierensB13a
fatcat:5wn2n4yjo5fgvexfq5plx4kb5y
Model-Based Evolutionary Algorithms
2016
Proceedings of the 2016 on Genetic and Evolutionary Computation Conference Companion - GECCO '16 Companion
Acknowledgements ◮ Selected images were re-used from the 2012 GECCO tutorial "Probabilistic Model-building Genetic Algorithms" by Martin Pelikan. ...
Dynamically learned tree model superior to mirror structured models and to static tree model. Question: is there an optimal, predetermined linkage model that outperforms the learned (tree) model? ...
with neighboring variable relations ◮ Model is a matrix with probabilities of edges ◮ Matrix needs to be adjusted while sampling ◮ For problems with neighboring relations works better than random keys ...
doi:10.1145/2908961.2926975
dblp:conf/gecco/ThierensB16
fatcat:oiriel3cunbnhkqsy452yfypjq
Model-based evolutionary algorithms
2019
Proceedings of the Genetic and Evolutionary Computation Conference Companion on - GECCO '19
Acknowledgements ◮ Selected images were re-used from the 2012 GECCO tutorial "Probabilistic Model-building Genetic Algorithms" by Martin Pelikan. ...
with neighboring variable relations ◮ Model is a matrix with probabilities of edges ◮ Matrix needs to be adjusted while sampling ◮ For problems with neighboring relations works better than random keys ...
◮ Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) ◮ For each solution in the population, all subsets of the FOS are tried with a donor solution randomly picked from the population ...
doi:10.1145/3319619.3323386
dblp:conf/gecco/ThierensB19
fatcat:w6prjwu7wzfe3citnfna2sdcbe
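The bullet fragment above outlines the gene-pool optimal mixing step: every FOS subset of each solution is overwritten with values from a randomly picked donor, and the change is kept only if fitness does not deteriorate. A compact sketch of that step follows; the univariate FOS, the OneMax objective, and the exact acceptance rule as written are simplifications for illustration, not a faithful reproduction of GOMEA as presented in the tutorial.

    # One pass of gene-pool optimal mixing over a population of bit strings.
    import random

    def onemax(bits):
        return sum(bits)

    def gomea_generation(population, fos, f):
        for sol in population:
            current_fitness = f(sol)
            for subset in fos:
                donor = random.choice(population)      # donor picked at random
                backup = [sol[i] for i in subset]
                for i in subset:
                    sol[i] = donor[i]                  # try the donor's values for this subset
                new_fitness = f(sol)
                if new_fitness >= current_fitness:     # keep if not worse (maximization)
                    current_fitness = new_fitness
                else:                                  # otherwise revert
                    for i, v in zip(subset, backup):
                        sol[i] = v
        return population

    random.seed(0)
    pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
    univariate_fos = [[i] for i in range(10)]          # simplest possible FOS
    gomea_generation(pop, univariate_fos, onemax)
    print(max(onemax(s) for s in pop))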
Showing results 1 — 15 out of 15,119 results