plain stochastic search. ... Trying to optimize those ensembles based on the nearest-neighbours and random-subspaces paradigms, we found that the use of a diversity metric called "ambiguity" had no better positive impact than ... The optimization is done using a multi-objective genetic algorithm and the NIST SD19 database. The data consists of handwritten numerals forming a 10-class pattern recognition problem. ...doi:10.1109/icpr.2004.1334060 dblp:conf/icpr/TremblaySM04 fatcat:qds3drlfsrcyjbybmqtb2jgizq
While optimal object recognition using such a model is NP-complete, they provide mathematical evidence that part-based inference can achieve nearly optimal recognition within a feasible number of operations ... For recognition they use Markov chain Monte Carlo sampling. • Kokkinos and Yuille, in "Inference and Learning with Hierarchical Compositional Models", introduce a hierarchical representation for object ...doi:10.1007/s11263-011-0420-8 fatcat:2puoqwob25dlbky7lqi275u5p4
Automation and Remote Control
models of learning in such problems as the control of random search, stochastic programming, control and identification of objects with incomplete information, recognition of statistical hypotheses, etc. ... They obtained the existence and uniqueness conditions for the optimal solution. They considered the problem of using stochastic optimization methods for the numerical solution of the synthesis problem. ...
We propose a new approach in which we use a recognition network to cheaply approximate the optimal control variate for each mini-batch, with no additional model gradient computations. ... Control variates can be used to reduce the variance, but past approaches do not take into account how mini-batch stochasticity affects sampling stochasticity, resulting in sub-optimal variance reduction ... In practice, such objectives are treated using Monte Carlo (MC) sampling to obtain an unbiased stochastic estimate (* work done while at Prowler.io). ...arXiv:2003.04125v1 fatcat:wqqkktmx7rapvilro4socxpywu
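The control-variate trick this result builds on can be sketched in a few lines (a generic illustration, not the paper's recognition-network construction; the target `f`, the control `g`, and the unit coefficient are assumptions chosen so the variance reduction is visible):

```python
import random

def estimator_values(f, g, g_mean, samples):
    """Per-sample values of the plain MC estimator f(x) and of the
    control-variate estimator f(x) - (g(x) - E[g]), which stays
    unbiased because the subtracted term has mean zero."""
    plain = [f(x) for x in samples]
    cv = [f(x) - (g(x) - g_mean) for x in samples]
    return plain, cv

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(10000)]
f = lambda x: x + 0.1 * x * x        # target; true mean is 0.1
g = lambda x: x                      # control variate with known mean 0
plain_vals, cv_vals = estimator_values(f, g, 0.0, xs)
```

Because `g` absorbs the dominant linear term of `f`, the per-sample spread of the control-variate estimator is far smaller while its mean is unchanged.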
We obtained slight improvements by using a stochastic second-order algorithm. ... Log-linear models find a wide range of applications in pattern recognition. The training of log-linear models is a convex optimization problem. ... Stochastic algorithms are widely used for training hierarchical models; for convex optimization, typically batch algorithms are employed. ...doi:10.1109/icassp.2013.6639010 dblp:conf/icassp/WieslerRSN13 fatcat:yuv24pgmkbhwbhexbxmtjzv6ei
Lecture Notes in Computer Science
The N-tuple method is a statistical pattern recognition method that decomposes a given pattern into several sets of n points, termed "N-tuples". ... In the case of stochastic optimization (Table 2), the average overall recognition was 83.67%, which was 2.74% superior to the random case. ... Tuple Search Algorithm: our objective was to find an optimal set of input connections for the N-tuple classifier that gives a higher recognition rate than the random case. ...doi:10.1007/978-3-540-30125-7_69 fatcat:ebnli4wamnbehk5v25hfwrfmtu
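As a concrete illustration of the N-tuple scheme described above, here is a toy RAM-style classifier over 16-bit patterns (the two classes, the tuple count, and the tuple size are all made-up values; real use would optimize the input connections, as the paper does):

```python
import random

def make_tuples(n_bits, n_tuples, tuple_size, rng):
    """Random input connections: each tuple is a list of bit positions."""
    return [rng.sample(range(n_bits), tuple_size) for _ in range(n_tuples)]

def address(pattern, positions):
    """Read the selected bits as an integer address into a tuple's table."""
    a = 0
    for p in positions:
        a = (a << 1) | pattern[p]
    return a

def train(patterns_by_class, tuples):
    """Store, per class and per tuple, every address seen in training."""
    tables = {c: [set() for _ in tuples] for c in patterns_by_class}
    for c, patterns in patterns_by_class.items():
        for pat in patterns:
            for i, pos in enumerate(tuples):
                tables[c][i].add(address(pat, pos))
    return tables

def classify(pattern, tuples, tables):
    """Score each class by how many tuple addresses match its tables."""
    scores = {c: sum(address(pattern, pos) in tables[c][i]
                     for i, pos in enumerate(tuples))
              for c in tables}
    return max(scores, key=scores.get)

rng = random.Random(1)
tuples = make_tuples(16, 8, 3, rng)
# Two toy classes: "A" patterns have 1s in the low half, "B" in the high half.
A = [[1] * 8 + [0] * 8, [1] * 7 + [0] * 9]
B = [[0] * 8 + [1] * 8, [0] * 9 + [1] * 7]
tables = train({"A": A, "B": B}, tuples)
pred = classify([1] * 8 + [0] * 8, tuples, tables)
```

A training pattern of class "A" matches all eight of A's tuple tables and, with these near-complementary classes, none of B's.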
We present an online trajectory optimization approach that optimizes a trajectory such that object recognition performance is improved. ... Inspired by prior work, we formulate the optimization as a derivative-free stochastic optimization, allowing us to express the cost function in an arbitrary way. ... Instead, we follow the proposed method of  and optimize Eq. 1 using a derivative-free stochastic optimization method. ...doi:10.1109/iros.2016.7759700 dblp:conf/iros/PotthastS16 fatcat:ldq5jmgjg5e7neadkbupoghr44
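The derivative-free stochastic optimization the authors mention can be as simple as a (1+1) evolution strategy; the sketch below minimizes a made-up cost and stands in for their trajectory objective (Eq. 1), which we do not reproduce:

```python
import random

def derivative_free_minimize(cost, x0, sigma=0.5, iters=2000, seed=0):
    """(1+1) evolution strategy: perturb the current point with Gaussian
    noise and keep the candidate if it is better. Only cost evaluations
    are needed, so the cost can be an arbitrary black box."""
    rng = random.Random(seed)
    x, fx = list(x0), cost(x0)
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = cost(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

# Toy cost: L1 distance to (1, -2); non-differentiable on purpose.
cost = lambda v: abs(v[0] - 1.0) + abs(v[1] + 2.0)
best, best_cost = derivative_free_minimize(cost, [0.0, 0.0])
```

Since only improvements are accepted, the cost is monotonically non-increasing, which makes this style of optimizer robust to noisy or non-smooth objectives.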
This document presents an extension of GRD to the stochastic domain: the Stochastic Goal Recognition Design (S-GRD). ... In this abstract we present the progress made toward the final objective, as well as a projected timeline to completion. ... With this objective in mind we have proposed the Stochastic GRD (S-GRD) problem, where the outcomes of the agent's actions are stochastic (Wayllace et al. 2016). ...doi:10.1609/aaai.v33i01.33019904 fatcat:2ylhnocnp5d4tojxl4tx4sk2kq
Automation and Remote Control
Fitsner, in "Control of objects having incomplete information by means of automatic search," considered the properties of systems for controlling static objects through the use of an optimizer. ... For objects whose control is found by solving a linear programming problem, it is proposed to use an effective solution method that exploits a special feature of the ...
This paper proposes a stochastic gradient algorithm for finding optimal linear representations of images, for use in appearance-based object recognition. ... For solving this optimization problem on a Grassmann manifold, a stochastic gradient algorithm utilizing intrinsic flows is introduced. ... In the context of object recognition using linear representations, we term our technique optimal component analysis (OCA). ...doi:10.1109/tpami.2004.1273986 pmid:15460288 fatcat:qicdjwntt5hpjaowhdvxsx7lxy
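A minimal flavor of gradient search over subspaces with a retraction, on the simplest Grassmann manifold (one-dimensional subspaces of R^3). This Oja-style update is a stand-in for illustration only, not the paper's intrinsic-flow algorithm or its recognition-performance objective:

```python
import math
import random

def normalize(u):
    n = math.sqrt(sum(x * x for x in u))
    return [x / n for x in u]

def top_direction(samples, steps=2000, lr=0.01, seed=0):
    """Stochastic gradient ascent of the projected variance <u, x>^2 over
    unit vectors u. Renormalizing after each step is the retraction that
    keeps the iterate on the manifold."""
    rng = random.Random(seed)
    u = normalize([rng.gauss(0, 1) for _ in range(len(samples[0]))])
    for _ in range(steps):
        x = samples[rng.randrange(len(samples))]
        p = sum(ui * xi for ui, xi in zip(u, x))      # projection <u, x>
        u = normalize([ui + lr * 2 * p * xi for ui, xi in zip(u, x)])
    return u

rng = random.Random(1)
# Synthetic data with dominant variance along the first axis.
data = [[rng.gauss(0, 3), rng.gauss(0, 1), rng.gauss(0, 1)]
        for _ in range(200)]
u = top_direction(data)
```

The iterate stays exactly unit-norm throughout, and the learned direction aligns (up to sign) with the highest-variance axis.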
technique to achieve a tighter lower bound using multiple noise samples per training example in a stochastic gradient descent iteration. ... This paper addresses the above issues by 1) showing that the conventional training methods with regularization by noise injection optimize a lower bound of the true objective and 2) proposing a ... Object Recognition: the proposed algorithm is integrated into a wide residual network, which uses dropout in every residual block, and evaluated on the CIFAR datasets. ...arXiv:1710.05179v2 fatcat:k3d7dlszlrftjlmvrh73rpfdvu
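The "tighter lower bound from multiple noise samples" can be seen numerically with a toy lognormal objective (an assumption for illustration; the paper applies the idea to noise-injected network training). By Jensen's inequality, E[log(mean of k samples of f)] ≤ log E[f], and the bound tightens as k grows:

```python
import math
import random

def noisy_lower_bound(k, rng):
    """log of the average of k noise samples of f = exp(Z), Z ~ N(0, 1)."""
    return math.log(sum(math.exp(rng.gauss(0.0, 1.0))
                        for _ in range(k)) / k)

def avg_bound(k, trials, seed):
    """Average the k-sample bound over many trials to estimate its mean."""
    rng = random.Random(seed)
    return sum(noisy_lower_bound(k, rng) for _ in range(trials)) / trials

# True objective: log E[exp(Z)] = 0.5 for Z ~ N(0, 1).
b1 = avg_bound(1, 4000, seed=1)   # one noise sample per example
b8 = avg_bound(8, 4000, seed=2)   # eight samples -> tighter bound
```

The single-sample bound averages near E[Z] = 0, while the eight-sample bound sits markedly closer to the true value 0.5 without ever exceeding it in expectation.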
Deep neural networks are typically optimized with stochastic gradient descent (SGD). In this work, we propose a novel second-order stochastic optimization algorithm. ... Combining our proposed optimization algorithm with this model structure, model size can be reduced by a factor of eight while improvements in recognition error rate are still obtained. ... DNN training is a very difficult and highly non-convex optimization problem. The most widely used algorithm for DNN training is stochastic gradient descent (SGD). ...doi:10.1109/icassp.2014.6853582 dblp:conf/icassp/WieslerRSN14 fatcat:memakhv6bbarjdmouxjazagodu
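Why a second-order step can beat SGD is easiest to see on a badly scaled quadratic; this deterministic diagonal-Newton toy (made-up scales and step sizes, not the paper's algorithm) shows curvature scaling equalizing progress across directions:

```python
def quad_grad(x, scales):
    """Gradient of f(x) = 0.5 * sum(s_i * x_i^2), a badly scaled quadratic."""
    return [s * xi for s, xi in zip(scales, x)]

def sgd(x, scales, lr, steps):
    """Plain gradient descent: one learning rate for all directions,
    capped by the stiffest curvature."""
    for _ in range(steps):
        x = [xi - lr * gi for xi, gi in zip(x, quad_grad(x, scales))]
    return x

def diagonal_newton(x, scales, steps, damping=1e-3):
    """Second-order step: divide each gradient entry by its diagonal
    curvature, so flat and stiff directions progress equally fast."""
    for _ in range(steps):
        g = quad_grad(x, scales)
        x = [xi - gi / (s + damping) for xi, gi, s in zip(x, g, scales)]
    return x

scales = [100.0, 1.0]                          # condition number 100
sgd_x = sgd([1.0, 1.0], scales, lr=0.005, steps=50)
newton_x = diagonal_newton([1.0, 1.0], scales, steps=50)
```

With the learning rate limited by the stiff direction, SGD barely moves along the flat one, while the curvature-scaled step drives both coordinates to the optimum.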
Automation and Remote Control
concerning the description of recognition systems were considered using the apparatus of Boolean algebra. In the reports by B. Ya. Kovalerchuk (Novosibirsk), L. A. Rastrigin and R. Kh. Erenshtein, and ... Garusin, and G. D. Chervyshevaya (Voronezh, Moscow), interesting ideas were presented on the organization of stochastic-optimization algorithms. In a report by V. F. Korop and R. A. Butman (Khar'kov) ...
Then the self-regulating particle swarm optimization algorithm is used to optimize the objective function, and the optimized values a and b are substituted into the system of variable-scale stochastic ... The algorithm first sets up the objective function according to the stochastic resonance structural parameters a and b. ... With the rapid development of science and technology, PSO has been widely used in image processing, pattern recognition, and optimization. ...doi:10.12783/dtcse/cmee2017/20048 fatcat:rkn25aq5vbhhpfab7zld3iywym
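A bare-bones PSO loop in the spirit of this result; the two-parameter quadratic standing in for the stochastic-resonance objective in (a, b), and all swarm constants, are assumptions:

```python
import random

def pso_minimize(cost, bounds, n_particles=20, iters=100, seed=0,
                 w=0.7, c1=1.4, c2=1.4):
    """Minimal particle swarm: each particle is pulled toward its own
    best position and toward the swarm's global best."""
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds]
          for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pcost = [cost(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = list(pbest[g]), pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            c = cost(xs[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = list(xs[i]), c
                if c < gcost:
                    gbest, gcost = list(xs[i]), c
    return gbest, gcost

# Hypothetical two-parameter objective standing in for the paper's (a, b).
cost = lambda p: (p[0] - 0.7) ** 2 + (p[1] - 1.3) ** 2
(best_a, best_b), best_cost = pso_minimize(cost, [(0.0, 2.0), (0.0, 2.0)])
```

The inertia weight w and the cognitive/social constants c1, c2 are set to common convergent values; the optimized pair would then be substituted back into the downstream system, as the abstract describes.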
To cope optimally with a continuous speech recognizer, we propose a stochastic lexicon model that effectively represents variations in pronunciation. ... Also, the proposed approach can be applied to systems employing non-linguistic recognition units. ... This is usually done either by applying phonological rules to a given lexicon or by using a data-driven method from speech data with an objective optimization criterion, like ... The aim of the proposed lexicon ...doi:10.1109/97.739004 fatcat:2jm5jcc4yrfqrb5nuslsvhwfka
Showing results 1 — 15 out of 79,589 results