
Optimizing nearest neighbour in random subspaces using a multi-objective genetic algorithm

G. Tremblay, R. Sabourin, P. Maupin
2004 Proceedings of the 17th International Conference on Pattern Recognition, 2004. ICPR 2004.  
plain stochastic search.  ...  Trying to optimize those ensembles based on the nearest neighbours and the random subspaces paradigms, we found that the use of a diversity metric called "ambiguity" had no better positive impact than  ...  The optimization is done using a multi-objective genetic algorithm and the NIST SD19 database [1] . The data consists of handwritten numerals forming a 10-class pattern recognition problem.  ... 
doi:10.1109/icpr.2004.1334060 dblp:conf/icpr/TremblaySM04 fatcat:qds3drlfsrcyjbybmqtb2jgizq
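The Pareto-dominance test at the heart of any multi-objective genetic algorithm such as this one can be sketched in a few lines (a generic illustration, not the authors' specific GA; the function name is ours):

```python
def dominates(a, b):
    """Pareto dominance for minimization: a dominates b if it is no worse
    on every objective and strictly better on at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))


# Example: (error_rate, ensemble_size) pairs to be minimized jointly.
assert dominates((0.05, 10), (0.07, 10))      # better error, same size
assert not dominates((0.05, 10), (0.05, 10))  # equal vectors don't dominate
```

The GA keeps the set of mutually non-dominated ensembles (the Pareto front) rather than a single scalar-fitness winner.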


Sinisa Todorovic, Rama Chellappa
2011 International Journal of Computer Vision  
While optimal object recognition using such a model is NP-complete, they provide mathematical evidence that part-based inference can achieve nearly optimal recognition within a feasible number of operations  ...  For recognition they use Markov chain Monte Carlo sampling. • Kokkinos and Yuille in "Inference and Learning with Hierarchical Compositional Models" introduce a hierarchical representation for object  ... 
doi:10.1007/s11263-011-0420-8 fatcat:2puoqwob25dlbky7lqi275u5p4

Page 682 of Automation and Remote Control Vol. 36, Issue 4 [page]

1975 Automation and Remote Control  
models of learning in such problems as the control of random search, stochastic programming, control and identification of objects with incomplete information, recognition of statistical hypotheses, etc.  ...  They obtained the existence and uniqueness conditions for the optimal solution. They considered the problem of using stochastic optimization methods for the numerical solution of the synthesis problem.  ... 

Amortized variance reduction for doubly stochastic objectives [article]

Ayman Boustati, Sattar Vakili, James Hensman, ST John
2020 arXiv   pre-print
We propose a new approach in which we use a recognition network to cheaply approximate the optimal control variate for each mini-batch, with no additional model gradient computations.  ...  Control variates can be used to reduce the variance, but past approaches do not take into account how mini-batch stochasticity affects sampling stochasticity, resulting in sub-optimal variance reduction  ...  In practice, such objectives are treated using Monte Carlo (MC) sampling to obtain an unbiased stochastic estimate * Work done while at  ... 
arXiv:2003.04125v1 fatcat:wqqkktmx7rapvilro4socxpywu
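The control-variate idea the abstract builds on can be illustrated with a generic Monte Carlo sketch (not the paper's recognition-network estimator; `f` and `g` here are toy stand-ins chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate E[f(x)] for x ~ N(0, 1), where f is the (expensive) integrand
# and g is a cheap control variate with known expectation E[g] = 0.
f = lambda x: np.exp(x) * np.sin(x)   # hypothetical objective term
g = lambda x: x                        # control variate, E[g] = 0

x = rng.normal(size=100_000)
fx, gx = f(x), g(x)

# Optimal scalar control-variate coefficient c* = Cov(f, g) / Var(g).
c = np.cov(fx, gx)[0, 1] / np.var(gx)

plain = fx                    # plain Monte Carlo samples
controlled = fx - c * gx      # same mean, lower variance
```

Subtracting `c * (g(x) - E[g])` leaves the estimator unbiased while cancelling the part of `f`'s fluctuation that correlates with `g`; the paper's contribution is to predict a good `c` per mini-batch with a recognition network instead of estimating it from extra gradient computations.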

A critical evaluation of stochastic algorithms for convex optimization

Simon Wiesler, Alexander Richard, Ralf Schlüter, Hermann Ney
2013 2013 IEEE International Conference on Acoustics, Speech and Signal Processing  
We obtained slight improvements by using a stochastic second-order algorithm.  ...  Log-linear models find a wide range of applications in pattern recognition. The training of log-linear models is a convex optimization problem.  ...  Stochastic algorithms are widely used for training hierarchical models. For convex optimization, typically batch algorithms are employed.  ... 
doi:10.1109/icassp.2013.6639010 dblp:conf/icassp/WieslerRSN13 fatcat:yuv24pgmkbhwbhexbxmtjzv6ei

A Stochastic Search Algorithm to Optimize an N-tuple Classifier by Selecting Its Inputs [chapter]

Hannan Bin Azhar, Keith Dimond
2004 Lecture Notes in Computer Science  
The N-tuple method [4] is a statistical pattern recognition method, which decomposes a given pattern into several sets of n points, termed "N tuples".  ...  In the case of stochastic optimization (Table 2), the average overall recognition was 83.67%, which was 2.74% superior to the random case.  ...  Tuple Search Algorithm Our objective was to find an optimal set of input connections for the N-tuple classifier that gives a higher recognition rate than the random case.  ... 
doi:10.1007/978-3-540-30125-7_69 fatcat:ebnli4wamnbehk5v25hfwrfmtu
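A minimal stochastic search over N-tuple input connections might look like the following sketch (the `score` function is a placeholder for the recognition rate the authors measure on validation data, and `stochastic_search` is our name, not theirs):

```python
import random

def score(tuples):
    # Stand-in for classifier recognition rate; here we simply reward
    # spread-out pixel indices so the search has something to optimize.
    return len({i for t in tuples for i in t})

def stochastic_search(num_pixels=64, num_tuples=8, n=4, iters=500, seed=0):
    """Mutate one input connection at a time; keep non-worsening moves."""
    rng = random.Random(seed)
    tuples = [tuple(rng.sample(range(num_pixels), n))
              for _ in range(num_tuples)]
    best = score(tuples)
    for _ in range(iters):
        k = rng.randrange(num_tuples)          # pick a tuple to mutate
        j = rng.randrange(n)                   # pick one of its inputs
        cand = list(tuples[k])
        cand[j] = rng.randrange(num_pixels)    # rewire to a random pixel
        trial = tuples[:k] + [tuple(cand)] + tuples[k + 1:]
        s = score(trial)
        if s >= best:                          # accept ties to escape plateaus
            tuples, best = trial, s
    return tuples, best
```

With a real recognition-rate `score`, this single-mutation hill climb is the kind of stochastic optimization the reported 2.74% gain over random connections refers to.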

Online trajectory optimization to improve object recognition

Christian Potthast, Gaurav S. Sukhatme
2016 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)  
We present an online trajectory optimization approach that optimizes a trajectory such that object recognition performance is improved.  ...  Inspired by prior work, we formulate the optimization as a derivative-free stochastic optimization, allowing us to express the cost function in an arbitrary way.  ...  Instead, we follow the proposed method of [3] and optimize Eq. 1 using a derivative-free stochastic optimization method.  ... 
doi:10.1109/iros.2016.7759700 dblp:conf/iros/PotthastS16 fatcat:ldq5jmgjg5e7neadkbupoghr44

Stochastic Goal Recognition Design

Christabel Wayllace
This document presents an extension of GRD to the stochastic domain: the Stochastic Goal Recognition Design (S-GRD).  ...  In this abstract we present the progress made towards the final objective as well as a timeline of projected conclusion.  ...  With this objective in mind we have proposed the Stochastic GRD (S-GRD) problem, where the outcomes of the agent's actions are stochastic (Wayllace et al. 2016) .  ... 
doi:10.1609/aaai.v33i01.33019904 fatcat:2ylhnocnp5d4tojxl4tx4sk2kq

Page 334 of Automation and Remote Control Vol. 33, Issue 2 [page]

1972 Automation and Remote Control  
Fitsner, in "Control of objects having incomplete information by means of automatic search," considered the properties of systems for controlling static objects through the use of an optimizer.  ...  For objects whose control is found by solving a linear programming problem, it is proposed to use an effective solution method that exploits a special feature of the  ... 

Optimal linear representations of images for object recognition

Xiuwen Liu, A. Srivastava, K. Gallivan
2004 IEEE Transactions on Pattern Analysis and Machine Intelligence  
This paper proposes a stochastic gradient algorithm for finding optimal linear representations of images, for use in appearance-based object recognition.  ...  For solving this optimization problem on a Grassmann manifold, a stochastic gradient algorithm utilizing intrinsic flows is introduced.  ...  In the context of object recognition using linear representations, we term our technique optimal component analysis (OCA).  ... 
doi:10.1109/tpami.2004.1273986 pmid:15460288 fatcat:qicdjwntt5hpjaowhdvxsx7lxy
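A stochastic gradient step on the Grassmann manifold, with orthonormality restored after each update, can be sketched as follows (a toy variance-maximizing objective stands in for the paper's recognition-based performance function, and a QR retraction stands in for its intrinsic flows; this is not the OCA algorithm itself):

```python
import numpy as np

def stochastic_subspace_ascent(X, d, steps=200, lr=0.01, seed=0):
    """Stochastic gradient ascent over n-by-d orthonormal bases U,
    maximizing the captured energy ||U^T x||^2 of random samples x."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    U, _ = np.linalg.qr(rng.normal(size=(n, d)))   # random start point
    for _ in range(steps):
        x = X[rng.integers(len(X))]                # one random sample
        grad = 2.0 * np.outer(x, x @ U)            # d/dU of ||U^T x||^2
        U, _ = np.linalg.qr(U + lr * grad)         # step + retraction
    return U
```

The QR step maps the Euclidean update back onto the manifold of orthonormal bases, so the iterates always represent a valid d-dimensional subspace.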

Regularizing Deep Neural Networks by Noise: Its Interpretation and Optimization [article]

Hyeonwoo Noh, Tackgeun You, Jonghwan Mun, Bohyung Han
2017 arXiv   pre-print
technique to achieve a tighter lower bound using multiple noise samples per training example in a stochastic gradient descent iteration.  ...  This paper addresses the above issues by 1) interpreting that the conventional training methods with regularization by noise injection optimize the lower bound of the true objective and 2) proposing a  ...  Object Recognition The proposed algorithm is integrated into wide residual network [40] , which uses dropout in every residual block, and evaluated on CIFAR datasets [19] .  ... 
arXiv:1710.05179v2 fatcat:k3d7dlszlrftjlmvrh73rpfdvu
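The multiple-noise-sample idea can be illustrated directly: by Jensen's inequality (concavity of log), averaging K sample likelihoods inside the logarithm is at least as large as averaging their logarithms, which is why the K-sample objective is the tighter lower bound (a generic sketch; the function names are ours):

```python
import math

def single_sample_bound(ps):
    """Average of log-likelihoods over K noise samples (looser bound)."""
    return sum(math.log(p) for p in ps) / len(ps)

def multi_sample_bound(ps):
    """Log of the averaged likelihoods, log((1/K) * sum_k p_k) (tighter)."""
    return math.log(sum(ps) / len(ps))
```

For any set of per-sample likelihoods `ps`, `multi_sample_bound(ps) >= single_sample_bound(ps)`, with equality only when all samples agree.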

Mean-normalized stochastic gradient for large-scale deep learning

Simon Wiesler, Alexander Richard, Ralf Schlüter, Hermann Ney
2014 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
Deep neural networks are typically optimized with stochastic gradient descent (SGD). In this work, we propose a novel second-order stochastic optimization algorithm.  ...  Combining our proposed optimization algorithm with this model structure, model size can be reduced by a factor of eight and still improvements in recognition error rate are obtained.  ...  DNN training is a very difficult and highly non-convex optimization problem. The most widely used optimization algorithm for DNN training is stochastic gradient descent (SGD).  ... 
doi:10.1109/icassp.2014.6853582 dblp:conf/icassp/WieslerRSN14 fatcat:memakhv6bbarjdmouxjazagodu
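Plain SGD, the baseline this work improves on, reduces to a one-example-at-a-time gradient update (a least-squares toy sketch, not the paper's mean-normalized variant):

```python
import numpy as np

def sgd(X, y, lr=0.1, epochs=50, seed=0):
    """Minimize the least-squares loss sum_i (x_i.w - y_i)^2 by visiting
    one randomly ordered example per update."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            err = X[i] @ w - y[i]        # residual on one example
            w -= lr * err * X[i]         # stochastic gradient step
    return w
```

The paper's point is that the convergence of this kind of update is sensitive to feature statistics; normalizing the means of the inputs to each layer is what makes the proposed second-order variant effective at scale.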

Page 526 of Automation and Remote Control Vol. 36, Issue 3 [page]

1975 Automation and Remote Control  
concerning the description of recognition systems were considered using the apparatus of Boolean algebra. The reports by B. Ya. Kovalerchuk (Novosibirsk), L. A. Rastrigin and R. Kh. Erenshtein, and  ...  Garusin, and G. D. Chervyshevaya (Voronezh, Moscow), interesting ideas were presented on the organization of stochastic-optimization algorithms. In a report by V. F. Korop and R. A. Butman (Khar'kov)  ... 

Application of Human Cognitive Self Regulating Particle Swarm Optimization and Stochastic Resonance in Bearing Fault Diagnosis

Chao ZHANG, Yuan-yuan HE, Jian-guo WANG, Teng-fei ZHU
2018 DEStech Transactions on Computer Science and Engineering  
Then the self-regulating particle swarm optimization algorithm is used to optimize the objective function, and the optimal values a and b obtained are substituted into the system of variable-scale stochastic  ...  The algorithm first sets up the objective function according to the stochastic resonance structural parameters a and b.  ...  With the rapid development of science and technology, PSO has been widely used in image processing, pattern recognition and optimization.  ... 
doi:10.12783/dtcse/cmee2017/20048 fatcat:rkn25aq5vbhhpfab7zld3iywym
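A generic particle swarm optimizer over the two structural parameters a and b can be sketched as follows (standard PSO with fixed inertia, not the self-regulating variant used in the paper; the objective is a placeholder for the stochastic-resonance criterion):

```python
import random

def pso(objective, bounds, n_particles=20, iters=100, seed=0):
    """Minimize `objective` over a box. Each particle is pulled toward
    its personal best and the global best with random weights."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(*bounds[d]) for d in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = objective(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < objective(g):
                    g = pos[i][:]
    return g
```

In the paper's setting, `objective` would evaluate the stochastic resonance output for candidate (a, b), and the returned global best is substituted back into the variable-scale system.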

Stochastic lexicon modeling for speech recognition

Seong-Jin Yun, Yung-Hwan Oh
1999 IEEE Signal Processing Letters  
To optimally cope with a continuous speech recognizer, we propose the stochastic lexicon model that effectively represents variations in pronunciation.  ...  Also, the proposed approach can be applied to systems employing nonlinguistic recognition units.  ...  This is usually done either by applying the phonological rules to a given lexicon or using a data-driven method from speech data with an objective optimization criterion, like  ...  The aim of the proposed lexicon  ... 
doi:10.1109/97.739004 fatcat:2jm5jcc4yrfqrb5nuslsvhwfka