9,353 Hits in 4.6 sec

Improvement of ANN Models via Data Envelopment Analysis for Stock Prices Forecasting

Raja Das, Jitendra Kumar Jaiswal
2018 International Journal of Engineering & Technology  
machine learning approaches such as support vector machines, particle swarm optimization, principal component analysis, etc.  ...  The efficient decision-making units have been selected with the help of the DEA approach and provided as input to the Levenberg-Marquardt-based ANN model in a sliding-window manner.  ...  If x* is a local minimum of a quadratic function f, the necessary conditions for optimality are that H is positive semidefinite and Hx* = b, and the global minimizer is the solution of this linear system in  ... 
doi:10.14419/ijet.v7i4.10.20913 fatcat:nqjlwcqq4bb4to7jryclqjlsxu
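The optimality statement in the snippet above is the standard condition for minimizing a quadratic; restated cleanly, assuming the conventional form f(x) = ½xᵀHx − bᵀx, which the snippet does not spell out:

```latex
% Quadratic model assumed here (not stated in the snippet): f(x) = 1/2 x^T H x - b^T x
\nabla f(x) = Hx - b, \qquad \nabla^{2} f(x) = H .
% First-order condition at a minimizer x^*, together with H \succeq 0:
Hx^{*} = b ,
% and when H is positive definite the unique global minimizer is
x^{*} = H^{-1} b .
```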

Shrinkage Fields for Effective Image Restoration

Uwe Schmidt, Stefan Roth
2014 2014 IEEE Conference on Computer Vision and Pattern Recognition  
Computationally expensive optimization is often the culprit. While efficient alternatives exist, they have not reached the same level of image quality.  ...  Unlike heavily engineered solutions, our learning approach can be adapted easily to different trade-offs between efficiency and image quality.  ...  In particular, shrinkage fields owe their computational efficiency to a specific kind of quadratic relaxation technique that is derived from the so-called additive form of half-quadratic optimization approaches  ... 
doi:10.1109/cvpr.2014.349 dblp:conf/cvpr/SchmidtR14 fatcat:hzsreumm6fdyxe463hc6p3hfee
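For context on the last snippet, the additive half-quadratic idea introduces an auxiliary variable so that the image update becomes a quadratic (linear-system) step; a schematic form from the general half-quadratic literature, not taken from this paper:

```latex
% Additive half-quadratic relaxation of a penalty \rho (schematic; the exact
% conditions relating \rho and \psi come from the half-quadratic literature):
\rho(v) \;=\; \min_{z}\Bigl\{ \tfrac{\beta}{2}\,(v - z)^{2} + \psi(z) \Bigr\}
```

Alternating over z (a pointwise shrinkage) and the image (a quadratic problem, hence a linear solve) is what makes this family of methods fast.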

On Learning High Dimensional Structured Single Index Models [article]

Nikhil Rao, Ravi Ganti, Laura Balzano, Rebecca Willett, Robert Nowak
2016 arXiv   pre-print
Single Index Models (SIMs) are simple yet flexible semi-parametric models for machine learning, where the response variable is modeled as a monotonic function of a linear combination of features.  ...  While methods have been described to learn SIMs in the low dimensional regime, a method that can efficiently learn SIMs in high dimensions, and under general structural assumptions, has not been forthcoming  ...  CSI interleaves parameter learning via iterative projected gradient descent and monotonic function learning via the LPAV algorithm.  ... 
arXiv:1603.03980v2 fatcat:m2y6hz2ghbgjxlvf7ng3e5ma74
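A minimal sketch of the interleaving described in the last snippet, assuming isotonic regression as a stand-in for the LPAV step and an Isotron-style surrogate gradient; this is an illustration, not the authors' CSI implementation:

```python
# Alternate a projected-gradient update for the index vector with a monotone
# link fit. IsotonicRegression stands in for LPAV; the unit-ball projection is
# a placeholder structural constraint. Not the authors' CSI code.
import numpy as np
from sklearn.isotonic import IsotonicRegression

def fit_sim(X, y, n_iters=100, step=0.1):
    n, d = X.shape
    w = np.zeros(d)
    link = IsotonicRegression(out_of_bounds="clip")
    for _ in range(n_iters):
        z = X @ w
        link.fit(z, y)                      # monotone link fit (LPAV stand-in)
        resid = link.predict(z) - y
        w -= step * (X.T @ resid) / n       # Isotron-style surrogate gradient
        nrm = np.linalg.norm(w)             # (link derivative dropped)
        if nrm > 1.0:
            w /= nrm                        # project onto the unit ball
    return w, link
```

The unit-ball projection is only a placeholder for whatever structural constraint (sparsity, low rank, etc.) the high-dimensional setting imposes.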

Distribution-Independent Evolvability of Linear Threshold Functions [article]

Vitaly Feldman
2011 arXiv   pre-print
Our proof is based on a new combinatorial parameter of a concept class that lower-bounds the complexity of learning from correlations.  ...  Valiant's (2007) model of evolvability models the evolutionary process of acquiring useful functionality as a restricted form of learning from random examples.  ...  The inverse of the optimal margin is referred to as the margin complexity of a concept class [20] .  ... 
arXiv:1103.4904v1 fatcat:pnfkp5itdzcqdkcumkybjmu5ry
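The margin complexity mentioned in the last snippet is usually defined as follows (standard formulation for concept classes viewed as sign matrices, stated here from the general literature rather than quoted from the paper):

```latex
% Over all embeddings by unit vectors {u_c}, {v_x} that realize the labels,
% i.e. sign(<u_c, v_x>) = c(x) for every c in C and x in X:
\operatorname{margin}(C) \;=\; \sup_{\{u_c\},\{v_x\}} \; \min_{c\in C,\; x\in X} \bigl|\langle u_c, v_x\rangle\bigr|,
\qquad
\operatorname{mc}(C) \;=\; \frac{1}{\operatorname{margin}(C)} .
```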

Opposition Based Teaching Learning Optimization For Combined Economic Emission Dispatch Problem

D. Sai Krishna Kanth, SS Deekshit, MG Mahesh
2019 International Journal of Advanced Scientific Technologies, Engineering and Management Sciences  
In this project, a new algorithm, opposition-based teaching-learning-based optimization, is developed and applied to the ED problem.  ...  efficient mechanism to treat the constraints.  ...  Sudha Kiran, Assistant Professor, ECE Department, Annamacharya Institute of Technology and Sciences, Rajampet.  ... 
doi:10.22413/ijastems/2019/v5/i5/49610 fatcat:dao2kafl2vazzfohlslmuoqdwm
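The "opposition based" part of the title refers to opposition-based learning, which evaluates each candidate solution together with its opposite point within the search bounds; a minimal sketch, where the bounds and fitness function are placeholders rather than anything taken from the paper:

```python
import numpy as np

def opposition_step(population, lower, upper, fitness):
    """Keep the best half of each candidate and its opposite point (minimization)."""
    opposite = lower + upper - population           # element-wise opposite points
    both = np.vstack([population, opposite])
    scores = np.apply_along_axis(fitness, 1, both)  # evaluate all candidates
    order = np.argsort(scores)                      # ascending: lower is better
    return both[order[: len(population)]]
```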

A complete characterization of statistical query learning with applications to evolvability

Vitaly Feldman
2012 Journal of computer and system sciences (Print)  
Unlike the previously known bounds on SQ learning (Blum et al.), our characterization preserves the accuracy and the efficiency of learning.  ...  We use this approach to demonstrate the existence of a large class of monotone evolution algorithms based on square loss performance estimation.  ...  I am also grateful to the anonymous reviewers of FOCS 2009 and JCSS for a number of insightful comments and useful corrections.  ... 
doi:10.1016/j.jcss.2011.12.024 fatcat:tlxm4yx6h5fjdcuc4epz3fqjw4
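For reference, the statistical query (SQ) oracle these bounds are stated against, in its standard formulation (not quoted from the paper):

```latex
% STAT(f, D) oracle: given a query function \psi : X \times \{-1,+1\} \to [-1,+1]
% and a tolerance \tau > 0, it may return any value v satisfying
\Bigl|\, v \;-\; \mathbb{E}_{x \sim D}\bigl[\psi\bigl(x, f(x)\bigr)\bigr] \,\Bigr| \;\le\; \tau .
```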

Understanding behaviors of a constructive memory agent: A markov chain analysis

John S. Gero, Wei Peng
2009 Knowledge-Based Systems  
This paper describes a Markov chain analysis of the behaviors of a constructive memory agent.  ...  It shows that a constructive memory agent behaves based on the knowledge structures that it has learned from its interaction with the environment.  ...  Knowledge construction behavior (K_c) is a special form of macro-behavior in which an agent learns new experience via the constructive learning function (C_2) of the conception  ... 
doi:10.1016/j.knosys.2009.05.006 fatcat:ldeftcnrhfarjjplka6axwqhnq
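As a generic illustration of the kind of Markov chain analysis referred to above, the long-run behavior of a finite-state chain can be read off its stationary distribution; the 3-state transition matrix below is a made-up placeholder, not the agent's actual behavior model:

```python
import numpy as np

# Placeholder 3-state behavior chain; rows are current state, columns next state.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

# Stationary distribution via power iteration: pi = pi P.
pi = np.ones(P.shape[0]) / P.shape[0]
for _ in range(1000):
    pi = pi @ P
print(pi)  # long-run fraction of time spent in each behavior state
```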

Page 5022 of Mathematical Reviews Vol. , Issue 96h [page]

1996 Mathematical Reviews  
96h:68165 68T05 Bshouty, Nader H. (3-CALG-C; Calgary, AB) Exact learning Boolean functions via the monotone theory.  ...  Summary: "We present an algorithm for improving the accuracy of algorithms for learning binary concepts.  ... 

A Complete Characterization of Statistical Query Learning with Applications to Evolvability [article]

Vitaly Feldman
2013 arXiv   pre-print
Unlike the previously known bounds on SQ learning our characterization preserves the accuracy and the efficiency of learning.  ...  We use this approach to demonstrate the existence of a large class of monotone evolutionary learning algorithms based on square loss performance estimation.  ...  I am also grateful to the anonymous reviewers of FOCS 2009 and JCSS for a number of insightful comments and useful corrections.  ... 
arXiv:1002.3183v3 fatcat:k7zb4bpeurg5xofb3omlhw5yxy

A Complete Characterization of Statistical Query Learning with Applications to Evolvability

Vitaly Feldman
2009 2009 50th Annual IEEE Symposium on Foundations of Computer Science  
Unlike the previously known bounds on SQ learning [9, 11, 42, 3, 37] our characterization preserves the accuracy and the efficiency of learning.  ...  We use this approach to demonstrate the existence of a large class of monotone evolution algorithms based on square loss performance estimation.  ...  I am also grateful to the anonymous reviewers of FOCS 2009 and JCSS for a number of insightful comments and useful corrections.  ... 
doi:10.1109/focs.2009.35 dblp:conf/focs/Feldman09 fatcat:rawho4nhhfdjjnfqycrarmodnu

Toward efficient agnostic learning

Michael J. Kearns, Robert E. Schapire, Linda M. Sellie
1994 Machine Learning  
Our results include hardness results for the most obvious generalization of the PAC model to an agnostic setting, an efficient and general agnostic learning method based on dynamic programming, relationships  ...  We give a number of positive and negative results that provide an initial outline of the possibilities for agnostic learning.  ...  To the best of our knowledge and recollection, the term "agnostic learning" was coined during a discussion among Sally Goldman, Ron Rivest, and the first two authors of this paper. 2.  ... 
doi:10.1007/bf00993468 fatcat:hjznph6wgjfrveksobcrwvx5qe
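The agnostic setting referred to in both versions of this paper replaces the realizability assumption of PAC learning with a comparison against the best concept in a touchstone class; the standard guarantee reads (general formulation, not quoted from the paper):

```latex
% Agnostic learning with touchstone class C: for any distribution D over
% X x {0,1}, with probability at least 1 - \delta the learner outputs h with
\operatorname{err}_D(h) \;\le\; \min_{c \in \mathcal{C}} \operatorname{err}_D(c) \;+\; \epsilon,
\qquad \text{where } \operatorname{err}_D(h) = \Pr_{(x,y)\sim D}\bigl[h(x) \neq y\bigr].
```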

Toward efficient agnostic learning

Michael J. Kearns, Robert E. Schapire, Linda M. Sellie
1992 Proceedings of the fifth annual workshop on Computational learning theory - COLT '92  
Our results include hardness results for the most obvious generalization of the PAC model to an agnostic setting, an efficient and general agnostic learning method based on dynamic programming, relationships  ...  We give a number of positive and negative results that provide an initial outline of the possibilities for agnostic learning.  ...  To the best of our knowledge and recollection, the term "agnostic learning" was coined during a discussion among Sally Goldman, Ron Rivest, and the first two authors of this paper. 2.  ... 
doi:10.1145/130385.130424 dblp:conf/colt/KearnsSS92 fatcat:ted7tyc22jabdpgxddat53nvuu

SAT Modulo Monotonic Theories [article]

Sam Bayless, Noah Bayless, Holger H. Hoos, Alan J. Hu
2014 arXiv   pre-print
We define the concept of a monotonic theory and show how to build efficient SMT (SAT Modulo Theory) solvers, including effective theory propagation and clause learning, for such theories.  ...  /min-cut, and then demonstrate our framework by building SMT solvers for each of these theories.  ...  To forestall confusion, note that our concept of a 'monotonic theory' here has no direct relationship to the concept of monotonic/non-monotonic logics.  ... 
arXiv:1406.0043v1 fatcat:otfsyrddavau3bneohxga3zwd4
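A small sketch of the monotonicity property such solvers exploit, using graph reachability (one of the theories the abstract lists) as the example; the helper functions and edge-set representation are illustrative, not the paper's API:

```python
def reach(edges, src, dst):
    """Depth-first reachability over a set of directed edges (node pairs)."""
    seen, stack = {src}, [src]
    while stack:
        u = stack.pop()
        for a, b in edges:
            if a == u and b not in seen:
                seen.add(b)
                stack.append(b)
    return dst in seen

def propagate_reach(true_edges, unassigned_edges, src, dst):
    """Cheap theory propagation enabled by monotonicity of reachability."""
    if reach(true_edges, src, dst):                          # under-approximation
        return True    # reach(src, dst) holds under every completion
    if not reach(true_edges | unassigned_edges, src, dst):   # over-approximation
        return False   # reach(src, dst) fails under every completion
    return None        # undetermined; leave to the SAT search
```

Because adding edges can only enlarge the reachable set, the under- and over-approximations bracket every possible completion of the partial assignment.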

Speeding-Up Convergence via Sequential Subspace Optimization: Current State and Future Directions [article]

Michael Zibulevsky
2013 arXiv   pre-print
One can also accelerate the Augmented Lagrangian method for constrained optimization problems and the Alternating Direction Method of Multipliers for problems with a separable objective function and non-separable  ...  This is an overview paper written in the style of a research proposal.  ...  Quadratic function: First, let us demonstrate "proof of the concept" using a pure quadratic function.  ... 
arXiv:1401.0159v1 fatcat:j4pysvjrqjfz7ozapzxpn7tj4a
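In the spirit of the quadratic "proof of concept" the last snippet mentions, a minimal sequential-subspace step on f(x) = ½xᵀAx − bᵀx optimizes exactly over the span of the current gradient and the previous step; the subspace choice and problem data below are illustrative assumptions, not the paper's method in full:

```python
import numpy as np

def sesop_quadratic(A, b, n_iters=50):
    """Sequential subspace optimization on f(x) = 0.5 x^T A x - b^T x (A SPD)."""
    x = np.zeros_like(b)
    prev_step = None
    for _ in range(n_iters):
        g = A @ x - b                              # gradient of the quadratic
        dirs = [g] if prev_step is None else [g, prev_step]
        D = np.column_stack(dirs)                  # n x k subspace basis
        # Exact minimization over the subspace: solve (D^T A D) alpha = -D^T g
        alpha = np.linalg.lstsq(D.T @ A @ D, -D.T @ g, rcond=None)[0]
        step = D @ alpha
        x = x + step
        prev_step = step
    return x

# Example use on a random SPD system (approaches the solution of A x = b):
# rng = np.random.default_rng(0); M = rng.standard_normal((20, 20))
# A = M @ M.T + 20 * np.eye(20); b = rng.standard_normal(20)
# x = sesop_quadratic(A, b)
```

On a quadratic, this gradient-plus-previous-step subspace makes the iteration coincide with conjugate gradients, which is why the quadratic case serves as the proof of concept.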

Mean-Variance Efficient Reinforcement Learning by Expected Quadratic Utility Maximization [article]

Masahiro Kato, Kei Nakagawa, Kenshi Abe, Tetsuro Morimura
2021 arXiv   pre-print
In this paper, in contrast to strict MV control, we consider learning MV-efficient policies that achieve Pareto efficiency regarding the MV trade-off.  ...  To achieve this purpose, we train an agent to maximize the expected quadratic utility function, a common objective of risk management in finance and economics.  ...  a monotonically decreasing function of Var_{π_θ}(R)  ... 
arXiv:2010.01404v3 fatcat:ybrk4gtkevhjpdxzmt7lldkih4
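The link between expected quadratic utility and the mean-variance trade-off comes from a standard identity; one common parameterization is shown below (the risk-aversion coefficient λ and the exact functional form are illustrative and may differ from the paper's):

```latex
% Quadratic utility with risk-aversion coefficient \lambda > 0:
U(R) \;=\; R - \tfrac{\lambda}{2} R^{2},
\qquad
\mathbb{E}\bigl[U(R)\bigr]
  \;=\; \mathbb{E}[R] \;-\; \tfrac{\lambda}{2}\Bigl(\operatorname{Var}(R) + \mathbb{E}[R]^{2}\Bigr),
```

so maximizing expected quadratic utility rewards the mean return while penalizing its variance (plus a squared-mean term).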
Showing results 1 — 15 out of 9,353 results