A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
The file type is `application/pdf`.

### Applications of Regularized Least Squares to Classification Problems [chapter]

2004 · *Lecture Notes in Computer Science*

We present a survey of recent results concerning the theoretical and empirical performance of algorithms for learning regularized least-squares classifiers. ... The behavior of this family of learning algorithms is analyzed in both the statistical and the worst-case (individual sequence) data-generating models. ... Regularized Least-Squares for Classification: In the pattern classification problem, some unknown source is supposed to generate a sequence x_1, x_2, ... of instances (data elements) x_t ∈ X, where ...

doi:10.1007/978-3-540-30215-5_2 · fatcat:r4fq2hmcm5gilcj4qgvcvayfaq
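The regularized least-squares classifier surveyed above fits a linear model by minimizing squared loss plus an ℓ2 penalty. A minimal sketch of that estimator, assuming toy two-class data, an illustrative λ, and a sign-based decision rule (none of these specifics come from the chapter):

```python
import numpy as np

def fit_rls(X, y, lam=1.0):
    # Regularized least squares: w = (X^T X + lam*I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def predict(X, w):
    # Binary decision by the sign of the linear score
    return np.sign(X @ w)

# Toy two-class data with labels in {-1, +1} (purely illustrative)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])
w = fit_rls(X, y, lam=0.1)
acc = np.mean(predict(X, w) == y)
```

Treating ±1 labels as regression targets and thresholding at zero is the standard way a least-squares fit becomes a classifier.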
### Multi-class least squares classification at binary-classification complexity

2011 · *2011 IEEE Statistical Signal Processing Workshop (SSP)*

This paper deals with multi-class classification problems. Many methods extend binary classifiers to operate a multiclass task, with strategies such as the one-vs-one and the one-vs-all schemes. ... We present a method for multi-class classification with a computational complexity essentially independent of the number of classes. ... Multi-Class Least Squares Classification: In a multi-class classification problem, we consider a set of N training data, belonging to any of the m available classes. ...

doi:10.1109/ssp.2011.5967680 · fatcat:udptydqw2veaxjmbbmwr6yknpa
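The one-vs-all scheme the snippet contrasts against can be sketched with one regularized least-squares fit per class; the normal-equations matrix is formed once and reused. The data layout, λ, and class centers below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def one_vs_all_ls(X, labels, m, lam=1.0):
    # Fit one regularized least-squares scorer per class:
    # class k regresses targets +1 for its own points, -1 otherwise.
    d = X.shape[1]
    A = X.T @ X + lam * np.eye(d)   # shared across all m solves
    W = np.zeros((d, m))
    for k in range(m):
        y_k = np.where(labels == k, 1.0, -1.0)
        W[:, k] = np.linalg.solve(A, X.T @ y_k)
    return W

def classify(X, W):
    # Predict the class with the highest linear score
    return np.argmax(X @ W, axis=1)

# Three well-separated toy clusters (purely illustrative)
rng = np.random.default_rng(1)
centers = np.array([[0, 4], [4, 0], [-4, -4]])
X = np.vstack([rng.normal(c, 0.5, (30, 2)) for c in centers])
labels = np.repeat(np.arange(3), 30)
W = one_vs_all_ls(X, labels, m=3, lam=0.1)
acc = np.mean(classify(X, W) == labels)
```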
### A Modified Least Squares Support Vector Machine Classifier with Application to Credit Risk Analysis

2009 · *International Journal of Information Technology and Decision Making*

... in classification of important classes than to errors in classification of unimportant classes, while keeping the regularized terms in their original form. ... The C-VLSSVM classifier can be obtained by a simple modification of the regularization parameter, based on the least squares support vector machine (LSSVM) classifier, whereby more weight is given to errors ... Acknowledgements: The authors would like to thank the anonymous referees for their valuable comments and suggestions. Their comments helped to improve the quality of the paper immensely. ...

doi:10.1142/s0219622009003600 · fatcat:abcxc6ma5bctpppamlfp4uwp7y
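The per-class error weighting described here amounts, in its simplest linear form, to a weighted regularized least-squares fit. A hedged sketch with illustrative weights, data, and λ (this is not the paper's C-VLSSVM, which modifies the LSSVM regularization parameter):

```python
import numpy as np

def weighted_rls(X, y, sample_w, lam=1.0):
    # Weighted regularized least squares:
    # w = (X^T V X + lam*I)^{-1} X^T V y  with V = diag(sample_w)
    d = X.shape[1]
    XV = X.T * sample_w               # scales column i of X.T by sample_w[i]
    return np.linalg.solve(XV @ X + lam * np.eye(d), XV @ y)

# Toy credit-style data: the positive class is the "important" one
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1, 1, (40, 2)), rng.normal(1, 1, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])
# Errors on the important class are penalized 5x more heavily
sample_w = np.where(y > 0, 5.0, 1.0)
w = weighted_rls(X, y, sample_w, lam=0.1)
acc = np.mean(np.sign(X @ w) == y)
```

Raising a class's sample weights pulls the decision boundary away from that class, trading overall accuracy for fewer errors on the important class.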
### A General Framework for Sparsity Regularized Feature Selection via Iteratively Reweighted Least Square Minimization

2017 · *Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference*

A variety of feature selection methods based on sparsity regularization have been developed with different loss functions and sparse regularization functions. ... an ℓ2,p-norm (0 < p ≤ 2) sparse regularization. ... Acknowledgments: This study was supported in part by the National Key Basic Research and Development Program of China (2015CB856404), the National Natural Science Foundation of China (81271514, 61473296), and NIH ...

doi:10.1609/aaai.v31i1.10833 · fatcat:iiqpd3lew5dx3jedc3bnikga4a
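The iteratively reweighted least squares (IRLS) idea in the title can be illustrated on the simpler ℓ1-penalized case (a special point of the ℓ2,p family for a single output): each iteration replaces the nonsmooth penalty with a quadratic upper bound and solves a weighted ridge system. The data, λ, and smoothing ε below are assumptions, not the paper's framework:

```python
import numpy as np

def irls_l1(X, y, lam=1.0, n_iter=50, eps=1e-6):
    # IRLS for l1-penalized least squares: at each step solve
    # (X^T X + lam * diag(1/(|w|+eps))) w = X^T y,
    # so small coefficients receive ever-larger penalties and shrink to ~0.
    d = X.shape[1]
    G = X.T @ X
    b = X.T @ y
    w = np.linalg.solve(G + lam * np.eye(d), b)   # ridge warm start
    for _ in range(n_iter):
        D = np.diag(lam / (np.abs(w) + eps))
        w = np.linalg.solve(G + D, b)
    return w

# Sparse ground truth: only 2 of 10 features are informative (illustrative)
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 10))
w_true = np.zeros(10)
w_true[:2] = [3.0, -2.0]
y = X @ w_true + 0.1 * rng.normal(size=100)
w = irls_l1(X, y, lam=2.0)
```

The reweighting is what makes a smooth least-squares solver act as a sparsity-inducing feature selector.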
### Least Squares Auto-Tuning [article]

2019 · *arXiv* pre-print

We apply our method, which we call least squares auto-tuning, to data fitting. ... Least squares is by far the simplest and most commonly applied computational method in many fields. In almost all applications, the least squares objective is rarely the true objective. ... The final contribution is our unique application of least squares auto-tuning to data fitting. ...

arXiv:1904.05460v1 · fatcat:5wfw23ljbnczxgitxnz4kil6xi
### Page 1630 of Mathematical Reviews Vol. , Issue 83d [page]

1983 · *Mathematical Reviews*

An application of these methods to the solution of sparse square nonsymmetric linear systems is also presented.” ... This problem, while appearing to be quite special, is the core problem arising in the solution of the general linearly constrained linear least squares problem. ...
### On robust information extraction from high-dimensional data

2014 · *Serbian Journal of Management*

In general, (4) can be described as a regularized version of the least squares estimator. ... A requirement to reduce the influence of noise on the computed solution leads to a modification of the least squares method, most commonly by Tikhonov regularization. ...

doi:10.5937/sjm9-5520 · fatcat:vuzzfbopnnchddwxslzla3xxw4
### Pressure vessel state investigation based upon the least squares support vector machine

2011 · *Mathematical and Computer Modelling*

In view of the remarkable time-frequency property obtained from wavelet packets and the excellent generalization ability derived from the least squares support vector machine (LS-SVM), a novel approach ... In addition, the LS-SVM is introduced to accomplish classification, for judging the states of pressure vessels. ... Just due to this consideration, in 1999, J.A.K. Suykens put forward the least squares support vector machine (LS-SVM) based upon the standard SVM [2]. ...

doi:10.1016/j.mcm.2010.11.011 · fatcat:jhqe5ur3ujdzdnwi3jxvccffam
### Hypergraph spectral learning for multi-label classification

2008 · *Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining - KDD '08*

To reduce the computational cost, we propose an approximate formulation, which is shown to be equivalent to a least squares problem under a mild condition. ... Based on the approximate formulation, efficient algorithms for solving least squares problems can be applied to scale the formulation to very large data sets. ... This type of problem occurs in many important applications, such as protein function classification [9], text categorization [25], and semantic scene classification [4]. ...
### Polynomial Runtime Bounds for Fixed-Rank Unsupervised Least-Squares Classification

2013 · *Asian Conference on Machine Learning*

In this work, we consider one of these variants, called unsupervised regularized least-squares classification, which is based on the square loss, and develop polynomial upper runtime bounds for the induced ... The goal is to partition unlabeled data into two classes such that a subsequent application of a support vector machine would yield the overall best result (with respect to the optimization problem associated ... Acknowledgements: We would like to thank the anonymous reviewers for their detailed comments. ...

dblp:conf/acml/GiesekePI13 · fatcat:s3dgarvluvfezehomdvggsf54u
### A scalable two-stage approach for a class of dimensionality reduction techniques

2010 · *Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining - KDD '10*

In this paper, an efficient two-stage approach is proposed to solve a class of dimensionality reduction techniques, including Canonical Correlation Analysis, Orthonormal Partial Least Squares, Linear Discriminant ... Prior work transforms the generalized eigenvalue problem into an equivalent least squares formulation, which can then be solved efficiently. ... Acknowledgements: This work was supported by NSF IIS-0612069, IIS-0812551, IIS-0953662, NGA HM1582-08-1-0016, the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects ...

doi:10.1145/1835804.1835846 · dblp:conf/kdd/SunCY10 · fatcat:dmlhwdw5hnbttamxy6m2ekgtqy
### Sparse LS-SVMs using additive regularization with a penalized validation criterion

2004 · *The European Symposium on Artificial Neural Networks*

Least Squares Support Vector Machines (LS-SVMs) [9, 10] are reformulations of standard SVMs which lead to solving linear KKT systems for classification tasks as well as regression, and primal-dual LS-SVM ... This paper is based on a new way for determining the regularization trade-off in least squares support vector machines (LS-SVMs) via a mechanism of additive regularization which has been recently introduced ... The criterion (11) leads to the unique solution of a constrained least squares problem (12), subject to 1_N^T α = 0. Straightforward application of the criterion ...
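The "linear KKT system" that several of these LS-SVM entries refer to can be sketched directly. This is the standard kernel function-estimation form with ±1 targets; the RBF kernel width, regularization constant C, and toy data are assumptions, and this is not the additive-regularization variant of the paper above:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian RBF kernel matrix between row sets A and B
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def lssvm_fit(X, y, C=10.0, gamma=0.5):
    # LS-SVM training reduces to one linear KKT system:
    # [ 0   1^T     ] [b]   [0]
    # [ 1   K + I/C ] [a] = [y]
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate([[0.0], y])
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]            # bias b, dual coefficients alpha

def lssvm_predict(Xnew, X, b, alpha, gamma=0.5):
    return np.sign(rbf_kernel(Xnew, X, gamma) @ alpha + b)

# Toy two-class data (illustrative)
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(-2, 1, (40, 2)), rng.normal(2, 1, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])
b, alpha = lssvm_fit(X, y)
acc = np.mean(lssvm_predict(X, X, b, alpha) == y)
```

Replacing the SVM's inequality constraints with equality constraints is exactly what turns the quadratic program into this single linear solve.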
### Comprehensive Review on Twin Support Vector Machines [article]

2021 · *arXiv* pre-print

... to solve two SVM-type problems. ... To begin with, we first introduce the basic theory of the support vector machine and TWSVM, then focus on the various improvements and applications of TWSVM, and then introduce TSVR and its various enhancements ... [261] formulated a sparse version of least squares TSVR by introducing a regularization term to make it strongly convex and also converted the primal problems to linear programming problems. ...
### Numerical analysis of least squares and perceptron learning for classification problems [article]

2020 · *arXiv* pre-print

This work presents a study of regularized and non-regularized versions of perceptron learning and least squares algorithms for classification problems. ... Fréchet derivatives for regularized least squares and perceptron learning algorithms are derived. ... Non-regularized least squares problem: In the non-regularized linear regression or least squares problem, the goal is to minimize the sum of squares E(ω) = (1/2) ∑_{n=1}^{N} (t_n − f(x_n, ω))² = (1/2) ∑_{n=1}^{N} (t_n − ...

arXiv:2004.01138v4 · fatcat:wgxvhr2qgngplimly2byp5qk3q
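The objective E(ω) in the snippet extends to the regularized case by adding (λ/2)‖ω‖²; for a linear model f(x, ω) = xᵀω its gradient is −Xᵀ(t − Xω) + λω, so gradient descent should agree with the closed-form solution. A sketch with assumed toy data, λ, and step size (not the paper's experiments):

```python
import numpy as np

def loss(w, X, t, lam):
    # Regularized sum of squares: E(w) = (1/2)||t - Xw||^2 + (lam/2)||w||^2
    r = t - X @ w
    return 0.5 * r @ r + 0.5 * lam * w @ w

def grad(w, X, t, lam):
    # Gradient of E(w): -X^T (t - Xw) + lam * w
    return -X.T @ (t - X @ w) + lam * w

# Toy linear-regression data (illustrative)
rng = np.random.default_rng(5)
X = rng.normal(size=(60, 3))
t = X @ np.array([1.0, -2.0, 0.5]) + 0.05 * rng.normal(size=60)
lam = 0.1

# Gradient descent on E(w) ...
w = np.zeros(3)
for _ in range(500):
    w -= 0.01 * grad(w, X, t, lam)

# ... converges to the closed-form regularized solution
w_star = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ t)
```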
### Regularized logistic regression method for change detection in multispectral data via Pathwise Coordinate Optimization

2010 · *2010 IEEE International Conference on Image Processing*

Change detection methods based on classification schemes under this kind of condition should put more emphasis on the model's simplicity and efficiency, in addition to the detection accuracy. ... When applied to the L1-regularized regression problem, the algorithm can handle large problems at a comparatively very low timing cost. ... Then we use coordinate descent to solve the penalized weighted least-squares problem min_{(β_0, β) ∈ ℝ^{p+1}} { −ℓ(β_0, β) + λ P(β) } (12). All of the above can be formulated as a sequence of nested loops: • OUTER LOOP ...

doi:10.1109/icip.2010.5654271 · dblp:conf/icip/LiQJ10 · fatcat:rmrymngdrvc7tnjqkl2foogbxu
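The coordinate-descent inner loop the snippet describes can be sketched on the plain L1-penalized least-squares problem (the paper applies it inside a penalized weighted least-squares step for logistic regression; the data and λ here are illustrative assumptions):

```python
import numpy as np

def soft_threshold(z, g):
    # Soft-thresholding operator S(z, g) = sign(z) * max(|z| - g, 0)
    return np.sign(z) * max(abs(z) - g, 0.0)

def lasso_cd(X, y, lam, n_sweeps=100):
    # Coordinate descent for (1/2n)||y - Xw||^2 + lam*||w||_1:
    # cycle over coordinates, each update being a 1-D soft-thresholded
    # least-squares step, while maintaining the residual r = y - Xw.
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y.copy()
    for _ in range(n_sweeps):
        for j in range(d):
            r += X[:, j] * w[j]          # remove coordinate j's contribution
            z = X[:, j] @ r / n
            w[j] = soft_threshold(z, lam) / col_sq[j]
            r -= X[:, j] * w[j]
    return w

# Sparse ground truth: 2 of 8 coefficients nonzero (illustrative)
rng = np.random.default_rng(6)
X = rng.normal(size=(200, 8))
w_true = np.array([2.0, 0, 0, -1.5, 0, 0, 0, 0])
y = X @ w_true + 0.1 * rng.normal(size=200)
w = lasso_cd(X, y, lam=0.05)
```

Because each coordinate update is closed-form and cheap, this inner loop scales to the large problems the paper targets.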

*Showing results 1 — 15 out of 178,976 results*