The group lasso for logistic regression
2008
Journal of The Royal Statistical Society Series B-statistical Methodology
The group lasso estimator for logistic regression is shown to be statistically consistent even if the number of predictors is much larger than the sample size, provided the true underlying structure is sparse. ...
We extend the group lasso to logistic regression models and present an efficient algorithm that is especially suitable for high-dimensional problems and can also be applied to generalized linear models ...
Acknowledgements We thank two referees and Sylvain Sardy for constructive comments. ...
doi:10.1111/j.1467-9868.2007.00627.x
fatcat:lhau2op7tzce7lbw4zxjrpmcjm
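As a quick illustration of the penalty behind the entry above: the group lasso sums size-weighted Euclidean norms of predefined coefficient groups, and its proximal operator zeroes out whole groups at once. A minimal NumPy sketch under assumed names and toy data (the paper's own block coordinate algorithm is not reproduced here):

    import numpy as np

    def group_lasso_penalty(beta, groups, lam):
        # sum of size-weighted Euclidean norms of the coefficient groups
        return lam * sum(np.sqrt(len(g)) * np.linalg.norm(beta[g]) for g in groups)

    def prox_group_lasso(beta, groups, lam, step):
        # blockwise soft-thresholding: shrink each group toward zero and set the
        # whole group exactly to zero when its norm falls below the threshold
        out = beta.copy()
        for g in groups:
            thr = step * lam * np.sqrt(len(g))
            nrm = np.linalg.norm(beta[g])
            out[g] = 0.0 if nrm <= thr else (1.0 - thr / nrm) * beta[g]
        return out

    # toy example: six coefficients split into two groups
    beta = np.array([0.5, -0.2, 0.1, 1.5, 0.0, -0.8])
    groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]
    print(group_lasso_penalty(beta, groups, lam=0.1))
    print(prox_group_lasso(beta, groups, lam=0.1, step=1.0))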
Comparison between Common Statistical Modeling Techniques Used in Research, Including: Discriminant Analysis vs Logistic Regression, Ridge Regression vs LASSO, and Decision Tree vs Random Forest
2022
OALib
This paper compares common statistical approaches, including regression vs classification, discriminant analysis vs logistic regression, ridge regression vs LASSO, and decision tree vs random forest. ...
Ridge Regression vs LASSO: Ridge regression and LASSO are both regularization (shrinkage) methods. LASSO stands for "Least Absolute Shrinkage and Selection Operator". ...
Lasso is another extension built on regularized linear regression, but with a small twist. ...
doi:10.4236/oalib.1108414
fatcat:6hqkvmped5f5pmoogu4umv47vm
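To make the ridge-vs-LASSO contrast above concrete for logistic regression: an L2 penalty only shrinks coefficients, while an L1 penalty shrinks and selects. A small scikit-learn sketch on synthetic data (dataset and settings are illustrative, not taken from the paper):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=200, n_features=20, n_informative=5, random_state=0)

    # ridge-style (L2) penalty: coefficients are shrunk but generally stay nonzero
    l2_model = LogisticRegression(penalty="l2", C=1.0).fit(X, y)

    # LASSO-style (L1) penalty: shrinkage plus selection, some coefficients become exactly zero
    l1_model = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X, y)

    print("nonzero coefficients, L2:", int(np.sum(l2_model.coef_ != 0)))
    print("nonzero coefficients, L1:", int(np.sum(l1_model.coef_ != 0)))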
Adaptive group-regularized logistic elastic net regression
[article]
2018
arXiv
pre-print
As a solution to this problem, we propose a group-regularized (logistic) elastic net regression method, where each penalty parameter corresponds to a group of features based on the external information ...
The method, termed gren, makes use of the Bayesian formulation of logistic elastic net regression to estimate both the model and penalty parameters in an approximate empirical-variational Bayes framework ...
Note that if m_i = 1 for i = 1, ..., n, the model reduces to a binary logistic regression model. ...
arXiv:1805.00389v1
fatcat:b2pimzhq4bbjhaxdhlm5gnq6z4
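The gren method builds on ordinary elastic net logistic regression and then gives each externally defined feature group its own penalty multiplier. The shared-penalty baseline it extends can be sketched with scikit-learn (the group-specific, empirical-variational Bayes part of the paper is not shown):

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=300, n_features=50, n_informative=8, random_state=1)

    # elastic net mixes the L1 and L2 penalties: l1_ratio=1 is the lasso, 0 is ridge
    enet = LogisticRegression(penalty="elasticnet", solver="saga",
                              l1_ratio=0.5, C=1.0, max_iter=5000)
    enet.fit(X, y)
    print(int((enet.coef_ != 0).sum()), "nonzero coefficients")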
A More Accurate Estimation of Semiparametric Logistic Regression
2021
Mathematics
Specifically, in this paper, based on the regularization method and an innovative class of garrotized kernel functions, we propose a novel penalized kernel machine method for a semiparametric logistic ...
Thus, our task is to construct a method which can guarantee the estimation accuracy by removing redundant variables. ...
In addition, for comparison, we also apply the LSKM method based on Gaussian kernel and the LASSO method for logistic regression to this adjusted breast cancer data. ...
doi:10.3390/math9192376
fatcat:r3vvs6ridvghtpfipfyh57bz3u
Sparse Logistic Regression with Lp Penalty for Biomarker Identification
2007
Statistical Applications in Genetics and Molecular Biology
In this paper, we propose a novel method for sparse logistic regression with non-convex regularization Lp (p < 1). ...
Biomarkers identified with our methods are compared with those in the literature. Our computational results show that Lp logistic regression (p < 1) outperforms L1 logistic regression and the SCAD SVM. ...
Theoretically, when the sample size n is much larger than the number of variables m, logistic regression with Lp (p < 1) gives asymptotically unbiased estimates of the nonzero parameters while shrinking the estimates of zero (or small) parameters to ...
doi:10.2202/1544-6115.1248
pmid:17402921
fatcat:sbdavwwabravxjdmndfn643jr4
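The non-convex Lp penalty (p < 1) mentioned above can be written down directly; a minimal NumPy sketch of the penalized negative log-likelihood it adds to binary logistic regression (variable names and data are illustrative, and no optimizer for the non-convex objective is shown):

    import numpy as np

    def lp_penalized_nll(beta, X, y, lam, p=0.5):
        # logistic negative log-likelihood for labels y in {0, 1}
        z = X @ beta
        nll = np.sum(np.log1p(np.exp(z)) - y * z)
        # non-convex Lp penalty, p < 1: strong shrinkage of small coefficients,
        # comparatively little bias on large ones
        penalty = lam * np.sum(np.abs(beta) ** p)
        return nll + penalty

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 10))
    y = rng.integers(0, 2, size=50).astype(float)
    print(lp_penalized_nll(np.zeros(10), X, y, lam=1.0))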
Regularized (bridge) logistic regression for variable selection based on ROC criterion
2009
Statistics and its Interface
Despite advances in the last several decades in developing such regularized regression models, issues regarding the choice of the penalty parameter and the computational methods for model fitting with parameter ...
It is well known that bridge regression (with tuning parameter less than or equal to 1) gives asymptotically unbiased estimates of the nonzero regression parameters while shrinking smaller regression parameters ...
The authors thank the editor and the referees for their constructive comments.
Received 11 December 2008 ...
doi:10.4310/sii.2009.v2.n4.a10
fatcat:igtprm53zff3lm6ewk46hb3yfi
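The ROC-based choice of the penalty parameter described above can be mimicked with cross-validated AUC. A sketch using scikit-learn, with an L1-penalized logistic regression standing in for the bridge penalty (which scikit-learn does not provide); the grid, data, and settings are assumptions:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=300, n_features=30, n_informative=6, random_state=2)

    best_C, best_auc = None, -np.inf
    for C in (0.01, 0.1, 1.0, 10.0):                 # grid of penalty strengths
        model = LogisticRegression(penalty="l1", solver="liblinear", C=C)
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
        if auc > best_auc:
            best_C, best_auc = C, auc
    print("selected C:", best_C, "cross-validated AUC:", round(best_auc, 3))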
Geographically weighted elastic net logistic regression
2018
Journal of Geographical Systems
The approach is compared with other logistic regressions. ...
Model comparisons show that standard geographically weighted logistic regression over-estimated relationship non-stationarity because it failed to adequately deal with collinearity and model selection. ...
doi:10.1007/s10109-018-0280-7
fatcat:kh3ubdipvzasdj3r3oxferpltm
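Geographically weighted regression fits a separate local model at each site, down-weighting distant observations with a distance-decay kernel. A toy sketch of that weighting step (a plain L2-penalized logistic regression stands in for the paper's elastic net local model; coordinates, bandwidth, and data are invented):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    coords = rng.uniform(size=(200, 2))          # hypothetical site locations
    X = rng.normal(size=(200, 5))
    y = rng.integers(0, 2, size=200)

    def local_coefficients(target_idx, bandwidth=0.2):
        # Gaussian distance-decay kernel: nearby observations get larger weights
        d = np.linalg.norm(coords - coords[target_idx], axis=1)
        w = np.exp(-(d / bandwidth) ** 2)
        model = LogisticRegression().fit(X, y, sample_weight=w)
        return model.coef_.ravel()

    print(local_coefficients(0))                 # locally weighted fit at the first site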
Distributed Parallel Sparse Multinomial Logistic Regression
2019
IEEE Access
INDEX TERMS: Alternating Direction Method of Multipliers, big data, distributed parallel, sparse multinomial logistic regression. ...
We have reinvestigated the classification accuracy and running efficiency of the algorithm for solving SMLR problems using the Alternating Direction Method of Multipliers (ADMM), which is called fast SMLR ...
Since solving the original logistic regression problem directly is prone to overfitting, the more common practice is to introduce a regularization term [1]. ...
doi:10.1109/access.2019.2913280
fatcat:eeaxgs37rzafrax2fk3w5tlbmq
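In ADMM solvers for L1-regularized (sparse) multinomial logistic regression, the sparsity-inducing part of each iteration reduces to elementwise soft-thresholding, the proximal operator of the L1 norm. A minimal NumPy sketch of just that step (not the full distributed algorithm of the paper):

    import numpy as np

    def soft_threshold(V, kappa):
        # proximal operator of kappa * ||V||_1, applied elementwise:
        # entries within [-kappa, kappa] are set exactly to zero
        return np.sign(V) * np.maximum(np.abs(V) - kappa, 0.0)

    W = np.array([[0.9, -0.05], [-0.4, 0.2]])
    print(soft_threshold(W, kappa=0.1))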
Robust Variable and Interaction Selection for Logistic Regression and Multiple Index Models
[article]
2017
arXiv
pre-print
We propose Stepwise cOnditional likelihood variable selection for Discriminant Analysis (SODA) to detect both main and quadratic interaction effects in logistic regression and quadratic discriminant analysis ...
Compared with existing variable selection methods based on the Sliced Inverse Regression (SIR) (Li 1991), SODA requires neither the linearity nor the constant variance condition and is much more robust ...
However, a direct application of Lasso on logistic regression with all second-order terms is prohibitive for moderately large p (e.g., p ≥ 1000). ...
arXiv:1611.08649v2
fatcat:onohvoa5zjgv5e2ls6h5cydblq
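The remark above about a direct Lasso on all second-order terms being prohibitive comes down to counting columns: with p main effects plus all squares and pairwise interactions, the design matrix has p + p(p+1)/2 columns. A quick check (scikit-learn's PolynomialFeatures is used only to verify the count on a small case):

    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures

    for p in (10, 100, 1000):
        print(p, "features ->", p + p * (p + 1) // 2, "second-order columns")

    # concrete verification for p = 10: expect 10 + 55 = 65 columns
    X = np.random.default_rng(4).normal(size=(5, 10))
    print(PolynomialFeatures(degree=2, include_bias=False).fit_transform(X).shape)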
High Dimensional Logistic Regression Model using Adjusted Elastic Net Penalty
2015
Pakistan Journal of Statistics and Operation Research
with other existing penalized methods. ...
Reduction of the high dimensional binary classification data using penalized logistic regression is one of the challenges when the explanatory variables are correlated. ...
We end this paper with a conclusion in section 7.
Penalized Logistic Regression Model: Logistic regression is a statistical method to model a binary classification problem. ...
doi:10.18187/pjsor.v11i4.990
fatcat:gsz7tcprybgefm34kfebidbqhm
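For reference, the penalized logistic regression objective that entries like this one start from combines the binomial log-likelihood with an elastic-net-type penalty; a standard form (notation assumed, not quoted from the paper), in LaTeX:

    \min_{\beta_0,\,\beta}\;
      -\frac{1}{n}\sum_{i=1}^{n}\Big[ y_i\,(\beta_0 + x_i^{\top}\beta)
        - \log\!\big(1 + e^{\beta_0 + x_i^{\top}\beta}\big) \Big]
      \;+\; \lambda\Big( \alpha\,\lVert\beta\rVert_1
        + \tfrac{1-\alpha}{2}\,\lVert\beta\rVert_2^2 \Big)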
ADMM-SOFTMAX : An ADMM Approach for Multinomial Logistic Regression
[article]
2019
arXiv
pre-print
We present ADMM-Softmax, an alternating direction method of multipliers (ADMM) for solving multinomial logistic regression (MLR) problems. ...
Our method is geared toward supervised classification tasks with many examples and features. It decouples the nonlinear optimization problem in MLR into three steps that can be solved efficiently. ...
We also thank the Isaac Newton Institute (INI) for Mathematical Sciences for the support and hospitality during the programme on Generative Models, Parameter ...
arXiv:1901.09450v2
fatcat:lj2stje43fgwjipfhueerbdxhi
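The multinomial logistic regression loss that ADMM-Softmax decouples is the softmax cross-entropy over linear scores. A minimal NumPy sketch of that objective (data, shapes, and names are illustrative; the three-step ADMM splitting itself is not reproduced):

    import numpy as np

    def softmax_cross_entropy(W, X, Y):
        # average cross-entropy between softmax(X @ W) and one-hot labels Y
        Z = X @ W
        Z = Z - Z.max(axis=1, keepdims=True)                  # numerical stability
        log_probs = Z - np.log(np.exp(Z).sum(axis=1, keepdims=True))
        return -np.mean(np.sum(Y * log_probs, axis=1))

    rng = np.random.default_rng(5)
    X = rng.normal(size=(100, 20))
    Y = np.eye(3)[rng.integers(0, 3, size=100)]               # one-hot labels, 3 classes
    print(softmax_cross_entropy(np.zeros((20, 3)), X, Y))     # log(3) at W = 0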
Network-regularized Sparse Logistic Regression Models for Clinical Risk Prediction and Biomarker Discovery
[article]
2016
arXiv
pre-print
Here, we first introduce a general regularized Logistic Regression (LR) framework with penalty term λ‖w‖_1 + η wᵀMw, which can reduce to different penalties, including the lasso, elastic net, and network-regularized ...
And we develop two efficient algorithms to solve it. Finally, we test our methods and compare them with the related ones using simulated and real data to show their efficiency. ...
Logistic regression (LR) is one such classical method and has been widely used for classification [13]. ...
arXiv:1609.06480v1
fatcat:ugc5gv47lncxrgnbpwarxnllli
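The penalty λ‖w‖_1 + η wᵀMw quoted above is easy to evaluate once M is chosen; a NumPy sketch showing the reductions the abstract mentions (the chain-graph Laplacian and all numbers are invented for illustration):

    import numpy as np

    def network_penalty(w, M, lam, eta):
        # lambda * ||w||_1 + eta * w' M w:
        #   eta = 0             -> lasso
        #   M = identity        -> elastic-net-like penalty
        #   M = graph Laplacian -> network-regularized penalty
        return lam * np.sum(np.abs(w)) + eta * (w @ M @ w)

    w = np.array([0.5, -0.3, 0.0, 0.8])
    A = np.array([[0, 1, 0, 0],                  # adjacency of a 4-node chain graph
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A               # graph Laplacian
    print(network_penalty(w, np.eye(4), lam=0.1, eta=0.1))   # elastic-net-like
    print(network_penalty(w, L, lam=0.1, eta=0.1))           # network-regularized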
The Generalized LASSO
2004
IEEE Transactions on Neural Networks
For regression functionals which can be modeled as iteratively reweighted least-squares (IRLS) problems, we present a highly efficient algorithm with guaranteed global convergence. ...
This defines a unified framework for sparse regression models in the very rich class of IRLS models, including various types of robust regression models and logistic regression. ...
This leads us to the LASSO method, which optimizes an L1-constrained regression functional. ...
doi:10.1109/tnn.2003.809398
pmid:15387244
fatcat:65x4jj3hkfaohc5x3gxibz3dxu
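Logistic regression is the textbook example of an IRLS model: each Newton step solves a weighted least-squares problem, and the generalized LASSO adds an L1 constraint to that inner problem. A sketch of plain, unconstrained IRLS for orientation (data and names are invented; this is not the paper's constrained algorithm):

    import numpy as np

    def irls_logistic(X, y, n_iter=25):
        # Newton / IRLS for unpenalized binary logistic regression
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            w = np.clip(p * (1.0 - p), 1e-10, None)      # IRLS weights
            z = X @ beta + (y - p) / w                    # working response
            beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * z))
        return beta

    rng = np.random.default_rng(6)
    X = np.column_stack([np.ones(200), rng.normal(size=(200, 3))])
    true_beta = np.array([-0.5, 1.0, -2.0, 0.0])
    y = (rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
    print(irls_logistic(X, y))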
The Impact of Regularization on High-dimensional Logistic Regression
[article]
2019
arXiv
pre-print
Logistic regression is commonly used for modeling dichotomous outcomes. ...
Our results generalize those of Sur and Candes and we provide a detailed study for the cases of ℓ_2^2-RLR and sparse (ℓ_1-regularized) logistic regression. ...
Summary of contributions In this paper, we study regularized logistic regression (RLR) for parameter estimation in high-dimensional logistic models. ...
arXiv:1906.03761v4
fatcat:darxyqznd5fung73zqdvmh63am
Estimation of Distribution Algorithms as Logistic Regression Regularizers of Microarray Classifiers
2009
Methods of Information in Medicine
Methods: Those difficulties are tackled by introducing a new way of regularizing the logistic regression. ...
Obtaining the regularized estimates of the logistic classifier amounts to maximizing the likelihood function via our EDA, without the need for an explicit penalty term. ...
Acknowledgments The authors are grateful to the referees for their constructive comments. ...
doi:10.3414/me9223
pmid:19387512
fatcat:jo7nlzwd5bhv7bebx4h7chmrei
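To give a flavour of how an estimation of distribution algorithm can play the regularizer's role: sample candidate coefficient vectors from a Gaussian, keep the fraction with the highest log-likelihood, and refit the Gaussian. This is a generic continuous EDA sketch in NumPy, not the authors' algorithm, and all data and settings are invented:

    import numpy as np

    def loglik(beta, X, y):
        z = X @ beta
        return np.sum(y * z - np.log1p(np.exp(z)))       # binary logistic log-likelihood

    def eda_logistic(X, y, n_gen=50, pop=200, elite=40, seed=0):
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        mu, sigma = np.zeros(d), np.ones(d)
        for _ in range(n_gen):
            candidates = rng.normal(mu, sigma, size=(pop, d))       # sample a population
            scores = np.array([loglik(b, X, y) for b in candidates])
            best = candidates[np.argsort(scores)[-elite:]]          # keep the elite set
            mu, sigma = best.mean(axis=0), best.std(axis=0) + 1e-3  # refit the Gaussian
        return mu

    rng = np.random.default_rng(7)
    X = rng.normal(size=(150, 4))
    true_beta = np.array([1.5, -1.0, 0.0, 0.5])
    y = (rng.uniform(size=150) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
    print(eda_logistic(X, y))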
Showing results 1 — 15 out of 2,076 results