227,209 Hits in 2.9 sec

Adaptive Reduced Rank Regression [article]

Qiong Wu, Felix Ming Fai Wong, Zhenming Liu, Yanhua Li, Varun Kanade
2020 arXiv   pre-print
We study the low rank regression problem $y = Mx + \epsilon$, where $x$ and $y$ are $d_1$- and $d_2$-dimensional vectors, respectively.  ...  Existing algorithms are designed for settings where $n$ is typically as large as $\mathrm{rank}(M)(d_1+d_2)$.  ...  Our baselines include ridge regression ("Ridge"), reduced rank ridge regression [31] ("Reduced ridge"), LASSO ("Lasso"), nuclear norm regularized regression ("Nuclear norm"), and reduced rank regression  ...
arXiv:1905.11566v3 fatcat:wscpkr4s5rfkhef4uok7pqga7e
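The classical estimator behind this line of work can be sketched in a few lines: fit ordinary least squares, then truncate the SVD of the fitted values to the target rank. The following is a minimal numpy illustration of plain reduced rank regression (not the adaptive algorithm of the paper above); the function name and problem sizes are illustrative.

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Classical reduced rank regression: fit Y ~= X @ M with an OLS
    estimate, then truncate the SVD of the fitted values X @ M_ols to
    the given rank and map back to a coefficient matrix."""
    # Minimum-norm OLS solution via the pseudo-inverse
    M_ols = np.linalg.pinv(X) @ Y
    # Truncate the SVD of the fitted matrix to the target rank
    U, s, Vt = np.linalg.svd(X @ M_ols, full_matrices=False)
    fitted = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Coefficient matrix that reproduces the rank-truncated fit
    return np.linalg.pinv(X) @ fitted
```

On a synthetic problem with a rank-2 coefficient matrix and light noise, the estimator recovers a coefficient matrix of rank at most 2 that is close to the truth.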

Reduced rank regression via adaptive nuclear norm penalization [article]

Kun Chen, Hongbo Dong, Kung-Sik Chan
2012 arXiv   pre-print
Adaptive nuclear-norm penalization is proposed for low-rank matrix approximation, by which we develop a new reduced-rank estimation method for general high-dimensional multivariate regression problems.  ...  This new reduced-rank estimator is computationally efficient, has a continuous solution path and possesses a better bias-variance property than its classical counterpart.  ...  Table 3: Performance of reduced-rank estimators on breast cancer data. Setting 1 regresses GEPs on CNVs, and setting 2 regresses CNVs on GEPs.  ...
arXiv:1201.0381v2 fatcat:hlzdvxqdevcfxcnqy4xccog6ca
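The core idea of adaptive nuclear-norm penalization is that the solution comes from an adaptively soft-thresholded SVD: large singular values get small penalty weights and are barely shrunk, while small (noise-level) singular values are thresholded to zero. The following is a hedged sketch of that thresholding step only; `lam` and `gamma` are illustrative tuning parameters, not the paper's calibrated choices.

```python
import numpy as np

def adaptive_svt(A, lam, gamma=2.0):
    """Adaptive singular value thresholding: shrink each singular value
    d_i by lam * d_i**(-gamma), so strong signal directions are nearly
    untouched while weak ones are zeroed out."""
    U, d, Vt = np.linalg.svd(A, full_matrices=False)
    w = d ** (-gamma)                       # adaptive weights: small d -> large weight
    d_shrunk = np.maximum(d - lam * w, 0.0)  # soft-threshold, clip at zero
    return (U * d_shrunk) @ Vt
```

With singular values 10, 5, and 0.1 and a small `lam`, the two strong directions survive essentially intact while the weak one is removed, reducing the rank.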

Adaptive Estimation in Two-way Sparse Reduced-rank Regression [article]

Zhuang Ma, Zongming Ma, Tingni Sun
2016 arXiv   pre-print
This paper studies the problem of estimating a large coefficient matrix in a multiple response linear regression model when the coefficient matrix could be both of low rank and sparse in the sense that  ...  In what follows, we refer to model (1) with these structures as the two-way sparse reduced-rank regression model.  ...  Model (1) with such a structure has been referred to as reduced-rank regression and has been widely used in econometrics.  ... 
arXiv:1403.1922v2 fatcat:qnoeaofavbbktobbuahrt6hf6a

Asymptotic properties of adaptive group Lasso for sparse reduced rank regression

Kejun He, Jianhua Z. Huang
2016 Stat  
This paper studies the asymptotic properties of the penalized least squares estimator using an adaptive group Lasso penalty for the reduced rank regression.  ...  The group Lasso penalty is defined in the way that the regression coefficients corresponding to each predictor are treated as one group.  ...  Reduced rank regression (Izenman, 1975) is an effective way of taking into account the possible interrelationships between the response variables by imposing a constraint on the rank of C to be less  ... 
doi:10.1002/sta4.123 fatcat:shwuv3n3xjcgppdcjyq3ly5mh4
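When the coefficients of each predictor form one group, the (adaptive) group Lasso acts through a group-wise soft-thresholding proximal step: a whole row of the coefficient matrix is either shrunk or zeroed out, which removes that predictor from the model. Below is a minimal sketch of that proximal operator under illustrative adaptive weights; it is one building block of such algorithms, not the full estimator studied in the paper above.

```python
import numpy as np

def group_soft_threshold(C, lam, weights):
    """Group-wise soft thresholding: each row of C (one predictor's
    coefficients across all responses) is a group. Rows with norm below
    lam * weights[j] are zeroed; others are shrunk toward zero.
    `weights` would typically be inverse norms of an initial estimate."""
    C = np.asarray(C, dtype=float)
    out = np.zeros_like(C)
    for j, row in enumerate(C):
        nrm = np.linalg.norm(row)
        if nrm > lam * weights[j]:
            out[j] = row * (1.0 - lam * weights[j] / nrm)
    return out
```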

Reduced rank regression via adaptive nuclear norm penalization

K. Chen, H. Dong, K.-S. Chan
2013 Biometrika  
We propose an adaptive nuclear norm penalization approach for low-rank matrix approximation, and use it to develop a new reduced rank estimation method for high-dimensional multivariate regression.  ...  However, we show that the proposed non-convex penalized regression method has a global optimal solution obtained from an adaptively soft-thresholded singular value decomposition.  ...  We modified the preceding cross-validation procedure for comparing the reduced rank subset regression methods.  ...
doi:10.1093/biomet/ast036 pmid:25045172 pmcid:PMC4101086 fatcat:mhtbt6nucrcarlxds3vcwzh47m

Reducing Racial Bias in Facial Age Prediction using Unsupervised Domain Adaptation in Regression [article]

Apoorva Gokhale, Astuti Sharma, Kaustav Datta, Savyasachi
2021 arXiv   pre-print
We experiment extensively and compare various domain adaptation approaches for the task of regression.  ...  difference and the rank of the first identity with respect to the other in terms of their ages.  ...  Target loss comparison: Rank + Regression 17.87, Regression 18.67.  ...
arXiv:2104.01781v1 fatcat:uwecbefu3jcydfanir7ke5kjei

High-Dimensional Uncertainty Quantification via Active and Rank-Adaptive Tensor Regression [article]

Zichang He, Zheng Zhang
2021 arXiv   pre-print
This paper proposes a novel tensor regression method to address these two challenges.  ...  This issue was mitigated recently by low-rank tensor methods.  ...  The proposed tensor regression framework is compared with a fixed rank method, a random and an adaptive exploration sampling method (shown in Fig. 2).  ...
arXiv:2009.01993v3 fatcat:yevgzw44zzeonewuffgxjmfnka

High-Dimensional Uncertainty Quantification via Tensor Regression with Rank Determination and Adaptive Sampling [article]

Zichang He, Zheng Zhang
2021 arXiv   pre-print
We also propose a two-stage adaptive sampling method to reduce the simulation cost.  ...  Recently, low-rank tensor methods have been developed to mitigate this issue, but two fundamental challenges remain open: how to automatically determine the tensor rank and how to adaptively pick the informative  ...  We propose a two-stage adaptive sampling method to reduce the simulation cost.  ... 
arXiv:2103.17236v2 fatcat:pxkd53ojnval5dnp5nw7yajmgi

Investigating online low-footprint speaker adaptation using generalized linear regression and click-through data

Yong Zhao, Jinyu Li, Jian Xue, Yifan Gong
2015 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
We show that this adaptation technique can be formulated in a linear regression fashion, analogous to other speaker adaptation algorithms that apply additional linear transformations to the DNN layers.  ...  In this paper, we propose a novel low-footprint adaptation technique for DNN that adapts the DNN model through node activation functions.  ...  the number of parameters is reduced from mn to (m + n)k.  ...
doi:10.1109/icassp.2015.7178784 dblp:conf/icassp/ZhaoLXG15 fatcat:btba63cxlfbyjbqwob2w5g2aye
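The parameter-count claim in the snippet above follows from factoring an m x n transform W into A (m x k) times B (k x n): storage drops from m*n values to (m + n)*k. A small numpy illustration using a truncated SVD as the factorization; the sizes m, n, k here are illustrative, not the paper's.

```python
import numpy as np

# Factor an m x n adaptation matrix W into A @ B with inner rank k,
# reducing stored parameters from m*n to (m + n)*k.
m, n, k = 512, 512, 16
rng = np.random.default_rng(0)
W = rng.standard_normal((m, n))

U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :k] * s[:k]        # m x k factor (singular values absorbed)
B = Vt[:k]                  # k x n factor

full_params = m * n             # 262,144 for these sizes
low_rank_params = (m + n) * k   # 16,384 for these sizes
```

For k much smaller than min(m, n), the low-rank footprint is a small fraction of the full matrix, which is what makes per-speaker storage of the adaptation parameters practical.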

Deep Ranking with Adaptive Margin Triplet Loss [article]

Mai Lan Ha, Volker Blanz
2021 arXiv   pre-print
We propose a simple modification from a fixed margin triplet loss to an adaptive margin triplet loss.  ...  The adaptive margins only need to be computed once before the training, which is much less expensive than generating triplets after every epoch as in the fixed margin case.  ...  We compare the ranking performance (SROCC) of Mean Opinion Score (MOS) regression, triplet loss with a fixed margin, and our proposed triplet loss with an adaptive margin.  ...
arXiv:2107.06187v1 fatcat:un45mxyg7jf2xd65jmu4qzc44q
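The modification is small: the standard triplet loss max(0, d(a, p) - d(a, n) + margin) is kept, but the margin becomes a per-triplet value derived from ground-truth label distances, computed once before training. A hedged numpy sketch of the loss itself (an illustrative formulation, not the paper's exact one):

```python
import numpy as np

def adaptive_triplet_loss(anchor, positive, negative, margin):
    """Triplet loss with a per-triplet margin. In the adaptive-margin
    scheme, `margin` is precomputed from how far apart the positive and
    negative labels are, instead of being one fixed constant."""
    d_ap = np.linalg.norm(anchor - positive)  # anchor-positive distance
    d_an = np.linalg.norm(anchor - negative)  # anchor-negative distance
    return max(0.0, d_ap - d_an + margin)
```

A triplet whose negative is already far enough away (relative to its margin) contributes zero loss; tightening the margin for that same triplet reactivates it.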

Kernel smoothing for jagged edge reduction

Mohammad Aghagolzadeh, Andrew Segall
2013 2013 IEEE International Conference on Acoustics, Speech and Signal Processing  
We consider the kernel regression framework and propose a reduced-rank quadratic adaptive method that adapts to the local gradient direction.  ...  Namely, that the method is effective in reducing jaggy artifacts without blurring meaningful image structure.  ...  We propose a rank-reduced regression that adapts to the local gradient attributes that are computed using robust measures of structure.  ... 
doi:10.1109/icassp.2013.6638100 dblp:conf/icassp/AghagolzadehS13 fatcat:cqxxy6lwozh4hn5dxsfrfbk2di

A note on rank reduction in sparse multivariate regression

Kun Chen, Kung-Sik Chan
2015 Journal of Statistical Theory and Practice  
A reduced-rank regression with sparse singular value decomposition (RSSVD) approach was proposed by Chen et al. for conducting variable selection in a reduced-rank model.  ...  Here, we generalize the method to also perform rank reduction, and enable its usage in reduced-rank vector autoregressive (VAR) modeling to perform automatic rank determination and order selection.  ...  The reduced-rank regression estimator was used as the initial estimator for conducting adaptive regularized estimation, and the method is shown to be robust against overspecification of the initial rank  ... 
doi:10.1080/15598608.2015.1081573 pmid:26997938 pmcid:PMC4797956 fatcat:nspjrdbz3rg47bblolvpgmbuoq

Assessing climate-induced agricultural vulnerable coastal communities of Bangladesh using machine learning techniques

Md. Jakariya, Md. Sajadul Alam, Md. Abir Rahman, Silvia Ahmed, M.M. Lutfe Elahi, Abu Mohammad Shabbir Khan, Saman Saad, H.M. Tamim, Taoseef Ishtiak, Sheikh Mohammad Sayem, Mirza Shawkat Ali, Dilruba Akter
2020 Science of the Total Environment  
The vulnerability factors, from the lowest rank to the second-highest rank and so on, were dropped one by one, and new linear and Bayesian ridge regression models were trained with continuously reduced sets of  ...  To get a unified ranking, the ranks produced by the two regression models were summed and sorted, with the factors in ascending order according to their sum of ranks.  ...
doi:10.1016/j.scitotenv.2020.140255 pmid:32721709 pmcid:PMC7297150 fatcat:vxqarqinebcuxcxhqwg4mt6acu

Quality Assessment of Web Services Using Multivariate Adaptive Regression Splines

Lov Kumar, Santanu Kumar Rath
2015 2015 Asia-Pacific Software Engineering Conference (APSEC)  
In this paper, nine QoS parameters have been considered as input for designing a model using multivariate adaptive regression splines (MARS) to select a suitable web service.  ...  These may be able to classify the web services with higher accuracy and also reduce the misclassification errors.  ...  adaptive regression splines (MARS).  ...
doi:10.1109/apsec.2015.35 dblp:conf/apsec/KumarR15 fatcat:5gsfqlqgpffdnpjvxjins5ngs4
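MARS builds its model from pairs of hinge functions max(0, x - t) and max(0, t - x) placed at data-driven knots t, then selects among them with a forward/backward pass. The sketch below constructs only the hinge basis, as a minimal illustration of what MARS searches over (not the full selection algorithm used in the paper):

```python
import numpy as np

def hinge_basis(x, knots):
    """Build the pair of MARS hinge functions max(0, x - t) and
    max(0, t - x) for each knot t, returning one column per hinge."""
    x = np.asarray(x, dtype=float)
    cols = []
    for t in knots:
        cols.append(np.maximum(0.0, x - t))  # active to the right of t
        cols.append(np.maximum(0.0, t - x))  # active to the left of t
    return np.column_stack(cols)
```

A linear model fit on these columns is piecewise linear with kinks at the knots, which is how MARS captures nonlinear QoS-to-quality relationships while staying interpretable.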

Disjoint Contrastive Regression Learning for Multi-Sourced Annotations [article]

Xiaoqian Ruan, Gaoang Wang
2021 arXiv   pre-print
samples of the same annotator, with the assumption that the ranking of samples from the same annotator is unanimous.  ...  inconsistency and bias among different annotators are harmful to the model training, especially for qualitative and subjective tasks. To address this challenge, in this paper, we propose a novel contrastive regression  ...  ranking accuracy (PRA), Spearman's rank order correlation coefficient  ...  compared with the regression baseline, i.e., a direct regression with mean squared error (MSE) as the loss function.  ...
arXiv:2112.15411v1 fatcat:xemfggpn4jh65aneklufkkqs7q
Showing results 1 — 15 out of 227,209 results