26,450 Hits in 6.2 sec

A novel extension of Generalized Low-Rank Approximation of Matrices based on multiple-pairs of transformations [article]

Soheil Ahmadi, Mansoor Rezghi
2019 arXiv   pre-print
Due to these issues, in recent years methods such as Generalized Low-Rank Approximation of Matrices (GLRAM) and Multilinear PCA (MPCA) have been proposed that deal with the data in its native format.  ...  To overcome this drawback of multilinear methods like GLRAM, we propose a new method that generalizes GLRAM and, while preserving its merits, has a larger search space.  ...  Recently, an extension of DR based on a low-rank approximation of matrix data, named Generalized Low-Rank Approximation of Matrices (GLRAM), has been investigated.  ... 
arXiv:1808.10632v3 fatcat:6f54int4jbclbofjbkz7m7mfdq
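The GLRAM idea this entry builds on, compressing a whole collection of matrices with one shared pair of transformations, can be sketched in NumPy. This is a minimal illustration of the standard alternating iteration (Ye, 2005), not the paper's multi-pair extension; the function name and defaults are chosen here for illustration.

```python
import numpy as np

def glram(As, r1, r2, iters=20):
    """Generalized low-rank approximation of matrices: find a shared
    column transform L (m x r1) and row transform R (n x r2) so that
    each A_i is approximated by L @ D_i @ R.T with a small core D_i."""
    m, n = As[0].shape
    R = np.eye(n)[:, :r2]                       # simple initial guess
    for _ in range(iters):
        # fix R, update L: top eigenvectors of sum_i A_i R R^T A_i^T
        ML = sum(A @ R @ R.T @ A.T for A in As)
        _, V = np.linalg.eigh(ML)               # eigenvalues ascending
        L = V[:, -r1:]
        # fix L, update R: top eigenvectors of sum_i A_i^T L L^T A_i
        MR = sum(A.T @ L @ L.T @ A for A in As)
        _, V = np.linalg.eigh(MR)
        R = V[:, -r2:]
    Ds = [L.T @ A @ R for A in As]              # reduced core matrices
    return L, R, Ds
```

Each iteration alternates two eigenproblems, so the per-step cost stays linear in the number of input matrices.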

Iterative Deep Model Compression and Acceleration in the Frequency Domain

Yao Zeng, Xusheng Liu, Lintan Sun, Wenzhong Li, Yuchu Fang, Sanglu Lu
2021 Asian Conference on Machine Learning  
Then we propose an iterative model compression method to decompose the frequency matrices with a sample-based low-rank approximation algorithm, and then fine-tune and recompose the low-rank matrices gradually  ...  Inspired by the sparsity and low-rank properties of weight matrices in the frequency domain, we propose a novel frequency pruning framework for model compression and acceleration while maintaining high performance  ...  BE2018116), the Collaborative Innovation Center of Novel Software Technology and Industrialization. The corresponding author is Sanglu Lu.  ... 
dblp:conf/acml/ZengLSLFL21 fatcat:5uymhwvringtjb7s6sskiysl44

A particle-based variational approach to Bayesian Non-negative Matrix Factorization [article]

M. Arjumand Masood, Finale Doshi-Velez
2018 arXiv   pre-print
We address these issues through a particle-based variational approach to Bayesian NMF that only requires the joint likelihood to be differentiable for tractability, uses a novel initialization technique  ...  to identify multiple modes in the posterior, and allows domain experts to inspect a 'small' set of factorizations that faithfully represent the posterior.  ...  Acknowledgments We would like to thank Andreas Krause, Mike Hughes, Melanie Pradier, Nick Foti, Omer Gottesman and other members of the Dtak lab at Harvard for many helpful conversations and feedback.  ... 
arXiv:1803.06321v1 fatcat:g4hpi2ff25es7mjwck5mxoik2a
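For context, the point-estimate baseline that this Bayesian treatment generalizes is plain NMF with Lee–Seung multiplicative updates; a minimal sketch (function name and iteration budget are illustrative, not from the paper):

```python
import numpy as np

def nmf(X, rank, iters=500, seed=0):
    """Lee-Seung multiplicative updates for X ~ W @ H with W, H >= 0.
    Nonnegative initialization plus multiplicative updates keeps all
    factor entries nonnegative throughout."""
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], rank)) + 0.1
    H = rng.random((rank, X.shape[1])) + 0.1
    eps = 1e-10                      # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

A single run like this returns one local optimum; the paper's point is precisely that the posterior over such factorizations is multimodal, so one (W, H) pair can be misleading.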

Transformations of matrix structures work again

Victor Y. Pan
2015 Linear Algebra and its Applications  
algorithm for the inversion of matrices having one of these structures to inverting the matrices with the structures of the three other types.  ...  The surprising power of this approach has been demonstrated in a number of works, which culminated in ingenious numerically stable algorithms that approximated the solution of a nonsingular Toeplitz linear  ...  with low Hankel rank and rank structured matrices.  ... 
doi:10.1016/j.laa.2014.09.004 fatcat:hi2jma23jrbapmiibgzywljsfy

Transformations of Matrix Structures Work Again II [article]

Victor Y. Pan
2013 arXiv   pre-print
successful algorithm that inverts matrices of one of these classes to inverting matrices of the three other classes.  ...  We decrease the arithmetic cost of the known algorithms from quadratic to nearly linear, and similarly for the computations with the matrices of a more general class having structures of Vandermonde and  ...  Local low-rank approximation of CV matrices: Theorem 29 defines a low-rank approximation of a CV matrix where its two knot sets are separated by a global center c.  ... 
arXiv:1311.3729v1 fatcat:duw5mjby2faczbu6mpcfdvliji

Transformations of Matrix Structures Work Again [article]

Victor Y. Pan
2013 arXiv   pre-print
algorithm for the inversion of matrices having one of these structures to inverting the matrices with the structures of the three other types.  ...  The surprising power of this approach has been demonstrated in a number of works, which culminated in ingenious numerically stable algorithms that approximated the solution of a nonsingular Toeplitz linear  ...  with low Hankel rank and rank structured matrices.  ... 
arXiv:1303.0353v1 fatcat:ccy3epwqozduhjwobaksrhigee

Interpolative Butterfly Factorization

Yingzhou Li, Haizhao Yang
2017 SIAM Journal on Scientific Computing  
A preliminary interpolative butterfly factorization is constructed based on interpolative low-rank approximations of the complementary low-rank matrix.  ...  A novel sweeping matrix compression technique further compresses the preliminary interpolative butterfly factorization via a sequence of structure-preserving low-rank approximations.  ...  This section provides a brief description of the overall structure of the butterfly algorithm based on the interpolative low-rank approximation in the previous section.  ... 
doi:10.1137/16m1074941 fatcat:gsupanh3gfdgrhbiaubiba3fsq

Scatterbrain: Unifying Sparse and Low-rank Attention Approximation [article]

Beidi Chen, Tri Dao, Eric Winsor, Zhao Song, Atri Rudra, Christopher Ré
2021 arXiv   pre-print
Recent advances in efficient Transformers have exploited either the sparsity or low-rank properties of attention matrices to reduce the computational and memory bottlenecks of modeling long sequences.  ...  On a pre-trained T2T Vision transformer, even without fine-tuning, Scatterbrain can reduce attention memory by 98% at the cost of only a 1% drop in accuracy.  ...  The Mobilize Center is a Biomedical Technology Resource Center, funded by the NIH National Institute of Biomedical Imaging and Bioengineering through Grant P41EB027060. The U.S.  ... 
arXiv:2110.15343v1 fatcat:ycfcx3fujzebng2zl4vfk3xr5e
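The sparse-plus-low-rank decomposition this entry unifies can be illustrated on an explicitly formed attention matrix. Note the hedge: Scatterbrain itself combines LSH-based sparsity with kernel-feature low-rank approximation and never materializes the full attention matrix; the sketch below (with names chosen here) only shows why a sparse correction on top of a low-rank term tightens the approximation.

```python
import numpy as np

def attention(Q, K):
    """Dense softmax attention matrix, formed explicitly for illustration."""
    S = Q @ K.T / np.sqrt(Q.shape[1])
    E = np.exp(S - S.max(axis=1, keepdims=True))
    return E / E.sum(axis=1, keepdims=True)

def sparse_plus_lowrank(A, rank, topk):
    """Approximate A by a rank-`rank` term plus a sparse correction that
    keeps the `topk` largest-magnitude residual entries in each row."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    low = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    resid = A - low
    idx = np.argpartition(np.abs(resid), -topk, axis=1)[:, -topk:]
    sparse = np.zeros_like(A)
    np.put_along_axis(sparse, idx,
                      np.take_along_axis(resid, idx, axis=1), axis=1)
    return low + sparse
```

Since the sparse term zeroes out the largest residual entries, the combined approximation error can only shrink as `topk` grows.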

A Flexible Lossy Depth Video Coding Scheme Based on Low-rank Tensor Modelling and HEVC Intra Prediction for Free Viewpoint Video [article]

Mansi Sharma, Santosh Kumar
2021 arXiv   pre-print
In this paper, we introduce a novel low-complexity scheme for depth video compression based on low-rank tensor decomposition and HEVC intra coding.  ...  Tensor factorization into a set of factor matrices following CANDECOMP/PARAFAC (CP) decomposition via alternating least squares gives a low-rank approximation of the scene geometry.  ... 
arXiv:2104.04678v1 fatcat:qjadbghnqzckjcjkxilqp5tbve
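The CP decomposition via alternating least squares mentioned in the abstract can be sketched for a 3-way tensor as follows. This is the textbook ALS loop, not the paper's coding pipeline; function names and the iteration budget are illustrative.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product: row (i, j) holds A[i] * B[j]."""
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def cp_als(T, rank, iters=200, seed=0):
    """CANDECOMP/PARAFAC of a 3-way tensor via alternating least squares:
    cyclically solve the linear least-squares problem for each factor
    while holding the other two fixed."""
    rng = np.random.default_rng(seed)
    dims = T.shape
    F = [rng.standard_normal((d, rank)) for d in dims]
    for _ in range(iters):
        for m in range(3):
            a, b = [F[i] for i in range(3) if i != m]
            X = np.moveaxis(T, m, 0).reshape(dims[m], -1)  # mode-m unfolding
            G = (a.T @ a) * (b.T @ b)       # Gram matrix of the Khatri-Rao
            F[m] = X @ khatri_rao(a, b) @ np.linalg.pinv(G)
    return F
```

The tensor is then reconstructed as `np.einsum('ir,jr,kr->ijk', *F)`, which is the low-rank approximation of the scene geometry the snippet refers to.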

A Hierarchical Approach for Lossy Light Field Compression With Multiple Bit Rates Based on Tucker Decomposition via Random Sketching

Joshitha Ravishankar, Mansi Sharma
2022 IEEE Access  
The complete end-to-end processing pipeline can flexibly work for multiple bitrates and is adaptable for a variety of multi-view autostereoscopic platforms.  ...  The compression performance of the proposed scheme is analyzed on real light fields.  ...  A pair of steps to generate noise-refined depth maps for selected perspective views was elaborated in [35] .  ... 
doi:10.1109/access.2022.3177601 fatcat:tt44auspzjc7leiz54bcoux4ey
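The Tucker decomposition underlying this entry's pipeline produces a small core tensor plus one factor matrix per mode. As a minimal deterministic reference point (the paper itself computes the factors via random sketching), the higher-order SVD can be sketched as:

```python
import numpy as np

def hosvd(T, ranks):
    """Higher-order SVD: take factor U_n from the top left singular
    vectors of each mode-n unfolding, then project T onto all factors
    to obtain the core tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        X = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(X, full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        # contract mode `mode` with U, then move the new axis back in place
        core = np.moveaxis(np.tensordot(core, U, axes=(mode, 0)), -1, mode)
    return core, factors
```

For light fields, the small core plus thin factors is what gets quantized and coded; reconstruction multiplies the core back by each factor.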

Improving Word Embedding Factorization for Compression Using Distilled Nonlinear Neural Decomposition [article]

Vasileios Lioutas, Ahmad Rashid, Krtin Kumar, Md Akmal Haidar, Mehdi Rezagholizadeh
2020 arXiv   pre-print
In this paper, we propose Distilled Embedding, an (input/output) embedding compression method based on low-rank matrix decomposition and knowledge distillation.  ...  Word embeddings are vital components of Natural Language Processing (NLP) models and have been extensively explored. However, they consume a lot of memory, which poses a challenge for edge deployment.  ...  Low-rank Factorization Low-rank approximation of weight matrices, using SVD, is a natural way to compress deep learning based NLP models.  ... 
arXiv:1910.06720v2 fatcat:7za6klgsrjdknbtc6da4hjoqtq
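The SVD baseline the snippet mentions, replacing a large embedding matrix with two thin factors, is a few lines of NumPy. This is the plain linear factorization only; the paper's contribution adds a nonlinear, distillation-trained decomposition on top. The function name is chosen here.

```python
import numpy as np

def factorize_embedding(E, rank):
    """Truncated-SVD factorization E ~ A @ B, with A: (vocab, rank)
    and B: (rank, dim). Storage drops from vocab*dim floats to
    rank*(vocab + dim)."""
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    return U[:, :rank] * s[:rank], Vt[:rank]
```

By the Eckart–Young theorem this is the best rank-`rank` approximation in the Frobenius norm, which makes it the natural baseline to compare learned decompositions against.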

Face identification with top-push constrained generalized low-rank approximation of matrices

Yuanjian Chen, Yanna Zhao, Yunlong He, Fangzhou Xu, Weikuan Jia, Jian Lian, Yuanjie Zheng
2019 IEEE Access  
To this end, we formulate the learning process under the framework of generalized low-rank approximation of matrices (GLRAM) supervised with a top-push constraint.  ...  INDEX TERMS Feature learning, generalized low-rank approximation, face identification, top-push.  ...  The TFL possesses various attractive merits. 1) The generalized low-rank approximation of matrices takes a sequence of matrices as the input.  ... 
doi:10.1109/access.2019.2947164 fatcat:kpvcjfpferd7pejzirwmr4s3z4

Scaling Private Deep Learning with Low-Rank and Sparse Gradients [article]

Ryuichi Ito, Seng Pei Liew, Tsubasa Takahashi, Yuya Sasaki, Makoto Onizuka
2022 arXiv   pre-print
The gradient updates are first approximated with a pair of low-rank matrices.  ...  Applying Differentially Private Stochastic Gradient Descent (DPSGD) to training modern, large-scale neural networks such as transformer-based models is a challenging task, as the magnitude of noise added  ...  Initially, the gradient updates are approximated with a pair of low-rank matrices.  ... 
arXiv:2207.02699v1 fatcat:mtz2gd4k5zdy3lztobsfj4soa4

Low-rank Approximation of a Matrix: Novel Insights, New Progress, and Extensions [article]

Victor Y. Pan, Liang Zhao
2016 arXiv   pre-print
Low-rank approximation of a matrix by means of random sampling has been consistently efficient in its empirical studies by many scientists who applied it with various sparse and structured multipliers,  ...  We also outline extensions of low-rank approximation algorithms and of our progress to the Least Squares Regression, the Fast Multipole Method, and the Conjugate Gradient algorithms.  ...  Parts (i) and (ii) of Theorem 4.1 imply parts (i) of Theorems 1.3 and 1.4.  ...  From low-rank representation to low-rank approximation (a basic step): our extension of the above results to the proof of Theorems  ... 
arXiv:1510.06142v6 fatcat:7e3zquvpyjb2nh3ycr6433xsem
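The random-sampling approach this entry studies can be illustrated with the basic randomized range finder: multiply by a random test matrix, orthonormalize the sketch, and project. The Gaussian multiplier below is a stand-in; the paper's point is precisely that sparse and structured multipliers also work well in practice. Names are illustrative.

```python
import numpy as np

def randomized_lowrank(A, rank, oversample=10, seed=0):
    """Randomized range finder: sketch A with a random test matrix,
    orthonormalize the result, and project, so that A ~ Q @ B with
    Q having rank + oversample orthonormal columns."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((A.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(A @ G)   # orthonormal basis for the sampled range
    return Q, Q.T @ A
```

When A has numerical rank at most `rank`, the sketch captures its column space almost surely and the factorization is exact up to rounding.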

EnHiC: learning fine-resolution Hi-C contact maps using a generative adversarial framework

Yangyang Hu, Wenxiu Ma
2021 Bioinformatics  
Results: In this work, we propose a novel method, EnHiC, for predicting high-resolution Hi-C matrices from low-resolution input data based on a generative adversarial network (GAN) framework.  ...  Inspired by non-negative matrix factorization, our model fully exploits the unique properties of Hi-C matrices and extracts rank-1 features from multi-scale low-resolution matrices to enhance the resolution  ...  National Institute of Health [R35GM133678]. Conflict of Interest: none declared.  ... 
doi:10.1093/bioinformatics/btab272 pmid:34252966 fatcat:gkjdr5mqlvfyhhjqhx2eextani