2,599 Hits in 4.4 sec

Fast Parallel Randomized Algorithm for Nonnegative Matrix Factorization with KL Divergence for Large Sparse Datasets [article]

Duy Khuong Nguyen, Tu Bao Ho
2016 arXiv   pre-print
Nonnegative Matrix Factorization (NMF) with Kullback-Leibler divergence (NMF-KL) is one of the most significant NMF problems and is equivalent to Probabilistic Latent Semantic Indexing (PLSI), which has been  ...  Specifically, sparse models provide a more concise understanding of the appearance of attributes over latent components, while sparse representation provides concise interpretability of the contribution of  ...  THEORETICAL ANALYSIS In this section, we analyze the convergence and complexity of Algorithm 1 and Algorithm 2.  ... 
arXiv:1604.04026v1 fatcat:ws237xmglfcuvaefqosfi733zi
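For context on the objective this entry optimizes, the classic multiplicative-update scheme for NMF under generalized KL divergence (the Lee–Seung baseline; a generic sketch, not the paper's accelerated parallel algorithm) can be written as:

```python
import numpy as np

def nmf_kl(V, r, n_iter=200, eps=1e-10, seed=0):
    """Multiplicative updates minimizing the generalized KL divergence
    D(V || WH) = sum(V * log(V / WH) - V + WH) over W, H >= 0.
    Classic Lee-Seung scheme, shown only as background for NMF-KL."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    ones = np.ones_like(V)
    for _ in range(n_iter):
        H *= (W.T @ (V / (W @ H + eps))) / (W.T @ ones + eps)
        W *= ((V / (W @ H + eps)) @ H.T) / (ones @ H.T + eps)
    return W, H

def kl_div(V, WH, eps=1e-10):
    """Generalized KL divergence between V and its reconstruction WH."""
    return float(np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH))
```

Both updates multiply the current factor by a ratio of nonnegative terms, so positive initialization keeps W and H nonnegative throughout; the monotone decrease of the divergence is the standard Lee–Seung result.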

Accelerated parallel and distributed algorithm using limited internal memory for nonnegative matrix factorization

Duy Khuong Nguyen, Tu Bao Ho
2016 Journal of Global Optimization  
Nonnegative matrix factorization (NMF) is a powerful technique for dimension reduction, extracting latent factors and learning part-based representation.  ...  sub-space of passive variables, where r is the number of latent components, and µ and L are bounded as 1/2 ≤ µ ≤ L ≤ r.  ...  Acknowledgement This work was supported by Asian Office of Aerospace R&D under agreement number FA2386-15-1-4006; and 911 Scholarship from Vietnam Ministry of Education and Training.  ... 
doi:10.1007/s10898-016-0471-z fatcat:x6nyvaic5fhaxk5r7mf6hxvm6m

Cross-Domain Recommendation via Cluster-Level Latent Factor Model [chapter]

Sheng Gao, Hao Luo, Da Chen, Shantao Li, Patrick Gallinari, Jun Guo
2013 Lecture Notes in Computer Science  
dataset in the target domain of interest, which typically assume that multiple domains share the latent common rating pattern based on the user-item co-clustering.  ...  In fact, there exists a considerable number of publicly available user-item rating datasets from multiple domains, which could have dependencies and correlations among the domains.  ...  Convergence Analysis Based on the above updating rules for learning different latent factors, we can prove that the learning algorithm is convergent. Theorem 1.  ... 
doi:10.1007/978-3-642-40991-2_11 fatcat:nqaltxr7evekrnw6pkgdlnab5i

Accelerated Parallel and Distributed Algorithm using Limited Internal Memory for Nonnegative Matrix Factorization [article]

Duy-Khuong Nguyen, Tu-Bao Ho
2015 arXiv   pre-print
Nonnegative matrix factorization (NMF) is a powerful technique for dimension reduction, extracting latent factors and learning part-based representation.  ...  The proposed algorithm takes advantage of both these algorithms to achieve a linear convergence rate of O((1 - 1/||Q||_2)^k) in optimizing each factor matrix when fixing the other one in the sub-space  ...  Acknowledgement This work was supported by Asian Office of Aerospace R&D under agreement number FA2386-13-1-4046; and 911 Scholarship from Vietnam Ministry of Education and Training.  ... 
arXiv:1506.08938v1 fatcat:hnetbeqcxjgcrgl732gckhxkci

Smooth nonnegative matrix and tensor factorizations for robust multi-way data analysis

Tatsuya Yokota, Rafal Zdunek, Andrzej Cichocki, Yukihiko Yamashita
2015 Signal Processing  
Moreover, we extend the proposed approach to the smooth nonnegative Tucker decomposition and smooth nonnegative canonical polyadic decomposition (also called smooth nonnegative tensor factorization).  ...  In this paper, we discuss new efficient algorithms for nonnegative matrix factorization (NMF) with smoothness constraints imposed on nonnegative components or factors.  ...  Furthermore, the GRBF-based methods work well for the nonnegative Tucker and CP models with both single and multi-way smooth representations.  ... 
doi:10.1016/j.sigpro.2015.02.003 fatcat:mhmmoff3tzgbraff2lrcydo4uq

Scaling the Indian Buffet Process via Submodular Maximization [article]

Colorado Reed, Zoubin Ghahramani
2013 arXiv   pre-print
Inference for latent feature models is inherently difficult as the inference space grows exponentially with the size of the input data and number of latent features.  ...  Our inference method scales linearly with the size of the input data, and we show the efficacy of our method on the largest datasets currently analyzed using an IBP model.  ...  Acknowledgements: CR was supported by the Winston Churchill Foundation of the United States, and Scaling the Indian Buffet Process via Submodular Maximization ZG was supported by EPSRC grant EP/I036575  ... 
arXiv:1304.3285v4 fatcat:vppxarbnjravtflhpv7fkqqxxu

A Cross-Domain Recommendation Model for Cyber-Physical Systems

Sheng Gao, Hao Luo, Da Chen, Shantao Li, Patrick Gallinari, Zhanyu Ma, Jun Guo
2013 IEEE Transactions on Emerging Topics in Computing  
INDEX TERMS Cyber-physical systems, cross-domain recommendation, latent factor model, rating patterns.  ...  An increased dependence on CPS has led to the collection of a vast amount of human-centric data, which brings the information overload problem across multiple domains.  ...  recommendation models: • NMF (Nonnegative Matrix Factorization) based model [12]: A single-domain model which employs the nonnegative matrix factorization method to learn the latent factors in each domain  ... 
doi:10.1109/tetc.2013.2274044 fatcat:ky7aquotxfavjovg266l7krgai

Online Continuous-Time Tensor Factorization Based on Pairwise Interactive Point Processes

Hongteng Xu, Dixin Luo, Lawrence Carin
2018 Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence  
We model such data based on pairwise interactive point processes, and the proposed framework connects pairwise tensor factorization with a feature-embedded point process.  ...  A continuous-time tensor factorization method is developed for event sequences containing multiple "modalities."  ...  Acknowledgments The work presented here was supported in part by DARPA, DOE, NIH, NSF and ONR.  ... 
doi:10.24963/ijcai.2018/403 dblp:conf/ijcai/XuLC18 fatcat:3mrtbkczi5ejnchaxkk2vcwsdi

Nonnegative Matrix Factorization: A Comprehensive Review

Yu-Xiong Wang, Yu-Jin Zhang
2013 IEEE Transactions on Knowledge and Data Engineering  
It incorporates the nonnegativity constraint and thus obtains the parts-based representation as well as enhancing the interpretability of the issue correspondingly.  ...  Nonnegative Matrix Factorization (NMF), a relatively novel paradigm for dimensionality reduction, has been in the ascendant since its inception.  ...  Le Li and the reviewers for their helpful comments and suggestions. This work was supported by National Natural Science Foundation of China under Grant 61171118.  ... 
doi:10.1109/tkde.2012.51 fatcat:ocxepl7gdrhszawqhsj36qhpme

Speech Enhancement Using an Iterative Posterior NMF [chapter]

Sunnydayal Vanambathina
2019 New Frontiers in Brain-Computer Interfaces [Working Title]  
The spectral components of speech and noise are modeled as Gamma and Rayleigh, respectively.  ...  A speech enhancement method based on regularized nonnegative matrix factorization (NMF) for nonstationary Gaussian noise is proposed.  ...  matrix factorization is a process that approximates a single nonnegative matrix as the product of two nonnegative matrices.  ... 
doi:10.5772/intechopen.84976 fatcat:3iwcqaim2nedhfpilpn4te2lke

Simplicial nonnegative matrix factorization

Duy Khuong Nguyen, Khoat Than, Tu Bao Ho
2013 The 2013 RIVF International Conference on Computing & Communication Technologies - Research, Innovation, and Vision for Future (RIVF)  
Nonnegative matrix factorization (NMF) plays a crucial role in machine learning and data mining, especially for dimension reduction and component analysis.  ...  After a decade of fast development, severe limitations still remain in NMF methods, including high complexity of instance inference and difficulty in controlling sparsity or interpreting the role of latent components  ...  This formulation considers NMF as component analysis, in which each data instance is modeled as a convex combination of latent components.  ... 
doi:10.1109/rivf.2013.6719865 dblp:conf/rivf/NguyenTH13 fatcat:5mmg3vrxz5fbxgkwc66wso6erm

Kullback-Leibler Principal Component for Tensors is not NP-hard [article]

Kejun Huang, Nicholas D. Sidiropoulos
2017 arXiv   pre-print
We study the problem of nonnegative rank-one approximation of a nonnegative tensor, and show that the globally optimal solution that minimizes the generalized Kullback-Leibler divergence can be efficiently  ...  For generalized KL approximation with higher ranks, the problem is for the first time shown to be equivalent to multinomial latent variable modeling, and an iterative algorithm is derived that resembles  ...  In a lot of applications, nonnegativity constraints are natural for tensor latent factors as well.  ... 
arXiv:1711.07925v1 fatcat:5sm4ty4xyjaadgttjddoshrrn4
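For the matrix special case of the rank-one result described above, the generalized-KL-optimal rank-one fit has a closed form: the outer product of the row and column marginals divided by the total sum, i.e. the independence model familiar from multinomial modeling. A minimal sketch:

```python
import numpy as np

def kl_rank_one(V):
    """Globally KL-optimal rank-one approximation of a nonnegative matrix:
    outer(row_sums, col_sums) / total_sum, i.e. the independence model."""
    return np.outer(V.sum(axis=1), V.sum(axis=0)) / V.sum()
```

A quick sanity check of stationarity is that the approximation reproduces both marginals of V exactly, which is what the first-order conditions of the generalized KL objective require.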

A Unified Probabilistic Model for Learning Latent Factors and Their Connectivities from High-Dimensional Data [article]

Ricardo Pio Monti, Aapo Hyvärinen
2018 arXiv   pre-print
We propose an efficient estimation algorithm based on score matching, and prove the identifiability of the model.  ...  The model is essentially a factor analysis model where the factors are allowed to have arbitrary correlations, while the factor loading matrix is constrained to express a community structure.  ...  We note that the introduction of marginally dependent latent variables is not possible in the context of traditional factor analysis, since the effects of factor connectivity and factor loadings cannot  ... 
arXiv:1805.09567v1 fatcat:nsobz6ot6zhq7kqeho7t6ju3ly

Static and Dynamic Source Separation Using Nonnegative Factorizations: A unified view

Paris Smaragdis, Cedric Fevotte, Gautham J. Mysore, Nasser Mohammadiha, Matthew Hoffman
2014 IEEE Signal Processing Magazine  
USING NONNEGATIVE FACTORIZATION MODELS FOR SEPARATION The basic model we will use to get started is a bilinear factorization of a nonnegative input V into two nonnegative matrices W and H, i.e., V ≈ WH  ...  Their multiplicative structure automatically ensures the nonnegativity of the updates given positive initialization.  ... 
doi:10.1109/msp.2013.2297715 fatcat:mlxiskdplradtck7zmi4jld7tm
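The bilinear model V ≈ WH and the nonnegativity-preserving multiplicative structure described in this entry can be sketched with the standard Euclidean-loss updates (a generic illustration, not the article's separation pipeline):

```python
import numpy as np

def nmf_frobenius(V, r, n_iter=300, eps=1e-10, seed=0):
    """Multiplicative updates minimizing ||V - WH||_F^2 over W, H >= 0.
    Each factor is multiplied by a ratio of nonnegative terms, so positive
    initialization keeps W and H nonnegative throughout."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

In source separation, the columns of W act as spectral templates and the rows of H as their time-varying activations, which is the static model this unified view starts from.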

Second-order Symmetric Non-negative Latent Factor Analysis [article]

Weiling Li, Xin Luo
2022 arXiv   pre-print
Aiming at addressing this issue, this study proposes to incorporate an efficient second-order method into SNLF, thereby establishing a second-order symmetric non-negative latent factor analysis model for  ...  The undirected network representation task can be efficiently addressed by a symmetric non-negative latent factor (SNLF) model, whose objective is clearly non-convex.  ...  of the Hessian matrix, which can be huge in  ...  (SNLF) model based on the idea of single element  ...  undirected network representation tasks.  ... 
arXiv:2203.02088v1 fatcat:qmrkbtn6zzhctnkpguka5belv4
Showing results 1 — 15 out of 2,599 results