1,625 Hits in 2.7 sec

Tensor network complexity of multilinear maps [article]

Per Austrin, Petteri Kaski, Kaie Kubjas
2018 arXiv   pre-print
We study tensor networks as a model of arithmetic computation for evaluating multilinear maps.  ...  While powerful, the model still has limitations, and we are able to show a number of unconditional lower bounds for various multilinear maps, including: (a) an Ω(n^bw(P)) time lower bound for counting  ...  We can view a minimum-cost execution of D as a rooted tree such that the root has degree two, all non-root internal vertices have degree three, and each leaf vertex is a tensor of D.  ... 
arXiv:1712.09630v3 fatcat:z6b4rbp2b5b3rhunxocyobwjx4
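The contraction-tree execution model described in the snippet (a rooted tree whose leaves are the network's tensors and whose internal vertices are pairwise contractions) can be sketched with a toy three-tensor chain; the tensor names, sizes, and the use of `numpy.einsum` are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# A tiny tensor network: matrices A (i,k), B (k,j), C (j,l).
# Different execution trees give the same result but, in general,
# different cost -- which is the point of the contraction-tree model.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))

# One execution tree: ((A,B),C) -- contract A with B first, then with C.
AB = np.einsum('ik,kj->ij', A, B)          # leaf-leaf contraction
left_first = np.einsum('ij,jl->il', AB, C)

# Another execution tree: (A,(B,C)).
BC = np.einsum('kj,jl->kl', B, C)
right_first = np.einsum('ik,kl->il', A, BC)

same = np.allclose(left_first, right_first)  # order changes cost, not value
```

For rectangular tensors the two trees would have different arithmetic cost even though they compute the same map.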

Tensor Network Complexity of Multilinear Maps

Per Austrin, Petteri Kaski, Kaie Kubjas, Michael Wagner
2018 Innovations in Theoretical Computer Science  
We study tensor networks as a model of arithmetic computation for evaluating multilinear maps.  ...  While powerful, the model still has limitations, and we are able to show a number of unconditional lower bounds for various multilinear maps, including: (a) an Ω(n^bw(P)) time lower bound for counting  ...  tensors in the network. [Figure: step-by-step contraction of a network of tensors A and B over shared indices i, j, k, ℓ into the matrix product A · B, equation (2).] The cost of performing one of  ...
doi:10.4230/lipics.itcs.2019.7 dblp:conf/innovations/AustrinKK19 fatcat:t4xmqd4rxfb27lgqzslpxtduli

Improving efficiency in convolutional neural networks with multilinear filters

Dat Thanh Tran, Alexandros Iosifidis, Moncef Gabbouj
2018 Neural Networks  
Instead of compressing a pre-trained network, in this work we propose a generic neural network layer structure employing multilinear projection as the primary feature extractor.  ...  The excellent performance of deep neural networks has enabled us to solve several automatization problems, opening an era of autonomous devices.  ...  In contrast, our proposed mapping is a special case of the general multilinear mapping using the mode-k product, in which the output tensor degenerates to a scalar.  ...
doi:10.1016/j.neunet.2018.05.017 pmid:29920430 fatcat:gnvv5ldcdndslggu74uzbitkcu
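The mode-k-product mapping whose output degenerates to a scalar, as mentioned in the snippet, can be sketched as contracting each mode of the input tensor with a weight vector; the shapes and vector names below are illustrative assumptions, not the paper's layer:

```python
import numpy as np

# Multilinear projection of a 3rd-order input tensor X to a scalar:
# contract each mode k with a weight vector w_k (a mode-k product with
# a single-row matrix), so the output tensor degenerates to a scalar.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4, 5))
w1 = rng.standard_normal(3)
w2 = rng.standard_normal(4)
w3 = rng.standard_normal(5)

# Successive mode-k products collapse every mode of X.
y = np.einsum('ijk,i,j,k->', X, w1, w2, w3)

# Equivalently: inner product of X with the rank-1 tensor w1 x w2 x w3.
rank1 = np.einsum('i,j,k->ijk', w1, w2, w3)
y_check = float((X * rank1).sum())
```

The rank-1 equivalence shows why such a layer has far fewer parameters (3 + 4 + 5) than a dense map on the flattened input (3 · 4 · 5).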

Learning Multiple Tasks with Multilinear Relationship Networks [article]

Mingsheng Long, Zhangjie Cao, Jianmin Wang, Philip S. Yu
2017 arXiv   pre-print
This paper presents Multilinear Relationship Networks (MRN) that discover the task relationships based on novel tensor normal priors over parameter tensors of multiple task-specific layers in deep convolutional  ...  Since deep features eventually transition from general to specific along deep networks, a fundamental problem of multi-task learning is how to exploit the task relatedness underlying parameter tensors  ...  Acknowledgments This work was supported by the National Key R&D Program of China (2016YFB1000701), National Natural Science Foundation of China (61772299, 61325008, 61502265, 61672313) and TNList Fund.  ... 
arXiv:1506.02117v4 fatcat:2xsxavenr5h5rjj264qixxcjju

Tensor object classification via multilinear discriminant analysis network [article]

Rui Zeng, Jiasong Wu, Lotfi Senhadji, Huazhong Shu
2014 arXiv   pre-print
This paper proposes a multilinear discriminant analysis network (MLDANet) for the recognition of multidimensional objects, known as tensor objects.  ...  The MLDANet consists of three parts: 1) the encoder learned by MLDA from tensor data; 2) feature maps obtained from the decoder; 3) the use of binary hashing and histograms for feature pooling.  ...  patch in the l-th feature map of the m-th tensor object.  ...
arXiv:1411.1172v1 fatcat:wxs26t2tcbhkzbuntdu5hq3dny

Convolutional Neural Network Feature Extraction Using Covariance Tensor Decomposition

Ricardo Fonseca, Oscar Guarnizo, Diego Suntaxi, Alfonso Cadiz, Werner Creixell
2021 IEEE Access  
Both kernels and feature maps are just a few examples of all the kernels that compose the network.  ...  His main research interests are machine learning and complex networks applied to different areas such as images, telecommunication networks, metabolic networks, and biological processes.  ...  These experiments also show a slight improvement in performance over the conventional methods (see Figure 25). In this way, the training stage returns an accuracy of around 99%.  ...
doi:10.1109/access.2021.3076033 fatcat:fbqxs3r5azd2dinntmgdah3ese

High-Order Pooling for Graph Neural Networks with Tensor Decomposition [article]

Chenqing Hua and Guillaume Rabusseau and Jian Tang
2022 arXiv   pre-print
CP decomposition to efficiently parameterize permutation-invariant multilinear maps for modeling node interactions.  ...  We propose the Tensorized Graph Neural Network (tGNN), a highly expressive GNN architecture relying on tensor decomposition to model high-order non-linear node interactions. tGNN leverages the symmetric  ...  Motivation and Method We leverage the symmetric CP decomposition to design an efficient parameterization of permutation-invariant multilinear maps for aggregation operations in graph neural networks, Tensorized  ...
arXiv:2205.11691v1 fatcat:ltn4xqvcgfdo7pqgitufvalzqi
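The symmetric-CP parameterization of a permutation-invariant multilinear map that the snippet describes can be sketched as f(h_1, ..., h_m) = Σ_r Π_i ⟨a_r, h_i⟩, where a shared factor matrix makes the map symmetric in its arguments; the dimensions, rank, and function name are illustrative assumptions, not the tGNN implementation:

```python
import numpy as np

# Permutation-invariant multilinear aggregation via symmetric CP:
# f(h_1, ..., h_m) = sum_r prod_i <a_r, h_i>.
# Because every argument meets the same factor rows a_r, permuting
# the neighbor features h_i cannot change the value.
rng = np.random.default_rng(0)
d, R = 5, 3                      # feature dim, CP rank (illustrative)
A = rng.standard_normal((R, d))  # shared CP factor rows a_r

def aggregate(H):
    # H: (m, d) stack of neighbor features.
    inner = A @ H.T              # (R, m) matrix of <a_r, h_i>
    return inner.prod(axis=1).sum()

H = rng.standard_normal((4, d))
perm = rng.permutation(4)
invariant = np.isclose(aggregate(H), aggregate(H[perm]))
```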

Multilinear Map Layer: Prediction Regularization by Structural Constraint [article]

Shuchang Zhou, Yuxin Wu
2015 arXiv   pre-print
The technique proceeds by replacing the output layer of a neural network with the so-called MLM layers, which force the output to be the result of some multilinear map, like a hybrid Kronecker-dot product  ...  or Kronecker Tensor Product.  ...  the result of some kind of multilinear map.  ...
arXiv:1507.08429v1 fatcat:a4i2eq2asjgyfisztxkxww2yyu
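The Kronecker-product structural constraint on a prediction layer, as the snippet describes it, can be sketched by emitting the Kronecker product of two small learned factors instead of an unconstrained dense output; the factor shapes below are illustrative assumptions, not from the paper:

```python
import numpy as np

# Structural constraint on a network's prediction: instead of a free
# 32 x 18 output, emit the Kronecker product of two small factors, so
# the output is forced to lie in the image of a multilinear map.
rng = np.random.default_rng(0)
U = rng.standard_normal((4, 3))   # small learned factor
V = rng.standard_normal((8, 6))   # small learned factor
Y = np.kron(U, V)                 # constrained 32 x 18 prediction

# The Kronecker structure acts as regularization by parameter count:
# 4*3 + 8*6 = 60 parameters instead of 32*18 = 576.
n_params = U.size + V.size
```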

Residual Tensor Train: a Flexible and Efficient Approach for Learning Multiple Multilinear Correlations [article]

Yiwei Chen, Yu Pan, Daoyi Dong
2021 arXiv   pre-print
The Tensor Train (TT) approach has been successfully applied to modelling the multilinear interaction of features.  ...  Numerical experiments demonstrate that ResTT outperforms state-of-the-art tensor network approaches and is competitive with benchmark deep learning models on the MNIST and Fashion-MNIST datasets.  ...  Complexity Analysis Here we compare the memory complexity and time complexity of three multilinear models: the fully general tensorized model in (5), the plain TT in (3), and ResTT.  ...
arXiv:2108.08659v1 fatcat:pgpdgp6tv5bvnhhagjdfdqlxtq
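A plain TT evaluation of a multilinear form, the building block the snippet refers to, can be sketched as contracting each feature vector with a 3-way core and chaining the resulting matrices left to right; the core shapes and ranks are illustrative assumptions, not the ResTT architecture:

```python
import numpy as np

# Tensor-Train evaluation of a multilinear form W(x_1, ..., x_d):
# core G_k has shape (r_{k-1}, n_k, r_k); contracting x_k into G_k
# yields a matrix, and the matrix chain product gives the scalar output.
rng = np.random.default_rng(0)
n, r, d = 4, 3, 5                               # mode size, TT-rank, order
cores = [rng.standard_normal((1 if k == 0 else r, n,
                              1 if k == d - 1 else r)) * 0.5
         for k in range(d)]
xs = [rng.standard_normal(n) for _ in range(d)]

M = np.eye(1)
for G, x in zip(cores, xs):
    M = M @ np.einsum('anb,n->ab', G, x)        # contract feature, chain cores
y = float(M[0, 0])                              # scalar multilinear output

# Multilinearity check: scaling one argument scales the output linearly.
xs2 = list(xs)
xs2[0] = 2 * xs2[0]
M2 = np.eye(1)
for G, x in zip(cores, xs2):
    M2 = M2 @ np.einsum('anb,n->ab', G, x)
doubles = np.isclose(M2[0, 0], 2 * y)
```

Evaluation costs O(d · n · r²) instead of the O(n^d) of the fully general tensorized model.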

Expressive power of recurrent neural networks [article]

Valentin Khrulkov, Alexander Novikov, Ivan Oseledets
2018 arXiv   pre-print
Using theoretical results on the relation between the tensor decompositions we compare expressive powers of the HT- and TT-Networks.  ...  A certain class of deep convolutional networks -- namely those that correspond to the Hierarchical Tucker (HT) tensor decomposition -- has been proven to have exponentially higher expressive power than  ...  ACKNOWLEDGEMENTS This study was supported by the Ministry of Education and Science of the Russian Federation (grant 14.756.31.0001).  ... 
arXiv:1711.00811v2 fatcat:og7jncwknfgoxbmkckuoccbtwy

Multilinear class-specific discriminant analysis

Dat Thanh Tran, Moncef Gabbouj, Alexandros Iosifidis
2017 Pattern Recognition Letters  
In this paper, we propose a multilinear subspace learning technique suitable for applications requiring class-specific tensor models.  ...  The method maximizes the discrimination of each individual class in the feature space while retaining the spatial structure of the input.  ...  the more complex neural network-based bag-of-words model N-BoF [27].  ...
doi:10.1016/j.patrec.2017.10.027 fatcat:ghbivb6vore65hjanxo3cgl23u

A survey of multilinear subspace learning for tensor data

Haiping Lu, Konstantinos N. Plataniotis, Anastasios N. Venetsanopoulos
2011 Pattern Recognition  
This paper surveys the field of multilinear subspace learning (MSL) for dimensionality reduction of multidimensional data directly from their tensorial representations.  ...  It discusses the central issues of MSL, including establishing the foundations of the field via multilinear projections, formulating a unifying MSL framework for systematic treatment of the problem, examining  ...  Acknowledgment The authors would like to thank the anonymous reviewers for their insightful comments, which have helped to improve the quality of this paper.  ... 
doi:10.1016/j.patcog.2011.01.004 fatcat:6puqzxrohfawraeiyid6633wdm

Protecting Big Data Privacy Using Randomized Tensor Network Decomposition and Dispersed Tensor Computation [article]

Jenn-Bing Ong, Wee-Keong Ng, Ivan Tjuawinata, Chao Li, Jielin Yang, Sai None Myne, Huaxiong Wang, Kwok-Yan Lam, C.-C. Jay Kuo
2021 arXiv   pre-print
Our primary intuition is that tensor network representations are mathematically non-unique, unlinkable, and uninterpretable; tensor network representations naturally support a range of multilinear operations  ...  However, the potential of distributed tensor networks for big data privacy preservation has not been considered before, which motivates the current study.  ...  A tensor network (TN) represents a data or tensor block as sparsely interconnected, low-order core tensors (typically 3rd-order or 4th-order tensors) and the functions by distributed, multilinear tensor  ...
arXiv:2101.04194v1 fatcat:pay7hgk24jddnmvxrexefhd7di

Tensor Network Contractions for #SAT

Jacob D. Biamonte, Jason Morton, Jacob Turner
2015 Journal of statistical physics  
By these methods, we give an algorithm using an axiomatic tensor contraction language for n-variable #SAT instances with complexity O((g+cd)^O(1) 2^c) where c is the number of COPY-tensors, g is the number  ...  Thus, counting problems can be solved efficiently when their tensor network expression has at most O( c) COPY-tensors and polynomial fan-out.  ...  Tensor network methods are a collection of techniques to model and reason about multilinear maps.  ... 
doi:10.1007/s10955-015-1276-z fatcat:oohb72tt2jh7tbca6qwd6bctmm
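The idea of expressing a counting problem as a tensor network, as in the snippet, can be sketched for a toy formula: each clause becomes an OR tensor, negation a NOT matrix, and the COPY tensor that fans a variable out to its clauses is realized here by a repeated einsum index; the formula (x1 ∨ x2) ∧ (¬x1 ∨ x2) is my own toy choice, not an example from the paper:

```python
import numpy as np

# #SAT as tensor network contraction for (x1 v x2) & (~x1 v x2).
OR = np.array([[0, 1], [1, 1]])    # OR[a, b] = a OR b
NOT = np.array([[0, 1], [1, 0]])   # flips the polarity of one clause input

clause1 = OR                       # x1 v x2
clause2 = NOT @ OR                 # ~x1 v x2: (NOT @ OR)[a, b] = OR[1-a, b]

# Repeating index a (for x1) and b (for x2) plays the role of the
# COPY tensors; the full contraction sums over all assignments and
# returns the number of satisfying ones.
count = int(np.einsum('ab,ab->', clause1, clause2))
```

By hand: only (x1=0, x2=1) and (x1=1, x2=1) satisfy both clauses, so the contraction yields 2.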

Tensorial Kernel Principal Component Analysis for Action Recognition

Cong Liu, Xu Wei-sheng, Wu Qi-di
2013 Mathematical Problems in Engineering  
Our method aims to remedy the shortcomings of multilinear subspace learning (tensorial PCA), developed recently for modelling the nonlinear manifold of tensor objects, and brings together the desirable properties  ...  Furthermore, a TKPCA-based tensor object recognition method is also proposed for the application of action recognition.  ...  The objective of Multilinear Principal Component Analysis of Tensors (MPCA) [19] is to find a multilinear transformation {U^(n) ∈ ℝ^(I_n × P_n)}_{n=1}^N that maps the original tensor space ℝ^(I_1) ⊗ ℝ^(I_2) ⊗ ⋯ ⊗ ℝ^(I_N) into  ...
doi:10.1155/2013/816836 fatcat:vevy27lysza3dk5f6vda456v7y
Showing results 1 — 15 of 1,625