Tensor Decompositions for Modeling Inverse Dynamics
[article]
2017
arXiv
pre-print
Furthermore, we extend the method to continuous inputs by learning a mapping from the continuous inputs to the latent representations of the tensor decomposition using basis functions. ...
The decomposition of sparse tensors has successfully been used in relational learning, e.g., the modeling of large knowledge graphs. ...
By learning a representation for each possible value of all sensors, the decomposition allows for approximating highly non-linear functions. ...
arXiv:1711.04683v1
fatcat:evndrzxwwzhqlaowgby5tm7ope
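The snippet above describes learning a latent representation for each sensor value and, for continuous inputs, mapping the inputs to those latent factors through basis functions. Below is a hypothetical rank-R CP-style sketch of that general idea, not the paper's implementation; the Gaussian bases, sizes, and names (phi, f_hat, W) are assumptions made for illustration only, and the paper may use a different decomposition and setup.

import numpy as np

# Hypothetical sketch (not the paper's code): approximate a nonlinear function
# of three continuous inputs with a rank-R CP-style model whose factor vectors
# come from Gaussian basis expansions of the inputs.
rng = np.random.default_rng(0)
R, B = 8, 12                              # assumed CP rank and basis size
centers = np.linspace(-1.0, 1.0, B)       # assumed basis-function centers

def phi(x, width=0.2):
    # Gaussian basis expansion of a scalar input.
    return np.exp(-((x - centers) ** 2) / (2 * width ** 2))

# One weight matrix per input maps basis activations to an R-dim latent factor.
W = [rng.normal(scale=0.1, size=(B, R)) for _ in range(3)]

def f_hat(x1, x2, x3):
    # Multilinear (CP-style) combination of the three latent factors.
    a, b, c = (phi(x) @ Wm for x, Wm in zip((x1, x2, x3), W))
    return float(np.sum(a * b * c))

print(f_hat(0.3, -0.5, 0.1))              # scalar prediction for one input triple

In the fully discrete case mentioned in the snippet, phi would simply be replaced by a lookup of the learned embedding for each possible sensor value.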
Tensor Computing for Internet of Things (Dagstuhl Perspectives Workshop 16152)
2016
Dagstuhl Reports
Internet of Things (IoT) and cyber-physical systems (CPS) bring interesting new challenges to tensor computing, such as the need for real-time analytics and control in interconnected dynamic networks ...
This report documents the program and the outcomes of Dagstuhl Perspectives Workshop 16152 "Tensor Computing for Internet of Things". ...
The goal of the workshop was to explore tensor representations and computing as the basis for machine learning solutions for the IoT. ...
doi:10.4230/dagrep.6.4.57
dblp:journals/dagstuhl-reports/AcarAMRT16
fatcat:psomri4q4faapbjc4dot7ca3em
Tensor Decompositions in Deep Learning
[article]
2020
arXiv
pre-print
The paper surveys the topic of tensor decompositions in modern machine learning applications. It focuses on three active research topics of significant relevance for the community. ...
After a brief review of consolidated works on multi-way data analysis, we consider the use of tensor decompositions in compressing the parameter space of deep learning models. ...
To this end, the paper shows the use of a Tucker decomposition for discrete data. The paper also releases the original data collected and used for the analysis. ...
arXiv:2002.11835v1
fatcat:izu4qtizqbghhnaxlhisi2tnce
Physics-Informed Tensor-Train ConvLSTM for Volumetric Velocity Forecasting of the Loop Current
2021
Frontiers in Artificial Intelligence
tensor-train decomposition to capture higher-order space-time correlations, and (3) a mechanism that incorporates prior physics from domain experts by informing the learning in latent space. ...
The advantage of our proposed approach is clear: constrained by the law of physics, the prediction model simultaneously learns good representations for frame dependencies (both short-term and long-term ...
representation learning (e.g., the hidden states of the deep neural networks). ...
doi:10.3389/frai.2021.780271
pmid:35005615
pmcid:PMC8741277
fatcat:ljz4h74skfbn7n3vslfpactszu
Decomposition Methods for Machine Learning with Small, Incomplete or Noisy Datasets
2020
Applied Sciences
In other cases, and for different reasons, the datasets are originally small, and therefore, more data samples are required to derive useful supervised or unsupervised classification methods. ...
We show that a signal decomposition approach can provide valuable tools to improve machine learning performance with low quality datasets. ...
Acknowledgments: We are grateful to the anonymous reviewers for their valuable comments, which helped us to improve the first version of this manuscript. ...
doi:10.3390/app10238481
fatcat:2gqm3tos4vdorptqayewqu2mum
Augmentation in Healthcare: Augmented Biosignal Using Deep Learning and Tensor Representation
2021
Journal of Healthcare Engineering
In healthcare applications, deep learning is a highly valuable tool. It extracts features from raw data to save time and effort for health practitioners. ...
The proposed model provides an appropriate representation of the input raw biosignal that boosts the accuracy on the training and testing datasets. ...
On the other hand, the tensor representation was used in different research fields to allow for a better representation of the dataset. ...
doi:10.1155/2021/6624764
pmid:33575018
pmcid:PMC7861952
fatcat:77re7g5sdrcenj5n73vww6xmdy
Protecting Big Data Privacy Using Randomized Tensor Network Decomposition and Dispersed Tensor Computation
[article]
2021
arXiv
pre-print
Tensor network decomposition and distributed tensor computation have been widely used in signal processing and machine learning for dimensionality reduction and large-scale optimization. ...
Therefore, we propose randomized algorithms to decompose big data into randomized tensor network representations and analyze the privacy leakage for 1D to 3D data tensors. ...
Big data generated from sensor networks or the Internet of Things are essential for machine learning, in particular deep learning, in order to train cutting-edge intelligent systems for real-time decision ...
arXiv:2101.04194v1
fatcat:pay7hgk24jddnmvxrexefhd7di
2020 Index IEEE Transactions on Signal Processing Vol. 68
2020
IEEE Transactions on Signal Processing
., One-Step Prediction for Discrete Time-Varying Nonlinear Systems With Unknown Inputs and Correlated Noises; TSP ... ., +, TSP 2020 1-16
Image representation: A Low-Rank Tensor Dictionary Learning Method for Hyperspectral Image Denoising. ... Zhang, H., +, TSP 2020 1021-1033
Biology computing: Tensor Graph Convolutional Networks for Multi-Relational and Robust Learning. ...
doi:10.1109/tsp.2021.3055469
fatcat:6uswtuxm5ba6zahdwh5atxhcsy
Tensor Networks for Dimensionality Reduction and Large-scale Optimization: Part 1 Low-Rank Tensor Decompositions
2016
Foundations and Trends® in Machine Learning
It is therefore timely and valuable for the multidisciplinary research community to review tensor decompositions and tensor networks as emerging tools for large-scale data analysis and data mining. ...
We provide the mathematical and graphical representations and interpretation of tensor networks, with the main focus on the Tucker and Tensor Train (TT) decompositions and their extensions or generalizations ...
Tensor Notations and Graphical Representations
Notations and terminology used for tensors and tensor networks differ across the scientific communities (see Table 1.2); to this ...
doi:10.1561/2200000059
fatcat:ememscddezeovamsoqrcpp33z4
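For orientation, the two decompositions named in this record have the following standard elementwise forms (textbook notation, not excerpted from the monograph itself): a Tucker model contracts a small core tensor with one factor matrix per mode, and a tensor train writes each entry as a product of small matrices.

Tucker (3-way case):
\[
x_{ijk} \;\approx\; \sum_{p,q,r} g_{pqr}\, a^{(1)}_{ip}\, a^{(2)}_{jq}\, a^{(3)}_{kr},
\qquad \mathcal{X} \approx \mathcal{G} \times_1 A^{(1)} \times_2 A^{(2)} \times_3 A^{(3)}.
\]

Tensor Train (TT):
\[
x_{i_1 i_2 \cdots i_N} \;\approx\; G_1[i_1]\, G_2[i_2] \cdots G_N[i_N],
\qquad G_n[i_n] \in \mathbb{R}^{r_{n-1} \times r_n},\; r_0 = r_N = 1.
\]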
Multi-way Graph Signal Processing on Tensors: Integrative analysis of irregular geometries
[article]
2020
arXiv
pre-print
In this paper, we review modern signal processing frameworks generalizing GSP to multi-way data, starting from graph signals coupled to familiar regular axes such as time in sensor networks, and then extending ...
Graph signal processing (GSP) is an important methodology for studying data residing on irregular structures. ...
CP decomposition expresses a tensor as a sum of rank-1 tensors. There are multiple operations for reshaping tensors, which are used to make calculations convenient. ...
arXiv:2007.00041v2
fatcat:i2e77o5njrhkpfoxtdswdkpibm
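As a concrete illustration of the two points in the snippet above (CP as a sum of rank-1 tensors, and reshaping operations), here is a small numerical sketch; the sizes and the row-major unfolding convention are choices made for this example, not something taken from the paper.

import numpy as np

# A rank-R CP model writes a 3-way tensor as a sum of R rank-1 (outer-product)
# terms: X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r].
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3
A = rng.normal(size=(I, R))
B = rng.normal(size=(J, R))
C = rng.normal(size=(K, R))

X = np.einsum('ir,jr,kr->ijk', A, B, C)        # sum of R rank-1 tensors

# One common reshaping operation: flatten (unfold) the tensor along mode 1 into
# an I x (J*K) matrix; with this row-major ordering it equals A times the
# transpose of the column-wise (Khatri-Rao-style) product of B and C.
X1 = X.reshape(I, J * K)
kr = np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)
print(np.allclose(X1, A @ kr.T))               # True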
Decomposing Temporal High-Order Interactions via Latent ODEs
2022
International Conference on Machine Learning
To capture the complex temporal dynamics, we use a neural network (NN) to learn the time derivative of the ODE state. ...
The existing methods either discard the timestamps, convert them into discrete steps, or use over-simplistic decomposition models. ...
While tensor decomposition (Tucker, 1966; Harshman, 1970; Chu and Ghahramani, 2009; Choi and Vishwanathan, 2014; Zhe et al., 2016b) is a popular framework for representation learning and prediction of ...
dblp:conf/icml/LiKZ22
fatcat:elw76oifrralzl4y6jiedewya4
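The snippet above says a neural network learns the time derivative of the ODE state. The sketch below illustrates only that general idea: a tiny MLP parameterizes dz/dt and is integrated with forward Euler. The architecture, sizes, and solver are assumptions; the paper itself may use a different network and a proper adaptive ODE solver.

import numpy as np

# Hypothetical sketch: an MLP gives the time derivative of a latent state z(t),
# which is then integrated forward with a simple Euler scheme.
rng = np.random.default_rng(0)
d, h = 4, 16                                   # assumed state and hidden sizes
W1, b1 = rng.normal(scale=0.1, size=(d + 1, h)), np.zeros(h)
W2, b2 = rng.normal(scale=0.1, size=(h, d)), np.zeros(d)

def dz_dt(z, t):
    # NN-parameterized derivative of the latent ODE state (takes z and t).
    x = np.concatenate([z, [t]])
    return np.tanh(x @ W1 + b1) @ W2 + b2

def integrate(z0, t0, t1, steps=100):
    # Forward-Euler integration of the latent state from t0 to t1.
    z, dt = z0, (t1 - t0) / steps
    for i in range(steps):
        z = z + dt * dz_dt(z, t0 + i * dt)
    return z

print(integrate(np.zeros(d), 0.0, 1.0))        # latent state at t = 1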
2021 Index IEEE Journal of Selected Topics in Signal Processing Vol. 15
2021
IEEE Journal on Selected Topics in Signal Processing
The Author Index contains the primary entry for each item, listed under the first author's name. ...
., +, JSTSP Aug. 2021 1272-1287; Tensor Decomposition Learning for Compression of Multidimensional Signals. ...
., +, JSTSP Feb. 2021 415-430; Decomposition: Tensor Decomposition Learning for Compression of Multidimensional Signals, 2021 1258-1271; Degradation: Editorial: Introduction to the Issue on Deep Learning for Image ...
doi:10.1109/jstsp.2021.3135675
fatcat:pofbfingjbc7dhn5i7mtqbebuy
Long-term Forecasting using Higher Order Tensor RNNs
[article]
2019
arXiv
pre-print
Furthermore, we decompose the higher-order structure using the tensor-train decomposition to reduce the number of parameters while preserving the model performance. ...
Our proposed recurrent architecture addresses these issues by learning the nonlinear dynamics directly using higher-order moments and higher-order state transition functions. ...
These results suggest that tensor-train neural networks learn more stable representations that generalize better for long-term horizons. ...
arXiv:1711.00073v3
fatcat:d326672govb7jbsaj3x7bmxav4
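To make the parameter-reduction claim in the snippet concrete, here is a small illustrative calculation; the mode sizes and TT-ranks below are arbitrary assumptions, not the paper's settings. It compares storing a 4-way weight tensor in full against storing its tensor-train cores, then reconstructs the full tensor from the cores.

import numpy as np

# Assumed example: a 10 x 10 x 10 x 10 weight tensor with TT-ranks (1, 5, 5, 5, 1).
modes = [10, 10, 10, 10]
ranks = [1, 5, 5, 5, 1]

full_params = int(np.prod(modes))                        # 10000 parameters
tt_params = sum(r_in * n * r_out                         # 600 parameters
                for n, r_in, r_out in zip(modes, ranks[:-1], ranks[1:]))
print(full_params, tt_params)

# Reconstructing the full tensor from TT cores is a chain of contractions
# over the shared rank indices.
rng = np.random.default_rng(0)
cores = [rng.normal(size=(r_in, n, r_out))
         for n, r_in, r_out in zip(modes, ranks[:-1], ranks[1:])]
W = cores[0]
for G in cores[1:]:
    W = np.tensordot(W, G, axes=([-1], [0]))             # contract rank index
W = W.squeeze()
print(W.shape)                                            # (10, 10, 10, 10)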
Tensor-based anomaly detection: An interdisciplinary survey
2016
Knowledge-Based Systems
This survey aims to highlight the potential of tensor-based techniques as a novel approach for detection and identification of abnormalities and failures. ...
Traditional spectral-based methods such as PCA are popular for anomaly detection in a variety of problems and domains. ...
Sensors: One of the potential applications of tensors is anomaly detection in sensor networks, which uses the same tensor model as environmental monitoring, differing in the speed at which sensors gather data ...
doi:10.1016/j.knosys.2016.01.027
fatcat:lejxxae63jcutfx2ncahownt7e
BEAM or MFA I inspired Nv Neurons using opamps for line and line based polygon detection
2020
Zenodo
Abstract: Tensor network topologies for function, first-class, or MFA I as BEAM circuits are described within the framework of complexity theory using Lie Computability definitions. ...
We use line decomposition with circuitry to compose the lines into polyhedral shapes (Bheemaiah, n.d.). ...
We have thus proven the generalized tensor formulation for a range of network topologies ranging from analog BEAM function driven architectures to deep learning networks, all of which are defined in a ...
doi:10.5281/zenodo.3893770
fatcat:rkb2arntf5hllgwsnfcbqqfb4e
Showing results 1 — 15 out of 2,594 results