
GP-VAE: Deep Probabilistic Time Series Imputation [article]

Vincent Fortuin, Dmitry Baranchuk, Gunnar Rätsch, Stephan Mandt
2020 arXiv   pre-print
Our modeling assumption is simple and interpretable: the high-dimensional time series has a lower-dimensional representation that evolves smoothly in time according to a Gaussian process.  ...  Multivariate time series with missing values are common in areas such as healthcare and finance, and have grown in number and complexity over the years.  ...  Deep learning for time series imputation.  ... 
arXiv:1907.04155v5 fatcat:iew4h2v4k5erfafsri5zqidxtq
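The modeling assumption in the snippet above (smooth latent trajectories under a GP prior, decoded to a high-dimensional series) can be sketched generatively. This is a minimal illustration, not the paper's architecture: the RBF kernel, the linear "decoder" `W`, and all dimensions are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: T time steps, latent dim d_z, observed dim d_x.
T, d_z, d_x = 50, 2, 10
t = np.linspace(0.0, 1.0, T)

def rbf_kernel(t, lengthscale=0.2, var=1.0):
    # Squared-exponential kernel: nearby time points get correlated latents,
    # which is what makes the latent trajectory evolve smoothly.
    d = t[:, None] - t[None, :]
    return var * np.exp(-0.5 * (d / lengthscale) ** 2)

K = rbf_kernel(t) + 1e-6 * np.eye(T)  # jitter for numerical stability
L = np.linalg.cholesky(K)

# One independent GP per latent dimension: z has shape (T, d_z).
z = L @ rng.standard_normal((T, d_z))

# Stand-in linear "decoder" (a VAE would use a neural network here).
W = rng.standard_normal((d_z, d_x))
x = z @ W  # (T, d_x): smooth high-dimensional series driven by smooth latents

print(x.shape)  # (50, 10)
```

Imputation then amounts to inferring the smooth latent `z` from the observed entries of `x` and decoding it to fill in the missing ones.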

Medical data wrangling with sequential variational autoencoders [article]

Daniel Barrejón, Pablo M. Olmos, Antonio Artés-Rodríguez
2021 arXiv   pre-print
We show that Shi-VAE achieves the best performance in terms of both metrics, with lower computational complexity than the GP-VAE model, the state-of-the-art method for medical records.  ...  These missing patterns are commonly assumed to be completely random, but in medical scenarios the reality is that they occur in bursts, due to sensors that are off for some time or data collected  ...  The GP-VAE probabilistic model. As discussed in the introduction, GP-VAE [12] stands out as the state-of-the-art VAE for handling temporal series.  ... 
arXiv:2103.07206v1 fatcat:runqnlaaqbgdvlpve2rieehsuq

CSDI: Conditional Score-based Diffusion Models for Probabilistic Time Series Imputation [article]

Yusuke Tashiro, Jiaming Song, Yang Song, Stefano Ermon
2021 arXiv   pre-print
The imputation of missing values in time series has many applications in healthcare and finance.  ...  Furthermore, CSDI can also be applied to time series interpolation and probabilistic forecasting, and is competitive with existing baselines.  ...  Related works: Time series imputation with deep learning. Previous studies have shown that deep learning models can capture the temporal dependency of time series and give more accurate imputations than statistical  ... 
arXiv:2107.03502v2 fatcat:wqqegayrnbdvvpa4u6r4cj64v4
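The conditional setup behind score-based imputation can be illustrated by its data preparation step: a random subset of the *observed* entries is hidden and used as the training target, while the model conditions on the rest. A minimal sketch, with illustrative sizes and masking rates (the diffusion model itself is omitted):

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy multivariate series (T steps, d features) with some values missing (NaN).
T, d = 20, 3
x = rng.standard_normal((T, d))
obs_mask = rng.random((T, d)) > 0.3   # True where a value was actually observed
x[~obs_mask] = np.nan

# Self-supervised split: hide ~20% of the observed entries as imputation
# targets and condition on the remaining observed entries.
target_mask = obs_mask & (rng.random((T, d)) < 0.2)
cond_mask = obs_mask & ~target_mask

x_cond = np.where(cond_mask, x, 0.0)  # conditioning input to the denoiser
y_target = x[target_mask]             # ground truth the model learns to recover

print(cond_mask.sum(), target_mask.sum())
```

The two masks partition the observed entries, so the model never sees its own targets; at test time the target mask is simply the truly-missing entries.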

Sparse Gaussian Process Variational Autoencoders [article]

Matthew Ashman, Jonathan So, Will Tebbutt, Vincent Fortuin, Michael Pearce, Richard E. Turner
2020 arXiv   pre-print
An effective framework for handling such data is Gaussian process deep generative models (GP-DGMs), which employ GP priors over the latent variables of DGMs.  ...  We consider independent GPs modelling the seven-point time series for each variable and each station, with model parameters shared across groups.  ...  Deep Gaussian Processes. Single-hidden-layer deep GPs (DGPs) (Damianou & Lawrence, 2013) are characterised by the use of a GP likelihood function, giving rise to the probabilistic model f ∼ ∏_{k=1}^{K} GP(0,  ... 
arXiv:2010.10177v2 fatcat:4slldmmt25gp3mz4gmeevofeq4
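The prior expression in the snippet above is truncated by the excerpt; written out, a plausible reconstruction is the product of K independent zero-mean GPs (the kernel k_θ and its arguments are elided in the excerpt, so they are assumptions here):

```latex
f \sim \prod_{k=1}^{K} \mathcal{GP}\!\left(0,\; k_\theta(\cdot,\cdot)\right)
```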

Longitudinal Variational Autoencoder [article]

Siddharth Ramchandran, Gleb Tikhonov, Kalle Kujanpää, Miika Koskinen, Harri Lähdesmäki
2021 arXiv   pre-print
Longitudinal datasets, measured repeatedly over time from individual subjects, arise in many biomedical, psychological, social, and other studies.  ...  We compare our model against previous methods on synthetic as well as clinical datasets, and demonstrate state-of-the-art performance in data imputation, reconstruction, and long-term prediction tasks.  ...  GP-VAE: deep probabilistic time series imputation. In The 23rd International Conference on Artificial Intelligence and Statistics, AISTATS. PMLR, 2020. C. L. Giles, G. M. Kuhn, and R. J.  ... 
arXiv:2006.09763v3 fatcat:gi3tjlvxnzesjc7jt4obnwc2uq

Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections [article]

Csaba Toth, Patric Bonnier, Harald Oberhauser
2021 arXiv   pre-print
Sequential data such as time series, video, or text can be challenging to analyse, as the ordered structure gives rise to complex dependencies.  ...  This yields modular and scalable building blocks for neural networks that give state-of-the-art performance on standard benchmarks such as multivariate time series classification and generative models  ...  The GP-VAE model. In this experiment, we take as base model the recent GP-VAE (Fortuin et al., 2020), which provides state-of-the-art results for probabilistic sequential data imputation.  ... 
arXiv:2006.07027v2 fatcat:sbthoxojlzcn5anniqt5spfop4

Missing Value Imputation on Multidimensional Time Series [article]

Parikshit Bansal, Prathamesh Deshpande, Sunita Sarawagi
2021 arXiv   pre-print
We present DeepMVI, a deep learning method for missing value imputation in multidimensional time-series datasets.  ...  DeepMVI uses a neural network to combine fine-grained and coarse-grained patterns along a time series, and trends from related series across categorical dimensions.  ...  On record time series datasets, GP-VAE has been shown empirically to be worse than BRITS, and seems to be geared towards image datasets.  ... 
arXiv:2103.01600v2 fatcat:bhojfu55ujev3kl46qziowjatu

A Variational Autoencoder for Heterogeneous Temporal and Longitudinal Data [article]

Mine Öğretir, Siddharth Ramchandran, Dimitrios Papatheodorou, Harri Lähdesmäki
2022 arXiv   pre-print
The variational autoencoder (VAE) is a popular deep latent variable model used to analyse high-dimensional datasets by learning a low-dimensional latent representation of the data.  ...  We demonstrate our model's efficacy through simulated as well as clinical datasets, and show that our proposed model achieves competitive performance in missing value imputation and predictive accuracy  ...  [4] proposed the GP-VAE model that assumes an independent GP prior on each subject's time series.  ... 
arXiv:2204.09369v1 fatcat:ohoi4hecanf5zjjiptuexgwgb4

Spatiotemporal Tensor Completion for Improved Urban Traffic Imputation

Ahmed Ben Said, Abdelkarim Erradi
2021 IEEE transactions on intelligent transportation systems (Print)  
To mine the temporal aspect, we first conduct an entropy analysis to determine the most regular time series.  ...  In the second scenario, we simulate missing values over a given area and time duration.  ...  A decoder, trained to reconstruct the traffic samples from the latent space, can then be used to generate the imputed samples. In [26], GP-VAE, a novel VAE-based technique, is proposed.  ... 
doi:10.1109/tits.2021.3062999 fatcat:kl7r4ynhercfhgonu5ao6x7xt4

How Faithful is your Synthetic Data? Sample-level Metrics for Evaluating and Auditing Generative Models [article]

Ahmed M. Alaa, Boris van Breugel, Evgeny Saveliev, Mihaela van der Schaar
2022 arXiv   pre-print
The three metric components correspond to (interpretable) probabilistic quantities, and are estimated via sample-level binary classification.  ...  The data synthesized by ADS-GAN (×) displayed the best performance, followed by WGAN-GP (•), VAE ( ), and GAN ( ).  ...  (a) Here, we rank the 4 generative models (ADS-GAN: ×, WGAN-GP: •, VAE: , GAN: ) with respect to each evaluation metric (leftmost is best).  ... 
arXiv:2102.08921v2 fatcat:fbmndttskfb3rlvg7yepggi3we

DeepHealth: Review and challenges of artificial intelligence in health informatics [article]

Gloria Hyunjung Kwak, Pan Hui
2020 arXiv   pre-print
Despite its notable advantages, there are some key challenges on data (high dimensionality, heterogeneity, time dependency, sparsity, irregularity, lack of labels, bias) and model (reliability, interpretability  ...  RNNs are used not only for time-series predictive modeling but also for missing data imputation.  ...  After balancing cohorts' chart presence level and imputing missing values with Gaussian process variational autoencoders (GP-VAE), they used the Bi-LSTM in conjunction with the attention mechanism to predict  ... 
arXiv:1909.00384v2 fatcat:sy7pm2c2uvdd3pal2russn4xri