Multi-fidelity data fusion through parameter space reduction with applications to automotive engineering
[article]
2022
arXiv
pre-print
Then we build a low-fidelity response surface based on this reduction, thus enabling multi-fidelity Gaussian process regression without the need to run new simulations with simplified physical models ...
In the context of Gaussian process regression we can exploit low-fidelity models to better capture the latent manifold thus improving the accuracy of the model. ...
Multi-fidelity Gaussian process regression: In this section we briefly present the nonlinear autoregressive multi-fidelity Gaussian process regression (NARGP) scheme proposed in [37]. ...
arXiv:2110.14396v2
fatcat:q7pioug5tvgezpw3ghwr6lf4pe
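The NARGP scheme cited in this entry augments the high-fidelity inputs with the prediction of a low-fidelity Gaussian process, so a second GP can learn a possibly nonlinear mapping between fidelities. A minimal sketch of that idea with scikit-learn follows; the toy functions and sample sizes are illustrative assumptions, not the paper's setup, and the full scheme propagates the whole low-fidelity posterior rather than only its mean.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Illustrative low/high fidelity functions (assumptions, not the paper's benchmark).
f_low  = lambda x: np.sin(8 * np.pi * x)
f_high = lambda x: (x - np.sqrt(2)) * f_low(x) ** 2

# Low-fidelity data is cheap and plentiful; high-fidelity data is scarce.
x_lo = np.linspace(0, 1, 50)[:, None]
x_hi = np.linspace(0, 1, 8)[:, None]
y_lo, y_hi = f_low(x_lo).ravel(), f_high(x_hi).ravel()

# Step 1: GP on the low-fidelity data.
gp_lo = GaussianProcessRegressor(kernel=ConstantKernel() * RBF()).fit(x_lo, y_lo)

# Step 2 (NARGP idea): the high-fidelity GP sees [x, gp_lo(x)] as input,
# so it can learn a nonlinear map from the low-fidelity prediction.
aug_hi = np.hstack([x_hi, gp_lo.predict(x_hi)[:, None]])
gp_hi = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([1.0, 1.0])).fit(aug_hi, y_hi)

# Prediction: push the low-fidelity posterior mean through the high-fidelity GP.
x_new = np.linspace(0, 1, 200)[:, None]
aug_new = np.hstack([x_new, gp_lo.predict(x_new)[:, None]])
y_pred = gp_hi.predict(aug_new)
```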
Multi-fidelity data fusion for the approximation of scalar functions with low intrinsic dimensionality through active subspaces
[article]
2020
arXiv
pre-print
Gaussian processes are employed for non-parametric regression in a Bayesian setting. ...
When the model's gradient information is provided, the presence of an active subspace can be exploited to design low-fidelity response surfaces and thus enable Gaussian process multi-fidelity regression ...
Multi-fidelity Gaussian process regression: We adopt the nonlinear autoregressive multi-fidelity Gaussian process regression (NARGP) scheme proposed in [26]. ...
arXiv:2010.08349v1
fatcat:gsez6wmkkrg6xgyxs42mthehdy
Gaussian process regression with multiple response variables
2015
Chemometrics and Intelligent Laboratory Systems
Gaussian process regression (GPR) is a Bayesian non-parametric technique that has gained extensive application in data-based modelling of various systems, including those of interest to chemometrics. ...
In the paper we propose a direct formulation of the covariance function for multi-response GPR, based on the idea that its covariance function is assumed to be the "nominal" uni-output covariance multiplied ...
Gaussian process regressions and the widely used partial least squares regression for multi-inputs and multi-outputs (PLS) are also performed, where the former is conducted for the two outputs independently ...
doi:10.1016/j.chemolab.2015.01.016
fatcat:dl6qwy4bzrelxfdfbfl5x5rvfa
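The construction described in the snippets above, where the covariance is the "nominal" uni-output covariance multiplied by a between-output term, amounts to a separable multi-output kernel. A rough numpy illustration follows; the RBF kernel, the between-output matrix B, and the toy data are assumptions for demonstration, not the paper's formulation.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Nominal uni-output covariance over the inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# Toy data: n inputs, two correlated responses (placeholders).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(30, 1))
Y = np.column_stack([np.sin(6 * X[:, 0]), np.sin(6 * X[:, 0]) + 0.2 * X[:, 0]])
n, p = Y.shape

K = rbf_kernel(X, X, lengthscale=0.3)          # n x n input covariance
B = np.array([[1.0, 0.8], [0.8, 1.0]])         # p x p between-output covariance (assumed)
noise = 1e-4

# Separable multi-output covariance: Cov[y_i(x), y_j(x')] = B[i, j] * K(x, x').
K_full = np.kron(B, K) + noise * np.eye(n * p)

# GP posterior mean at new inputs, with the outputs stacked output-major.
X_new = np.linspace(0, 1, 100)[:, None]
K_star = np.kron(B, rbf_kernel(X_new, X, lengthscale=0.3))   # (m*p) x (n*p)
alpha = np.linalg.solve(K_full, Y.T.ravel())                  # stacked training outputs
mean = (K_star @ alpha).reshape(p, -1).T                      # m x p posterior mean
```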
Multivariate Gaussian and Student-t Process Regression for Multi-output Prediction
[article]
2019
arXiv
pre-print
Gaussian process model for vector-valued function has been shown to be useful for multi-output prediction. ...
In this paper, we propose a unified framework which is used not only to introduce a novel multivariate Student-t process regression model (MV-TPR) for multi-output prediction, but also to reformulate the ...
Therefore, the existing methods for multi-output Gaussian process regression cannot be applied to Student-t process regression. ...
arXiv:1703.04455v6
fatcat:b5dgnjkb2fdsjdj27cpwl4r5bm
Information Agents for Pervasive Sensor Networks
2008
2008 Sixth Annual IEEE International Conference on Pervasive Computing and Communications (PerCom)
Gaussian process to build a probabilistic model of the environmental parameters being measured by local sensors, and the correlations and delays that exist between them. ...
Our motivating scenario is the need to provide situational awareness support to first responders at the scene of a large scale incident, and we describe how we use an iterative formulation of a multi-output ...
Figure 3: Prediction and regression of tide height data for (a) independent and (b) multi-output Gaussian processes. ...
doi:10.1109/percom.2008.22
dblp:conf/percom/RogersORRJ08
fatcat:zc7oqkiz7fefnfgeljfll2muwu
Remarks on multivariate Gaussian Process
[article]
2021
arXiv
pre-print
a useful statistical learning method for multi-output prediction problems. ...
In this paper, we propose a precise definition of multivariate Gaussian processes based on Gaussian measures on vector-valued function spaces, and provide an existence proof. ...
Multivariate Gaussian process regression: As a useful application, multi-output prediction using multivariate Gaussian processes is a good example. ...
arXiv:2010.09830v3
fatcat:rremwhn4ibalnpq5dcgtsgoud4
Scalable Inference for Gaussian Process Models with Black-Box Likelihoods
2015
Neural Information Processing Systems
Experiments on small datasets for various problems including regression, classification, Log Gaussian Cox processes, and warped GPs show that our method can perform as well as the full method under high ...
We propose a sparse method for scalable automated variational inference (AVI) in a large class of models with Gaussian process (GP) priors, multiple latent functions, multiple outputs and non-linear likelihoods ...
Gaussian process regression networks on the SARCOS dataset. ...
dblp:conf/nips/DezfouliB15
fatcat:fhx5627irzdm7imeppau36lcry
A probabilistic data-driven model for planar pushing
[article]
2017
arXiv
pre-print
The learned models rely on a variation of Gaussian processes with input-dependent noise called Variational Heteroscedastic Gaussian processes (VHGP) that capture the mean and variance of a stochastic function ...
These algorithms are referred as heteroscedastic Gaussian processes (HGPs) and can regress both the mean of the process and its variance over the input space. ...
Model: The learned model is based on a family of Gaussian processes called Heteroscedastic Gaussian processes (HGPs), along with their state-of-the-art variational implementation [1]. ...
arXiv:1704.03033v2
fatcat:du3v2t5kzrajjj5wlooirpixmm
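The variational heteroscedastic GP (VHGP) cited in this entry regresses both the mean of the process and its input-dependent noise. The sketch below is not that variational scheme; it is a simple two-stage stand-in, assuming a second GP fitted to the log squared residuals as a proxy for the log noise variance.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy data with input-dependent noise (illustrative, not the pushing dataset).
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 1))
y = np.sin(2 * np.pi * X[:, 0]) + rng.normal(0, 0.05 + 0.3 * X[:, 0])

# Stage 1: homoscedastic GP for the mean.
gp_mean = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)

# Stage 2: a GP on the log squared residuals approximates the log-variance over the input space.
resid2 = (y - gp_mean.predict(X)) ** 2
gp_logvar = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, np.log(resid2 + 1e-8))

X_new = np.linspace(0, 1, 100)[:, None]
mean = gp_mean.predict(X_new)
noise_var = np.exp(gp_logvar.predict(X_new))   # input-dependent noise estimate
```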
Benchmarking Regression Methods: A comparison with CGAN
[article]
2020
arXiv
pre-print
The standard approach to solve regression problems is to probabilistically model the output y as the sum of a mean function m(x) and a noise term z; it is also usual to take the noise to be a Gaussian. ...
Excellent solutions have been demonstrated mostly in image processing applications which involve large, continuous output spaces. ...
Design of scalable CGAN implementations suited for regression is another key direction. Designing a Bayesian version of CGAN [19] would give great gains on small training datasets. ...
arXiv:1905.12868v5
fatcat:l4jet3kihne3vetwqpzdvcqw7a
Multi-fidelity surrogate modeling for time-series outputs
[article]
2022
arXiv
pre-print
Using an experimental design of the low- and high-fidelity code levels, an original Gaussian process regression method is proposed. ...
The code output is expanded on a basis built from the experimental design. The first coefficients of the expansion of the code output are processed by a co-kriging approach. ...
Gaussian process regression for functional outputs: In this subsection, we address Gaussian process regression for a single-fidelity code with time-series output. ...
arXiv:2109.11374v2
fatcat:jectsrsqcja43clq3l5q7ytiqi
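The entry above expands the time-series output on a basis built from the experimental design and regresses the leading expansion coefficients. A single-fidelity sketch of that pipeline follows (SVD basis plus one GP per retained coefficient); the toy simulator and the truncation rank are assumptions, and the co-kriging step across fidelities is omitted.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Toy time-series simulator: each run returns a curve over n_t time steps (illustrative only).
n_t, t = 100, np.linspace(0, 1, 100)
simulate = lambda x: np.sin(2 * np.pi * (t + x[0])) * np.exp(-x[1] * t)

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(40, 2))             # design of experiments
Y = np.array([simulate(x) for x in X])          # 40 x n_t snapshot matrix

# Basis built from the experimental design (SVD / POD); keep the first r modes.
Y_mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - Y_mean, full_matrices=False)
r = 5
coeffs = (Y - Y_mean) @ Vt[:r].T                # expansion coefficients, 40 x r

# One GP per retained coefficient (the paper co-kriges these across fidelities).
gps = [GaussianProcessRegressor(kernel=ConstantKernel() * RBF([1.0, 1.0])).fit(X, coeffs[:, k])
       for k in range(r)]

# Predict the full time series at a new input by reconstructing from the basis.
x_new = np.array([[0.3, 0.7]])
c_new = np.array([gp.predict(x_new)[0] for gp in gps])
y_new = Y_mean + c_new @ Vt[:r]
```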
Multivariate Gaussian and Student-t process regression for multi-output prediction
2019
Neural computing & applications (Print)
Gaussian process model for vector-valued function has been shown to be useful for multi-output prediction. ...
In this paper, we propose a unified framework which is used not only to introduce a novel multivariate Student-t process regression model (MV-TPR) for multi-output prediction, but also to reformulate the ...
Therefore, the existing methods for multi-output Gaussian process regression cannot be applied to Student-t process regression. ...
doi:10.1007/s00521-019-04687-8
fatcat:qg55jv7q7jf7vdmyromcf7tix4
Coregionalised Locomotion Envelopes - A Qualitative Approach
[article]
2018
arXiv
pre-print
In this short paper we introduce coregionalised locomotion envelopes - a method for multi-dimensional manifold regression, on human locomotion variates. ...
'Sharing of statistical strength' is a phrase often employed in machine learning and signal processing. ...
Coregionalised Gaussian processes: The coregionalised regression model relies upon the use of vector-valued kernels; one of the most common approaches for this regression is the linear model of coregionalisation ...
arXiv:1803.04965v1
fatcat:ze5lxdyecbhcdndza7khy674t4
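The linear model of coregionalisation named in this entry builds the vector-valued kernel as a sum of separable terms, each a coregionalisation matrix times a scalar kernel. A short numpy sketch of assembling such a joint covariance, with assumed coregionalisation vectors and lengthscales:

```python
import numpy as np

def rbf(X1, X2, lengthscale):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# Linear model of coregionalisation: K(x, x') = sum_q B_q * k_q(x, x'),
# with rank-one coregionalisation matrices B_q = a_q a_q^T (values assumed).
X = np.random.default_rng(4).uniform(0, 1, size=(25, 1))
a = [np.array([1.0, 0.9]), np.array([0.5, -0.5])]      # one vector per latent process
lengthscales = [0.1, 0.5]

K_lmc = sum(np.kron(np.outer(a_q, a_q), rbf(X, X, l_q))
            for a_q, l_q in zip(a, lengthscales))
# K_lmc is (2n x 2n): the joint prior covariance of both outputs at the inputs X,
# used exactly like a single-output GP covariance for regression.
```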
Dirichlet-based Gaussian Processes for Large-scale Calibrated Classification
[article]
2018
arXiv
pre-print
In this work, we investigate if and how Gaussian process regression directly applied to the classification labels can be used to tackle this question. ...
To this aim, we propose a novel approach based on interpreting the labels as the output of a Dirichlet distribution. ...
Remark: In the binary classification case, we still have to perform regression on two latent processes. ...
arXiv:1805.10915v1
fatcat:rl3mr53fp5cqppczvg4yuacnvu
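The entry above interprets classification labels as observations of a Dirichlet distribution and then applies GP regression to transformed targets. A minimal sketch roughly following that construction (Gamma marginals moment-matched by lognormals, heteroscedastic GP regression per class) is given below; the toy data, the value of alpha_eps, and the exact-GP implementation are assumptions rather than the paper's large-scale setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Toy binary classification data (illustrative assumption, not the paper's benchmarks).
rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(100, 1))
y = (np.sin(X[:, 0]) + 0.3 * rng.normal(size=100) > 0).astype(int)

# Interpret one-hot labels as Dirichlet observations, alpha_c = y_c + alpha_eps,
# then moment-match each Gamma marginal with a lognormal.
alpha_eps = 0.01
Y_onehot = np.eye(2)[y]
alpha = Y_onehot + alpha_eps
sigma2 = np.log(1.0 / alpha + 1.0)            # per-point, per-class noise variance
y_tilde = np.log(alpha) - 0.5 * sigma2        # transformed regression targets

# One GP per class, with heteroscedastic noise passed through the `alpha` argument.
gps = [GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), alpha=sigma2[:, c])
       .fit(X, y_tilde[:, c]) for c in range(2)]

# Calibrated class probabilities: average the normalized exp(f) over posterior samples.
X_new = np.linspace(-3, 3, 50)[:, None]
samples = np.stack([gp.sample_y(X_new, n_samples=200) for gp in gps])  # (2, 50, 200)
probs = np.exp(samples) / np.exp(samples).sum(axis=0)                  # normalize over classes
p_class1 = probs[1].mean(axis=1)
```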
A Multiway Gaussian Mixture Model based Adaptive Kernel Partial Least Squares Regression Approach for Inferential Quality Predictions of Batch Processes
2012
IFAC Proceedings Volumes
Soft sensor techniques have become increasingly important to provide reliable on-line measurements, facilitate advanced process control and improve product quality in process industries. ...
Then the multiway Gaussian mixture model is estimated with multiple Gaussian clusters in the kernel space. ...
Finally, the concluding remarks are drawn in Section V. ...
doi:10.3182/20120710-4-sg-2026.00086
fatcat:s45fsgbthjborojobo5ebc7fhm
Aerodynamic probe calibration using Gaussian process regression
2020
Measurement science and technology
With the help of statistical methods, more precisely Gaussian process regression, this similarity is exploited in order to use existing calibration data of different probes, reducing the calibration time ...
The number of calibration points in the five-hole probe case is reduced by at least one order of magnitude with comparable reconstruction accuracy. ...
Acknowledgments The authors would like to thank the company Vectoflow GmbH for providing the multi-hole pressure probe calibration data of numerous different probes. ...
doi:10.1088/1361-6501/aba37d
fatcat:33wumttfovhtbef5t3kdoibxoa
Showing results 1 — 15 out of 26,135 results