SMS scnews item created by Munir Hiabu at Tue 4 Jun 2019 1519
Type: Seminar
Distribution: World
Expiry: 21 Jun 2019
Calendar1: 21 Jun 2019 1400-1500
CalLoc1: Carslaw 373
CalTitle1: Low-rank tensor decompositions for sampling of high-dimensional probability distributions
Auth: munir@pmunir.pc (assumed)

# Low-rank tensor decompositions for sampling of high-dimensional probability distributions

### Friday June 21, 2pm, Carslaw 373

Sergey Dolgov (University of Bath, Department of Mathematical Sciences)


Uncertainty quantification and inverse problems in many variables are pressing tasks, yet high-dimensional functions are notoriously difficult to integrate when computing the desired quantities of interest. Functional approximations, in particular the low-rank separation of variables into tensor product decompositions, have become popular for reducing the computational cost of high-dimensional integration to linear scaling in the number of variables. However, tensor approximations may be inefficient for non-smooth functions. Sampling-based Monte Carlo methods are more general, but they can converge very slowly because they overlook the hidden structure of the function.

In this talk we review tensor product approximations for uncertainty quantification and Bayesian inference. These allow efficient integration of smooth PDE solutions, posterior density functions and quantities of interest. Moreover, the low-rank approximation of the density function can be used to construct efficient proposals in MCMC algorithms, as well as importance weights. This combined tensor-approximation/MCMC method remains more accurate even when the quantity of interest is not smooth, such as the indicator function of an event.
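To illustrate the idea of using a low-rank density surrogate as an MCMC proposal, the following is a minimal sketch, not the speaker's actual method or code. It uses the simplest possible "low-rank" object, a rank-1 separable approximation f(x)g(y) built from gridded marginals, as an independence Metropolis-Hastings proposal and as an importance-weighting density. The correlated-Gaussian target, correlation 0.5, grid, and sample size are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.5  # correlation of the illustrative target (assumption)

def pi(x, y):
    """Unnormalised 2D target density (correlated Gaussian, for illustration)."""
    return np.exp(-0.5 * (x**2 - 2 * rho * x * y + y**2) / (1 - rho**2))

# Rank-1 "separation of variables": approximate pi(x, y) by f(x) * g(y),
# built here from the marginals of pi evaluated on a tensor-product grid.
grid = np.linspace(-4.0, 4.0, 201)
P = pi(grid[:, None], grid[None, :])
f = P.sum(axis=1); f /= np.trapz(f, grid)   # marginal density in x
g = P.sum(axis=0); g /= np.trapz(g, grid)   # marginal density in y

def sample_1d(pdf, n):
    """Inverse-CDF sampling of a gridded 1D density."""
    cdf = np.cumsum(pdf); cdf /= cdf[-1]
    return np.interp(rng.random(n), cdf, grid)

def q(x, y):
    """Separable proposal density f(x) * g(y)."""
    return np.interp(x, grid, f) * np.interp(y, grid, g)

# Independence Metropolis-Hastings: propose from q, correct with pi/q,
# so errors in the cheap surrogate are removed by the accept/reject step.
n = 20000
xs, ys = sample_1d(f, n), sample_1d(g, n)
chain = np.empty((n, 2))
chain[0] = xs[0], ys[0]
for k in range(1, n):
    cur, prop = chain[k - 1], (xs[k], ys[k])
    ratio = (pi(*prop) * q(*cur)) / (pi(*cur) * q(*prop))
    chain[k] = prop if rng.random() < ratio else cur

# The same surrogate gives importance weights w = pi/q for reweighting,
# e.g. a self-normalised estimate of E[x^2] (which is 1 for this target).
w = pi(xs, ys) / q(xs, ys)
mean_is = np.average(xs**2, weights=w)
```

In higher dimensions a rank-1 product is usually too crude; the low-rank decompositions discussed in the talk (e.g. tensor trains) play the role of f(x)g(y) here, with sampling done variable by variable from approximate conditionals.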
