We will be concerned with two application areas where low-rank tensor formats can be used beneficially. First, sampling from high-dimensional probability measures is a common task in fields such as Uncertainty Quantification (UQ) and Generative Modelling (GM). In GM in particular, the use of reverse-time diffusion processes that depend on the log-densities of Ornstein-Uhlenbeck forward processes has become popular. It has recently been shown that these log-densities can be obtained by solving a Hamilton-Jacobi-Bellman (HJB) equation known from stochastic optimal control. We propose a direct time integration of this HJB equation with functional Tensor Trains (TT) for the spatial discretisation. Crucially, this method is sample-free, agnostic to normalisation constants, and can alleviate the curse of dimensionality.
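As a toy illustration of the reverse-time diffusion sampling that motivates this setting (not the TT-based HJB solver itself), the sketch below integrates the reverse Ornstein-Uhlenbeck SDE for a one-dimensional Gaussian data distribution, where the score (gradient of the log-density) is available in closed form; all parameter choices (`sigma0_sq`, `T`, `dt`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward OU process: dX = -X dt + sqrt(2) dW, with stationary law N(0, 1).
# For Gaussian data X_0 ~ N(0, sigma0_sq) the marginal stays Gaussian:
#   p_t = N(0, var_t),  var_t = sigma0_sq * exp(-2t) + (1 - exp(-2t)),
# so the score grad log p_t(x) = -x / var_t is known in closed form here.
sigma0_sq, T, dt = 0.25, 2.0, 1e-3

def score(x, t):
    var_t = sigma0_sq * np.exp(-2 * t) + (1 - np.exp(-2 * t))
    return -x / var_t

# Reverse-time SDE, integrated forward in tau = T - t with Euler-Maruyama:
#   dY = [Y + 2 * score(Y, T - tau)] dtau + sqrt(2) dW.
n_samples = 20000
y = rng.normal(0.0, 1.0, n_samples)  # initialise near the stationary law N(0, 1)
for k in range(int(T / dt)):
    t = T - k * dt
    y += (y + 2 * score(y, t)) * dt + np.sqrt(2 * dt) * rng.normal(size=n_samples)

print(y.var())  # close to sigma0_sq = 0.25: samples follow the data law again
```

In high dimensions the score is no longer available analytically, which is where solving the HJB equation for the log-density with a functional TT ansatz comes in.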
Second, we will look at simulating variational quantum circuits, which is key to benchmarking and optimising near-term quantum devices. We propose a tensor network method that addresses limitations of the widely used time-evolving block decimation (TEBD) algorithm, which suffers from cumulative truncation errors and handles long-range gates inefficiently because they require SWAP insertions. Specifically, we evolve matrix product states (MPS) via the time-dependent variational principle (TDVP), projecting the generator of each gate onto the MPS tangent space and enabling more reliable large-scale circuit simulations.
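For context, the following minimal numpy sketch shows the standard TEBD-style two-site update (contract neighbouring sites, apply the gate, SVD, truncate); the truncation step is the source of the cumulative error mentioned above. The helper names and the `chi_max` parameter are illustrative, not part of the proposed method.

```python
import numpy as np

def apply_one_site(mps, gate, i):
    # gate: (d, d); each MPS tensor has indices (left, physical, right)
    mps[i] = np.einsum('qp,lpr->lqr', gate, mps[i])

def apply_two_site(mps, gate, i, chi_max):
    # TEBD-style update on sites (i, i+1): contract, apply gate, SVD, truncate
    theta = np.einsum('lpm,mqr->lpqr', mps[i], mps[i + 1])  # merge two sites
    theta = np.einsum('abpq,lpqr->labr', gate, theta)  # gate: (out, out, in, in)
    chi_l, d, _, chi_r = theta.shape
    U, s, Vh = np.linalg.svd(theta.reshape(chi_l * d, d * chi_r),
                             full_matrices=False)
    k = min(chi_max, len(s))  # truncation: the source of TEBD's cumulative error
    mps[i] = U[:, :k].reshape(chi_l, d, k)
    mps[i + 1] = (np.diag(s[:k]) @ Vh[:k]).reshape(k, d, chi_r)

# Prepare a Bell state from |00>: Hadamard on qubit 0, then CNOT(0, 1).
zero = np.zeros((1, 2, 1)); zero[0, 0, 0] = 1.0
mps = [zero.copy(), zero.copy()]
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]]).reshape(2, 2, 2, 2)
apply_one_site(mps, H, 0)
apply_two_site(mps, CNOT, 0, chi_max=2)

psi = np.einsum('lpm,mqr->lpqr', mps[0], mps[1]).reshape(4)
print(psi)  # the Bell state (|00> + |11>) / sqrt(2)
```

A gate between distant qubits must first be brought next-neighbour via SWAP gates in this scheme, each costing further updates and truncations; the TDVP approach sidesteps this by projecting the gate generator onto the MPS tangent space instead.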