Characterising intractable high-dimensional random variables is one of the fundamental challenges in stochastic computation, with broad applications in statistical physics, machine learning, uncertainty quantification, econometrics, and beyond. The recent surge of interest in transport maps offers a mathematical foundation and new insights for tackling this challenge. In this talk, we will present a tensor-train (TT) based, order-preserving construction of the inverse Rosenblatt transport in high dimensions, which characterises intractable random variables via couplings with tractable reference random variables. By integrating the TT-based approach into a nested approximation framework inspired by deep neural networks, we significantly expand its capability to handle random variables with complicated nonlinear interactions and concentrated density functions. We demonstrate the efficacy of the resulting deep inverse Rosenblatt transport (DIRT) on a range of applications in statistical learning and uncertainty quantification, including parameter estimation for dynamical systems, PDE-constrained inverse problems, and Bayesian filtering.
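To make the coupling idea concrete, the following is a minimal sketch of an inverse Rosenblatt (Knothe-Rosenblatt) transport in two dimensions, computed by brute-force quadrature on a grid rather than by the TT construction of the talk. The banana-shaped target density and all grid sizes below are illustrative assumptions, not part of the presented method: uniform reference samples on the unit square are pushed through the inverse marginal CDF of the first coordinate and the inverse conditional CDF of the second, which is exactly the order-preserving coupling the transport realises.

```python
import numpy as np

def target_density(x1, x2):
    # Hypothetical intractable target (unnormalised, banana-shaped):
    # x1 ~ N(0, 1), x2 | x1 ~ N(x1**2, 0.5**2).
    return np.exp(-0.5 * x1**2) * np.exp(-0.5 * ((x2 - x1**2) / 0.5) ** 2)

# Tabulate the density on a tensor grid (feasible only in low dimensions;
# the TT approach replaces this table by a compressed tensor-train format).
grid = np.linspace(-4.0, 6.0, 801)
X1, X2 = np.meshgrid(grid, grid, indexing="ij")
pi = target_density(X1, X2)

# Marginal CDF of x1 (normalised cumulative sum over the grid).
p1 = pi.sum(axis=1)
F1 = np.cumsum(p1)
F1 /= F1[-1]

# Conditional CDFs of x2 given each grid value of x1.
F2 = np.cumsum(pi, axis=1)
F2 /= F2[:, -1:]

# Tractable reference: uniform samples on [0, 1]^2.
rng = np.random.default_rng(0)
u = rng.uniform(size=(5000, 2))

# Inverse Rosenblatt map: invert F1 first, then the matching conditional CDF.
# Both inversions are monotone, so the overall map is order-preserving.
x1 = np.interp(u[:, 0], F1, grid)
idx = np.searchsorted(grid, x1).clip(0, len(grid) - 1)
x2 = np.array([np.interp(ui, F2[i], grid) for ui, i in zip(u[:, 1], idx)])

samples = np.column_stack([x1, x2])  # approximately distributed as the target
```

The resulting `samples` follow the target up to grid-discretisation error; since `x1` is approximately standard normal, the sample mean of `x2` should be close to `E[x1**2] = 1`. The curse of dimensionality in tabulating `pi` is what the TT decomposition is designed to avoid.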