We discuss the novel multilevel best linear unbiased estimators (BLUEs) introduced in [4]. The goal is to estimate the expectation of a scalar-valued quantity of interest associated with a family of model evaluations. The key idea of multilevel BLUEs is to reformulate the estimation as a generalized linear regression problem. By construction, BLUEs have minimal variance within the class of linear unbiased estimators. By solving a sample allocation problem, we further construct a variance-minimal, linear, and unbiased estimator for a given computational budget. We compare the proposed estimator to other multilevel estimators, such as multilevel Monte Carlo [1], multifidelity Monte Carlo [3], and approximate control variates [2]. In addition, we show that our estimator approaches a sharp lower bound that holds for any linear unbiased multilevel estimator in the infinite low-fidelity data limit. Finally, we specialize these results to PDE-based models which are parameterized by a discretization quantity, e.g., the finite element mesh size. We prove that in this case the complexity of the BLUE is not worse than that of multilevel Monte Carlo [1]. In practice, we observe a complexity reduction for selected random elliptic PDE problems.
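The regression reformulation can be sketched numerically. The following is a minimal illustration, not the estimator of the cited papers: the toy model means, the model covariance C, the sample group sizes, and the design matrix H are all assumptions made here for a two-model setting with one coupled sample group and one extra low-fidelity-only group.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: model 1 (high fidelity) and model 2 (low fidelity),
# strongly correlated Gaussian outputs with assumed means and covariance.
mu_true = np.array([1.0, 0.8])
C = np.array([[1.0, 0.9],
              [0.9, 1.0]])  # assumed model output covariance

# Two sample groups: n_c coupled evaluations of both models,
# plus n_e cheap evaluations of model 2 alone.
n_c, n_e = 50, 5000
L = np.linalg.cholesky(C)
coupled = mu_true + rng.standard_normal((n_c, 2)) @ L.T
extra2 = mu_true[1] + np.sqrt(C[1, 1]) * rng.standard_normal(n_e)

# Observations: the sample mean of each model within each group.
y = np.array([coupled[:, 0].mean(), coupled[:, 1].mean(), extra2.mean()])

# Design matrix H maps the unknown model means (mu_1, mu_2) to the observations,
# giving the linear regression model y = H mu + noise.
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])

# Covariance of the observations: the coupled group means share C / n_c,
# the extra model-2 mean is independent with variance C[1,1] / n_e.
Sigma = np.zeros((3, 3))
Sigma[:2, :2] = C / n_c
Sigma[2, 2] = C[1, 1] / n_e

# Generalized least squares, i.e. the BLUE of mu for this regression model:
# mu_hat = (H^T Sigma^{-1} H)^{-1} H^T Sigma^{-1} y.
Si = np.linalg.inv(Sigma)
mu_hat = np.linalg.solve(H.T @ Si @ H, H.T @ Si @ y)

# The quantity of interest is the high-fidelity mean, mu_hat[0]; the cheap
# model-2 samples reduce its variance through the correlation in C.
print(mu_hat)
```

The sample allocation problem mentioned above then chooses the group sizes (here n_c and n_e) to minimize the variance of the resulting estimator under a cost budget; the group structure itself is part of that optimization.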
[1] M. B. Giles, Multilevel Monte Carlo methods, Acta Numerica, 24, pp. 259-328, 2015.
[2] A. A. Gorodetsky, G. Geraci, M. Eldred, J. D. Jakeman, A Generalized Framework for Approximate Control Variates, J. Comput. Phys., 408, 109257, 2020.
[3] B. Peherstorfer, K. Willcox, M. Gunzburger, Optimal Model Management for Multifidelity Monte Carlo Estimation, SIAM J. Sci. Comput., 38, pp. A3163-A3194, 2016.
[4] D. Schaden, E. Ullmann, On multilevel best linear unbiased estimators, to appear in SIAM/ASA J. Uncert. Quantif.
[5] D. Schaden, E. Ullmann, Asymptotic analysis of multilevel best linear unbiased estimators, submitted.