Sparse Tensor Product Approximation for a Class of Generalized Method of Moments Estimators

Michael Griebel (INS, Bonn)

Apr 04, 2022, 10:15–11:10

Generalized Method of Moments (GMM) estimators in their various forms, including the popular Maximum Likelihood (ML) estimator, are frequently applied to evaluate complex econometric models whose moment or likelihood functions cannot be computed analytically. Since the objective functions of GMM and ML estimators are themselves approximations of an integral, more precisely of an expected value over the real-world data space, the question arises whether the approximation of the moment function and the simulation of the entire objective function can be combined.
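To fix ideas, a generic form of such a nested objective (the notation is illustrative and not taken from the talk; $g$ denotes the linking function, e.g. $g=\log$ for ML) is
\[
  Q(\theta) \;=\; \mathbb{E}_{x}\!\left[\, g\!\left( \mathbb{E}_{u}\!\left[ f(x,u;\theta) \right] \right) \right]
  \;\approx\; \frac{1}{N}\sum_{i=1}^{N} g\!\left( \sum_{j=1}^{M} w_j\, f(x_i,u_j;\theta) \right),
\]
where the outer expectation over the data $x$ is replaced by an average over $N$ samples and the inner expectation over the latent variable $u$ by an $M$-point quadrature rule with weights $w_j$; balancing these two discretizations is the combined approximation referred to above.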
Motivated by the popular Probit and Mixed Logit models, we consider double integrals with a linking function that stems from the estimator under consideration, e.g. the logarithm for Maximum Likelihood, and apply a sparse tensor product quadrature to reduce the computational effort for the approximation of the combined integral. Given Hölder continuity of the linking function, we prove that this approach can improve the order of convergence of the classical GMM and ML estimators by a factor of two, even for integrands of low regularity or high dimensionality. This result is illustrated by numerical simulations of Mixed Logit and Multinomial Probit integrals, which are estimated by ML and GMM estimators, respectively.
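As a rough illustration of how combining the two approximation levels can pay off, the following Python sketch balances outer Monte Carlo sampling against inner Gauss-Hermite quadrature in a combination-technique (multilevel) fashion for a toy integrand with a logarithmic linking function. The integrand, the function names, and the level-balancing rule are assumptions made for this example; the sketch is not the sparse tensor product construction, error analysis, or the Mixed Logit/Probit models from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner_integral(x, level):
    """Inner expectation E_u[sigmoid(x + u)], u ~ N(0, 1), approximated by
    Gauss-Hermite quadrature (probabilists' weight) with 2**level + 1 nodes."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(2 ** level + 1)
    vals = 1.0 / (1.0 + np.exp(-(x[:, None] + nodes[None, :])))
    return (vals @ weights) / np.sqrt(2.0 * np.pi)

def combined_estimate(max_level, n_base=64):
    """Combination-technique-style estimate of E_x[log inner_integral(x)]:
    the correction from inner level l-1 to l is averaged over fewer outer
    samples as l grows, so the total cost stays close to that of a single
    level instead of the product of both discretizations."""
    total = 0.0
    for l in range(max_level + 1):
        n_samples = n_base * 2 ** (max_level - l)   # cheap corrections get many samples
        x = rng.normal(size=n_samples)              # outer (data) samples
        fine = np.log(inner_integral(x, l))
        coarse = np.log(inner_integral(x, l - 1)) if l > 0 else 0.0
        total += np.mean(fine - coarse)             # telescoping correction term
    return total

if __name__ == "__main__":
    # Brute-force reference: many outer samples with a fine inner rule.
    x_ref = rng.normal(size=100_000)
    reference = np.mean(np.log(inner_integral(x_ref, 6)))
    print("combined estimate:", combined_estimate(max_level=5))
    print("reference value:  ", reference)
```

Running the sketch, the combined estimate should agree with the brute-force reference up to sampling noise while evaluating far fewer (sample, quadrature node) pairs; the talk quantifies savings of this kind rigorously under Hölder continuity of the linking function.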

Further Information
Venue: ESI Boltzmann Lecture Hall
Associated Event: Adaptivity, High Dimensionality and Randomness (Workshop)
Organizer(s):
Carsten Carstensen (HU Berlin)
Albert Cohen (Sorbonne U, Paris)
Michael Feischl (TU Vienna)
Christoph Schwab (ETH Zurich)