Motivated by generative-AI-inspired modeling, we investigate the mathematical foundations of neural and signature stochastic differential equations (SDEs). In these models, the coefficients of the SDE are parameterized either by neural networks or by functions of the path signature, leveraging their well-known universal approximation properties. While such classical universal approximation results concern static functions, our focus is on dynamic universal approximation at the level of SDE solutions. We show that any standard SDE, as well as any path-dependent SDE, can be approximated by a suitable neural or signature SDE, respectively. For signature SDEs, this requires the development of a dedicated well-posedness theory. Building on this foundation, we prove in particular that SDEs whose coefficients are entire functions of the signature act as dynamic universal approximators for path-dependent SDEs. Since these signature SDEs exhibit an affine structure with respect to the lifted signature state, this result implies a universality property of affine processes within the class of Itô processes. As an application, we present a signature-based asset price model calibrated jointly to VIX and SPX options.
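To fix ideas, a schematic form of a signature SDE (the notation below is purely illustrative and not taken from the talk) reads
\[
  dX_t \;=\; b\big(\mathbb{X}_t\big)\,dt \;+\; \sigma\big(\mathbb{X}_t\big)\,dW_t, \qquad X_0 = x_0,
\]
where \(W\) is a Brownian motion, \(\mathbb{X}_t\) denotes the signature of the (time-extended) path of \(X\) up to time \(t\), and the coefficients \(b,\sigma\) are, for instance, linear maps or entire functions of the signature coordinates. Roughly speaking, products of signature coordinates can again be expressed as linear functionals of the signature via the shuffle product, which is the source of the affine structure with respect to the lifted signature state mentioned above.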
The talk is based on joint works with Tomas Carrondo, Eva Flonner, Guido Gazzani, Paul Hager, Kevin Kurt, Janka Möller and Sara Svaluto-Ferro.