Extracting the full scientific potential of the weak lensing data from Euclid and the Vera Rubin Observatory will require analysis methods that are both highly precise and maximally informative. The standard approach, based on two-point statistics, is effective but sub-optimal: it discards information in the data and relies on simplifying assumptions in the physical and statistical modelling. In this talk, I will present a new approach to weak lensing analysis that employs a full physics model and field-level statistics. By analysing lensing maps pixel by pixel, this method makes optimal use of the data and improves the cosmological constraints by up to a factor of five, while also providing a digital copy of the Universe inferred from the observations. I will discuss the current status of this approach and how to meet the challenges of its first application to real data.
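To make the pixel-by-pixel idea concrete, here is a minimal sketch (my own illustration, not the talk's actual pipeline): the data map is compared to a forward-modelled map in every pixel through a Gaussian likelihood, and a single "cosmological" amplitude is inferred by maximising that field-level likelihood. The forward model here is a deliberately simplified stand-in, a known signal template scaled by one parameter, rather than a full lensing simulation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for the forward physics model: the observed map is a
# known signal template scaled by one "cosmological" amplitude A,
# plus pixel noise. (Hypothetical simplification; a real analysis
# would forward-simulate the convergence field from initial conditions.)
npix = 32 * 32
signal_template = rng.standard_normal(npix)  # fixed latent field
true_A, sigma_noise = 1.5, 0.5
data = true_A * signal_template + sigma_noise * rng.standard_normal(npix)

def field_level_loglike(A):
    # Pixel-by-pixel Gaussian likelihood: the modelled map is compared
    # to the data in every pixel, not through compressed summaries.
    resid = data - A * signal_template
    return -0.5 * np.sum(resid**2) / sigma_noise**2

# Scan the parameter and take the maximum-likelihood amplitude.
grid = np.linspace(0.5, 2.5, 2001)
logL = np.array([field_level_loglike(A) for A in grid])
A_hat = grid[np.argmax(logL)]
print(f"true A = {true_A}, field-level estimate = {A_hat:.3f}")
```

In this linear-Gaussian toy the two-point summary happens to be lossless; the gains quoted in the talk come from applying the same field-level principle to the non-Gaussian maps produced by a full physics model, where summary statistics do discard information.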