It is well known that global solutions to the energy-critical nonlinear Schrödinger equation scatter, and hence approach the linear evolution asymptotically. In this talk, we show that solutions further parallel the linear evolution by exhibiting dispersive decay pointwise in time. Previous work in this direction required solutions to have high regularity and did not establish linear dependence on the initial data. We resolve both issues by using a Lorentz-space refinement of the Strichartz inequalities and establishing global spacetime bounds in Lorentz spaces. This allows us to prove dispersive decay for solutions in the (scaling-critical) energy space and to recover linear dependence on the initial data.
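For concreteness, here is the standard setting the abstract refers to; the dimension restriction and nonlinearity sign are the usual conventions, since the abstract itself does not fix notation. The energy-critical NLS in dimension $d \geq 3$ is

```latex
\[
  i\partial_t u + \Delta u = \pm |u|^{\frac{4}{d-2}} u,
  \qquad u(0) = u_0 \in \dot H^1(\mathbb{R}^d),
\]
```

and "dispersive decay pointwise in time" means an analogue, for the nonlinear flow, of the linear estimate

```latex
\[
  \bigl\| e^{it\Delta} f \bigr\|_{L^\infty_x(\mathbb{R}^d)}
  \lesssim |t|^{-d/2} \, \| f \|_{L^1_x(\mathbb{R}^d)} .
\]
```

The exponent $\tfrac{4}{d-2}$ is what makes the equation energy-critical: the scaling $u(t,x) \mapsto \lambda^{\frac{d-2}{2}} u(\lambda^2 t, \lambda x)$ preserves both the equation and the $\dot H^1$ norm.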