We first present a framework for solving and learning arbitrary nonlinear PDEs with Gaussian processes (GPs). The method comes with convergence guarantees and inherits the complexity of state-of-the-art solvers for dense kernel matrices. This first part is joint work with Yifan Chen, Bamdad Hosseini, and Andrew Stuart.

We then present an encompassing GP framework for completing arbitrary computational graphs. This general framework is motivated by the observation that most problems in Computational Science and Engineering (CSE) can be formulated as completing (from data) a computational graph representing dependencies between functions and variables, each of which may be known, unknown, or random. Data come in the form of observations of distinct values of a finite number of subsets of the variables of the graph. The underlying problem combines a regression problem (approximating unknown functions) with a matrix completion problem (recovering unobserved variables in the data), and can be viewed as solving nonlinear systems of equations in reproducing kernel Hilbert spaces from limited data and incomplete information.
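As a minimal illustrative sketch of the first framework (not the authors' implementation; the PDE, kernel, lengthscale, and point counts below are assumptions chosen for demonstration), one can solve a 1D nonlinear boundary-value problem by GP collocation: represent the solution as a minimum-RKHS-norm function matching the PDE at collocation points, and handle the nonlinearity with Gauss-Newton linearization, so that each iteration reduces to a linear solve against a dense kernel matrix.

```python
import numpy as np

# Hypothetical example problem (an assumption, not from the text):
#   -u''(x) + u(x)^3 = f(x) on (0, 1),  u(0) = u(1) = 0,
# with exact solution u*(x) = sin(pi x), so f = pi^2 sin(pi x) + sin(pi x)^3.

ell = 0.2  # Gaussian kernel lengthscale (assumed, not tuned)

def k(r):       # Gaussian kernel k(x, y) as a function of r = x - y
    return np.exp(-r**2 / (2 * ell**2))

def k_yy(r):    # d^2 k / dy^2 (even in r)
    return k(r) * (r**2 / ell**4 - 1 / ell**2)

def k_xxyy(r):  # d^4 k / dx^2 dy^2 (even in r)
    return k(r) * (3 / ell**4 - 6 * r**2 / ell**6 + r**4 / ell**8)

M = 25                                   # number of interior collocation points
xi = np.linspace(0, 1, M + 2)[1:-1]      # interior points
X = np.concatenate(([0.0, 1.0], xi))     # boundary + interior (pointwise values)
f = np.pi**2 * np.sin(np.pi * xi) + np.sin(np.pi * xi)**3

# Kernel matrix of the observation functionals: pointwise evaluation at all
# points of X, and evaluation of u'' at the interior points xi.
R1 = X[:, None] - X[None, :]
R2 = X[:, None] - xi[None, :]
R3 = xi[:, None] - xi[None, :]
Theta = np.block([[k(R1),      k_yy(R2)],
                  [k_yy(R2.T), k_xxyy(R3)]])

a = np.zeros(M)  # Gauss-Newton iterate for u at the interior points
for _ in range(8):
    # Linearize u^3 about a: the constraints A z = b on z = (u(X), u''(xi)) are
    #   u(0) = u(1) = 0  and  -u''(xi_i) + 3 a_i^2 u(xi_i) = f_i + 2 a_i^3.
    A = np.zeros((2 + M, M + 2 + M))
    A[0, 0], A[1, 1] = 1.0, 1.0
    for i in range(M):
        A[2 + i, 2 + i] = 3 * a[i]**2    # coefficient of u(xi_i)
        A[2 + i, M + 2 + i] = -1.0       # coefficient of u''(xi_i)
    b = np.concatenate(([0.0, 0.0], f + 2 * a**3))
    # Minimum-RKHS-norm representer subject to A z = b:
    #   z = Theta A^T (A Theta A^T)^{-1} b   (small nugget for stability)
    G = A @ Theta @ A.T
    G += 1e-10 * np.trace(G) / len(G) * np.eye(len(G))
    z = Theta @ A.T @ np.linalg.solve(G, b)
    a = z[2:M + 2]                       # updated u values at interior points

err = np.max(np.abs(a - np.sin(np.pi * xi)))
print(f"max error vs sin(pi x): {err:.2e}")
```

Each Gauss-Newton step costs one dense linear solve of size proportional to the number of collocation points, which is why fast solvers for dense kernel matrices determine the method's overall complexity.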