Estimation of multivariate regression functions by overparametrized deep neural networks from i.i.d. data is considered. The estimates are computed by applying backpropagation to a neural network that consists of a large number of fully connected subnetworks computed in parallel. The estimates are overparametrized in the sense that their number of weights is much larger than the number of data points. It is shown that, with a suitable initialization of the weights and a sufficiently large number of gradient descent steps, these estimates can interpolate the training data; in this case they do not generalize well to new independent data, since they fail to achieve the optimal minimax rate of convergence for smooth regression functions. In contrast, it is shown that with a different initialization, far fewer gradient descent steps, and a suitable L_2 regularization, the estimates generalize well to new independent data, since they are universally consistent.
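To make the described estimator concrete, the following is a minimal sketch (not the paper's exact construction) of an estimate formed by averaging many small fully connected networks computed in parallel and fitting all weights jointly by gradient descent on the empirical L_2 risk plus an L_2 penalty on the weights. The number of subnetworks K, their width, the step count, the learning rate, and the penalty strength lam are illustrative assumptions, as is the use of PyTorch and the sigmoid activation.

```python
import torch

def fit_parallel_estimate(X, y, K=100, width=8, steps=200, lr=0.05, lam=1e-3):
    """Fit an average of K small fully connected networks by gradient descent.

    X: (n, d) tensor of covariates, y: (n,) tensor of responses.
    Returns a callable computing the fitted regression estimate.
    """
    n, d = X.shape
    # K fully connected subnetworks, evaluated in parallel and averaged.
    nets = torch.nn.ModuleList(
        torch.nn.Sequential(
            torch.nn.Linear(d, width), torch.nn.Sigmoid(),
            torch.nn.Linear(width, 1),
        ) for _ in range(K)
    )
    opt = torch.optim.SGD(nets.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Average of the subnetwork outputs defines the estimate.
        pred = torch.stack([net(X) for net in nets]).mean(dim=0).squeeze(-1)
        # L_2 regularization: squared Euclidean norm of all weights.
        penalty = sum((p ** 2).sum() for p in nets.parameters())
        loss = ((pred - y) ** 2).mean() + lam * penalty
        loss.backward()  # backpropagation through all subnetworks
        opt.step()
    return lambda x: torch.stack([net(x) for net in nets]).mean(dim=0).squeeze(-1)
```

Setting lam = 0, enlarging the subnetworks, and running many more gradient descent steps corresponds to the interpolation regime discussed above, whereas a few regularized steps from a suitable initialization correspond to the consistent regime.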