We review applications and algorithmic challenges of Gaussian Process (GP) modeling. GP modeling is a powerful and flexible technique for uncertainty quantification and data analysis that enables the construction of complex models without the need to specify algebraic relationships between the variables; instead, one works directly in the space of the kernel, or covariance, matrix. Moreover, because the technique derives from a Bayesian framework, it naturally provides predictive probability distributions. We describe how these features can be exploited in Measurement and Verification (M&V) tasks and in the construction of building surrogate models from detailed physical counterparts. Training a GP model, however, presents a computationally demanding pattern: it requires maximizing a general likelihood function with respect to the kernel hyperparameters, and, since the kernel matrix appears in inverse form, even a single evaluation of that function has cubic complexity in the number of training points. We discuss the technical difficulties arising from this pattern and present some strategies available to date for dealing with them.
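To make the cost structure concrete, the following sketch (an illustrative implementation, not the one used in this work) evaluates the GP negative log marginal likelihood for a squared-exponential kernel. The Cholesky factorization of the kernel matrix dominates the cost at O(n^3) in the number of training points, and this whole routine must be re-evaluated at every step of the hyperparameter optimization; the kernel choice and hyperparameter names here are assumptions for the example.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def rbf_kernel(X1, X2, lengthscale, variance):
    # Squared-exponential kernel: k(x, x') = variance * exp(-||x - x'||^2 / (2 * lengthscale^2))
    sq_dists = (np.sum(X1**2, axis=1)[:, None]
                + np.sum(X2**2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def neg_log_marginal_likelihood(X, y, lengthscale, variance, noise):
    # One evaluation of the GP likelihood: the objective minimized over
    # the kernel hyperparameters (lengthscale, variance, noise).
    n = X.shape[0]
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(n)
    # Cholesky factorization is the O(n^3) step that makes each
    # likelihood evaluation cubic in the number of training points.
    L, lower = cho_factor(K, lower=True)
    alpha = cho_solve((L, lower), y)          # solves K alpha = y
    logdet = 2.0 * np.sum(np.log(np.diag(L))) # log|K| from the Cholesky factor
    return 0.5 * (y @ alpha + logdet + n * np.log(2.0 * np.pi))
```

In practice this function would be handed to a gradient-based optimizer, so training repeats the cubic-cost factorization many times; this is the computational pattern whose mitigation strategies the review discusses.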