Keywords: gaussian process, process control, optimization
Statistical modeling is the art of combining mathematical/probabilistic models with data to draw inferences about a real-life system. The structure, volume, and diversity of modern data sources bring out a number of computational and modeling challenges in applying statistical approaches to such data.
This talk will focus on a recent endeavor with a corporate partner to leverage a variety of automatically collected data sources to better manage the supply chain of a large industrial corporation. Our eventual approach borrows concepts from operations research, discrete event simulation, and Bayesian computer model calibration. This work will be compared with past experience in combining physics-based computational models with experimental data to carry out statistical inference for a very different-looking problem.