Kevin Carlberg, Sandia National Labs
Date and Time: Oct 11, 2017 (12:30 PM)
Location: Orchard room (3280) at the Wisconsin Institute for Discovery Building
As physics-based simulation has played an increasingly important role in science and engineering, greater demands are being placed on model fidelity. This high fidelity necessitates fine spatiotemporal resolution, which can lead to extreme-scale models whose simulations consume months on thousands of computing cores. Further, most practical decision-making scenarios (e.g., uncertainty quantification, design optimization) are 'many query' in nature, as they require the (parameterized) model to be simulated thousands of times. This leads to a 'computational barrier': the high cost of extreme-scale simulations renders them impractical for many-query problems. In this talk, I will present several approaches that exploit simulation data to overcome this barrier.
First, I will introduce nonlinear model reduction methods that employ spatial simulation data and subspace projection to reduce the dimensionality of nonlinear dynamical-system models while preserving critical dynamical-system properties such as discrete-time optimality, global conservation, and Lagrangian structure.
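To make the projection idea concrete, here is a minimal sketch of a POD-Galerkin reduced-order model for a linear dynamical system. All names and numbers (the operator `A`, dimensions `n` and `r`, the forward-Euler time stepping) are illustrative assumptions, not the speaker's actual formulation, which targets nonlinear systems and structure preservation.

```python
import numpy as np

# Full-order linear dynamical system dx/dt = A x of dimension n (assumed setup)
rng = np.random.default_rng(0)
n, r = 200, 10
A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))  # stable illustrative operator
x0 = rng.standard_normal(n)

# Collect spatial simulation snapshots via forward-Euler time stepping
dt, steps = 1e-2, 100
X = np.empty((n, steps))
x = x0.copy()
for k in range(steps):
    x = x + dt * (A @ x)
    X[:, k] = x

# POD: leading left singular vectors of the snapshot matrix form the trial basis
U, s, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]                      # n x r basis spanning the dominant snapshot subspace

# Galerkin projection: reduced operator and reduced initial condition
Ar = V.T @ A @ V                  # r x r reduced system matrix
xr = V.T @ x0

# Integrate the low-dimensional model, then lift back to the full space
for k in range(steps):
    xr = xr + dt * (Ar @ xr)
x_rom = V @ xr

rel_err = np.linalg.norm(x_rom - x) / np.linalg.norm(x)
```

The reduced system has dimension r = 10 rather than n = 200, so each time step costs far less; the talk's methods extend this basic pattern to nonlinear models while preserving properties such as conservation and Lagrangian structure.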
Second, I will describe methods for data-driven error modeling, which apply regression methods from machine learning to construct an accurate, low-variance statistical model of the error incurred by model reduction. This quantifies the (epistemic) uncertainty introduced by reduced-order models and enables them to be rigorously integrated into uncertainty-quantification applications.
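The basic regression idea can be sketched as follows. Assume we have recorded the reduced-order model's error norm at a set of training parameter values and fit a regression model that predicts the error at unseen parameters. The error trend, noise level, and choice of a simple polynomial regressor are all hypothetical stand-ins for the machine-learning regressors discussed in the talk.

```python
import numpy as np

# Hypothetical training data: ROM error norms observed at sampled parameters mu
rng = np.random.default_rng(1)
mu_train = np.linspace(0.0, 1.0, 40)
true_error = lambda mu: 1e-3 * (1.0 + 5.0 * mu**2)    # assumed error trend
e_train = true_error(mu_train) * (1.0 + 0.05 * rng.standard_normal(40))

# Fit a quadratic regression model of the ROM error in the parameter
coeffs = np.polyfit(mu_train, e_train, deg=2)

# Predict the error at a new, unseen parameter value
mu_new = 0.7
e_pred = np.polyval(coeffs, mu_new)
```

A statistical error model of this kind supplies not just a point estimate but (with richer regressors) a predictive distribution, which is what allows the ROM's epistemic uncertainty to be propagated through an uncertainty-quantification workflow.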
Finally, I will present data-driven numerical solvers that use simulation data to improve the performance of linear/nonlinear solvers and time-integration methods. I will present one such approach: an adaptive-discretization method that applies Krylov-subspace iteration or h-adaptivity to enrich an initial solution subspace extracted from spatial simulation data.
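One way to picture subspace enrichment is the following sketch for a linear system A x = b: start from a small seed subspace (standing in for a basis extracted from spatial simulation data), solve a least-squares problem in that subspace, and repeatedly enrich the basis with the orthogonalized residual direction, a Krylov-type iteration. The operator, seed basis, and tolerances are illustrative assumptions, not the specific adaptive-discretization method of the talk.

```python
import numpy as np

# Linear system A x = b (assumed well-conditioned illustrative operator)
rng = np.random.default_rng(2)
n = 100
A = 2.0 * np.eye(n) + 0.05 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Seed subspace: stand-in for a basis extracted from prior simulation data
V, _ = np.linalg.qr(rng.standard_normal((n, 3)))

for _ in range(40):
    # Best approximation in span(V): minimize ||A V y - b|| over y
    y, *_ = np.linalg.lstsq(A @ V, b, rcond=None)
    x = V @ y
    r = b - A @ x
    if np.linalg.norm(r) < 1e-8 * np.linalg.norm(b):
        break
    # Enrich the subspace with the residual direction (Krylov-style),
    # orthogonalized against the current basis
    r -= V @ (V.T @ r)
    V = np.hstack([V, (r / np.linalg.norm(r))[:, None]])

rel_res = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
```

When the seed subspace already captures much of the solution, few enrichment steps are needed, which is precisely the payoff of reusing simulation data inside the solver.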