V&V Webinar Series

Verification and Validation of Computational Modeling of Materials and Structures

Live Events: October 13, 15, 20, and 22, 2020

This webinar series is free for members and non-members.

The webinar series explores the latest TMS science and technology accelerator study, “Accelerating the Broad Implementation of Verification & Validation in Computational Models of the Mechanics of Materials and Structures,” published in October 2020. The report, which was funded by the National Science Foundation, recognizes the value and tremendous potential of adopting V&V practices, along with the related practice of uncertainty quantification (UQ), to deliver efficiencies and advancements to product development and manufacturing procedures.

Learning Objectives

  • Discuss why V&V is a critical step in modeling and simulation
  • Summarize and provide examples of code and solution verification
  • Highlight the importance of experiments and experimentalists in the V&V process
  • Summarize validation and the critical role of uncertainty quantification

During four sessions on October 13, 15, 20, and 22, expert presenters and moderators will examine how V&V can deliver efficiencies and advancements to product development and manufacturing procedures. This webinar series introduces the importance of verification and validation (V&V) in modeling and simulation and summarizes the six V&V steps: 1) Code verification; 2) Preliminary calculations and design of validation experiments; 3) Solution/calculation verification; 4) Uncertainty quantification of simulations; 5) Uncertainty quantification of experiments; and 6) Validation. Participants will gain an overview of V&V through presentations by experts using real-world examples.

Registration

Cost
Member: Free
Nonmember: Free (regularly $100)

Session 1 - Tuesday, October 13, 2020, 1 pm to 2 pm ET

Moderator

George Spanos, TMS

Presenter

Michael Tonks

Michael Tonks, University of Florida
"Making materials modeling and simulation less wrong and more useful: An introduction to verification and validation"

Predictive computational models can enable large time and cost savings in the development of new products and materials. However, these benefits are only possible when the models are accurate. As the statistician George E. P. Box said, “All models are wrong, but some are useful.” In this webinar series, we will discuss how verification and validation (V&V) can be used to make our models and simulation tools less wrong and more useful. In this introductory lecture, I will present the basic concepts of V&V and explain why it is important to apply them to materials modeling. I will also introduce the new TMS report: "Accelerating the Broad Implementation of Verification and Validation in Computational Models of the Mechanics of Materials and Structures." Finally, I will outline the other talks that will be given as part of this webinar series.

Bio
Michael Tonks is an associate professor of materials science and engineering at the University of Florida (UF). Prior to joining UF in Fall 2017, he was an assistant professor of nuclear engineering at the Pennsylvania State University for two years and a staff scientist in the Fuels Modeling and Simulation Department at Idaho National Laboratory for six years. Tonks was the original creator of the mesoscale fuel performance tool MARMOT and led its development for five years. He helped to pioneer the approach taken in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program of using multiscale modeling and simulation (both atomistic and mesoscale) to inform the development of materials models for the BISON tool that are based on microstructure rather than burn-up. For this work he won the NEAMS Excellence Award in 2014 and the Presidential Early Career Award for Scientists and Engineers in 2017. His research uses mesoscale modeling and simulation coupled with experimental data to investigate the impact of irradiation-induced microstructure evolution on material performance. He is also investigating and applying advanced methods for verification and validation with statistical uncertainty quantification.

Session 2 - Thursday, October 15, 2020, 1 pm to 2 pm ET

Moderator

George Spanos, TMS

Presenters

Michael Tonks

Michael Tonks, University of Florida
"Code Verification: The foundation of the verification and validation process"

Computational codes are numerical implementations of materials models that can be used to simulate a variety of applications. However, implementation errors (bugs) and numerical errors decrease the accuracy of the code. In code verification, we determine whether the computer code accurately represents the underlying model; i.e., whether the code solves the modeling equations correctly. It is the first task in any verification and validation (V&V) process and lays the foundation for the rest of V&V. In this webinar, I will describe why code verification is essential and demonstrate its basic elements, including verification against known analytical solutions and verification using the Method of Manufactured Solutions.
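
As a concrete illustration of the kind of exercise the talk covers, the short Python sketch below (not taken from the webinar; the solver, the manufactured solution, and the grid sizes are all illustrative choices) applies the Method of Manufactured Solutions to a 1D finite-difference Poisson solver and checks that the observed order of accuracy approaches the scheme's formal second-order rate.

```python
# Minimal code-verification sketch (illustrative, not from the webinar):
# use the Method of Manufactured Solutions to check that a 1D finite-difference
# solver for -u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0 converges at its
# formal second-order rate.
import numpy as np

def solve_poisson(f, n):
    """Solve -u'' = f with homogeneous Dirichlet BCs on n interior points."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    # Standard tridiagonal central-difference operator for -u''.
    A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return x, np.linalg.solve(A, f(x))

# Manufactured solution u(x) = sin(pi x)  =>  f(x) = pi^2 sin(pi x).
u_exact = lambda x: np.sin(np.pi * x)
f_mms = lambda x: np.pi**2 * np.sin(np.pi * x)

errors, sizes = [], [16, 32, 64, 128]
for n in sizes:
    x, u_h = solve_poisson(f_mms, n)
    errors.append(np.max(np.abs(u_h - u_exact(x))))

# Observed order of accuracy between successive refinements; it should
# approach 2 if the discretization is implemented correctly.
for k in range(1, len(sizes)):
    p = np.log(errors[k - 1] / errors[k]) / np.log((sizes[k] + 1) / (sizes[k - 1] + 1))
    print(f"n = {sizes[k]:4d}  error = {errors[k]:.3e}  observed order ≈ {p:.2f}")
```

An observed order that falls well below the formal order would point to an implementation bug or an inconsistent discretization.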

Brandon Wilson

Brandon Wilson, Los Alamos National Lab
"Solution Verification: Quantifying the Numerical Uncertainty of a Simulation"

Numerical errors are inherent to numerical models and simulations. For numerical solutions of a discretized set of partial differential equations (PDEs), common sources of numerical error include discretization error, round-off error, and iterative convergence error. However, as numerical implementations become more complex (e.g., adaptive mesh refinement, sub-grid models, particle-based models, and coupled physics models), additional sources of numerical error will affect the solution.

Solution verification is the process of quantifying the numerical uncertainties and errors that exist for a particular implementation of a numerical model. In this talk, I will motivate the need for solution verification and summarize existing methods to quantify common numerical uncertainties for standard numerical solutions of discretized PDEs (e.g., finite element analysis, computational fluid dynamics). I will then address alternate numerical errors relevant to more complex numerical models and highlight methods to characterize these.
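
For context, the sketch below works through one common solution-verification calculation, Richardson extrapolation with a grid convergence index, for a scalar quantity of interest computed on three systematically refined grids; the grid sizes and QoI values are hypothetical placeholders, not results from the talk.

```python
# Hedged solution-verification sketch: estimate the observed order of
# convergence, a Richardson-extrapolated value, and a Grid Convergence
# Index (GCI) numerical-uncertainty band from three grid solutions of a
# scalar quantity of interest. All numbers below are placeholders.
import math

# Representative cell sizes (coarse -> fine) and the QoI on each grid.
h = [0.04, 0.02, 0.01]            # constant refinement ratio r = 2
q = [1.1200, 1.0480, 1.0280]      # hypothetical QoI values

r = h[0] / h[1]
# Observed order of accuracy from the three solutions.
p = math.log(abs(q[0] - q[1]) / abs(q[1] - q[2])) / math.log(r)

# Richardson-extrapolated (approximately grid-independent) QoI.
q_extrap = q[2] + (q[2] - q[1]) / (r**p - 1.0)

# Grid Convergence Index on the fine grid (factor of safety Fs = 1.25).
Fs = 1.25
gci_fine = Fs * abs((q[1] - q[2]) / q[2]) / (r**p - 1.0)

print(f"observed order p ≈ {p:.2f}")
print(f"extrapolated QoI ≈ {q_extrap:.4f}")
print(f"fine-grid numerical uncertainty (GCI) ≈ {100 * gci_fine:.2f}%")
```

The GCI converts the estimated discretization error into a numerical uncertainty band that can later be carried into the validation comparison.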

Bio
Brandon received his Ph.D. in Mechanical Engineering from Utah State University. He joined Los Alamos National Laboratory in 2012 as a postdoctoral researcher in experimental fluid dynamics and now contributes to several VVUQ projects as a scientist and acting group lead in LANL's VVUQ program. During his career at LANL, Brandon has acquired a diverse background, having been part of experimental (P), engineering (W), and computational physics (XCP) organizations. He has been involved with Richtmyer-Meshkov measurements on the vertical shock tube, engineering analysis, development of the high-energy-density physics validation suite, and uncertainty quantification (UQ) of materials aging/self-irradiation effects. Brandon has brought his broad experience to bear on the development and application of UQ methods for experimental facilities, diagnostics, engineering systems, and complex physical processes. His research interests include development of validation, UQ, and multi-fidelity modeling methods.

Session 3 - Tuesday, October 20, 2020, 1 pm to 2 pm ET

Moderator

Michael Tonks, University of Florida

Presenters

Jacob Hochhalter

Jacob Hochhalter, University of Utah
"Designing validation experiments: combining modeler and experimentalist perspectives"

This session discusses the preliminary calculations and design of validation experiments, an aspect of the Verification & Validation in Computational Modeling of Materials and Structures study. This experiment-design stage comes early in the V&V process and requires close collaboration between modelers and experimentalists. The primary objective of this stage is to design an experiment in which the quantities of interest (QoIs) that will serve as the validation basis are defined. More specifically, the modeler must specify those QoIs along with all model-prediction assumptions and physical limitations to their validity. Together with the modeler, the experimentalist is then tasked with designing an experiment in which those QoIs can be reliably measured and compared directly with the model predictions. The discussion of the main concepts is complemented with a few examples from practice to provide context.

Bio
Jacob Hochhalter joined the University of Utah as an assistant professor of mechanical engineering in 2018, where he initiated the Materials Prognosis from Integrated Modeling & Experiment (M’) Lab to study emergent structural and material prognosis issues that involve the multiscale and stochastic nature of plasticity and fatigue cracking in structural materials. At the same time, he joined AS&M as a senior engineer. In that role, he consults for the NASA Engineering and Safety Center (NESC) Materials Technical Discipline Team on material microstructure, fatigue, and fracture related assessments. Prior to that transition, from 2009-2018, he served as a Civil Servant Materials Research Engineer and technical lead for the Damage Science Group at NASA Langley Research Center. He earned a PhD from Cornell University in 2010 where he studied microscale fatigue crack initiation mechanisms as a NASA Graduate Student Research Fellow. Hochhalter has received the NASA Group Achievement Award for work on environmentally-assisted cracking, the NESC Engineering Excellence Award for the development of innovative test and analysis techniques for the certification of fracture-critical components, and the NASA Early Career Achievement Medal. He is also an active member of TMS, where he serves on the Mechanical Behavior of Materials, Additive Manufacturing, and ICME Committees.

Brandon Wilson

Brandon Wilson, Los Alamos National Lab
"Characterizing Reality in the Presence of Experimental and Validation Uncertainty"

Validation assessments rely on experiments as a high-fidelity reference to reality. Although “experimental measurements are the most faithful reflections of reality” [1], it must be recognized that all experimental measurements are uncertain. That uncertainty is inherited by a validation assessment and must be characterized; the quality of a validation assessment is only as good as the quality of the experiment and of the experimental uncertainty quantification. This talk outlines best practices for designing validation experiments [2-4]. The intention of these best practices is to 1) reduce errors and uncertainties in the experiment, 2) quantify uncertainties in measured outputs, 3) characterize uncertain conditions used as inputs to the validation simulation, and 4) determine correlated uncertainties between the output, facility, diagnostics, and experimental conditions. A minimal sketch illustrating item 2 follows the references below.

[1] Oberkampf, W. L., Trucano, T. G., & Hirsch, C. (2004). Verification, validation, and predictive capability in computational engineering and physics. Appl. Mech. Rev., 57(5), 345-384. https://doi.org/10.1115/1.1767847
[2] Oberkampf, W. L., & Roy, C. J. (2010). Verification and validation in scientific computing. Cambridge University Press.
[3] Oberkampf, W. L., & Smith, B. (2014). Assessment criteria for computational fluid dynamics validation benchmark experiments. In 52nd Aerospace Sciences Meeting (p. 0205).
[4] Wilson, B. M., Meadors, G. D., Koskelo, A. K., & Kline, J. (2021). Best practices for using high-energy-density physics experiments in validation. In preparation for Phys. Plasmas.
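
As noted above, here is a minimal sketch of item 2, quantifying uncertainties in a measured output; the measurement model, instruments, and uncertainty values are hypothetical and are not taken from the talk or the references.

```python
# Generic Monte Carlo sketch of experimental uncertainty quantification.
# An output QoI is computed from several measured inputs, each with an
# assumed (normal) instrument uncertainty, and the resulting spread
# characterizes the uncertainty that a validation assessment inherits
# from the experiment. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Measured inputs: nominal value plus assumed instrument uncertainty.
force = rng.normal(12.5e3, 0.10e3, n)        # load cell, N
width = rng.normal(10.0e-3, 0.05e-3, n)      # caliper, m
thickness = rng.normal(2.00e-3, 0.02e-3, n)  # caliper, m

# Derived QoI used for validation: engineering stress at the measured load.
stress = force / (width * thickness)

mean, std = stress.mean(), stress.std(ddof=1)
lo, hi = np.percentile(stress, [2.5, 97.5])
print(f"stress = {mean/1e6:.1f} MPa ± {std/1e6:.1f} MPa (1 sigma)")
print(f"95% interval: [{lo/1e6:.1f}, {hi/1e6:.1f}] MPa")
```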

Session 4 - Thursday, October 22, 2020, 12 pm to 1 pm ET

Moderator

Michael Tonks, University of Florida

Presenters

Wei Chen

Wei Chen, Northwestern University
"Model Calibration and Uncertainty Quantification in Integrated Computational Materials Engineering"

In this talk, we will introduce a stochastic model calibration and uncertainty quantification framework that simultaneously identifies the model parameter uncertainty and the method uncertainty through concurrent calibration and bias correction. To apply this approach to multiscale simulations in Integrated Computational Materials Engineering (ICME), we introduce a physics-informed modular Bayesian approach in which the lack of experimental and simulation resources is addressed by enforcing certain physical constraints on the functional forms of the simulator and its potential discrepancies in the Bayesian analyses. The developed approach can estimate the calibration parameters of a constitutive law while considering its potential discrepancies with the true physical system. The corrected constitutive law is subsequently used to model the multiscale preforming process in manufacturing carbon composites. The same model validation and uncertainty quantification approach can be applied to ICME of other multiscale materials systems.
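
The sketch below is only a structural illustration, under strong simplifying assumptions, of the idea of concurrent calibration and bias correction: a one-parameter simulator and a simple parametric discrepancy term are jointly inferred from synthetic data with a brute-force grid posterior. The speaker's physics-informed modular Bayesian framework is far more general (for example, using Gaussian-process discrepancy models and physical constraints); every function and number here is an assumption for illustration.

```python
# Toy calibration-with-bias-correction sketch (illustrative only):
# observations = simulator(theta) + discrepancy + noise.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "experiment": the true response has a feature the simulator lacks.
x_obs = np.linspace(0.0, 1.0, 15)
y_true = 2.0 * x_obs + 0.3 * x_obs**2          # reality (unknown in practice)
sigma_e = 0.05                                  # measurement noise std
y_obs = y_true + rng.normal(0.0, sigma_e, x_obs.size)

def simulator(x, theta):
    """Imperfect constitutive model: linear in x, missing the quadratic term."""
    return theta * x

# Joint grid posterior over (theta, a), where delta(x) = a * x**2 stands in
# for a more flexible (e.g., Gaussian-process) discrepancy model.
thetas = np.linspace(1.5, 2.5, 201)
a_vals = np.linspace(-0.5, 1.0, 151)
log_post = np.empty((thetas.size, a_vals.size))
for i, th in enumerate(thetas):
    for j, a in enumerate(a_vals):
        resid = y_obs - simulator(x_obs, th) - a * x_obs**2
        log_post[i, j] = -0.5 * np.sum((resid / sigma_e) ** 2)  # flat priors

post = np.exp(log_post - log_post.max())
post /= post.sum()

theta_mean = np.sum(post.sum(axis=1) * thetas)
a_mean = np.sum(post.sum(axis=0) * a_vals)
print(f"posterior mean theta ≈ {theta_mean:.3f} (true slope 2.0)")
print(f"posterior mean discrepancy coefficient ≈ {a_mean:.3f} (true 0.3)")
```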

Bio
Wei Chen is the Wilson-Cook Professor in Engineering Design and Chair of the Department of Mechanical Engineering at Northwestern University. Directing the Integrated DEsign Automation Laboratory (IDEAL; http://ideal.mech.northwestern.edu/), her current research involves simulation-based design under uncertainty; model validation and uncertainty quantification; data science in design and advanced manufacturing; stochastic multiscale analysis and materials design; design of metamaterials; multidisciplinary design optimization; and consumer choice modeling and decision-based design. She is a member of the National Academy of Engineering (NAE) and currently serves as Editor-in-Chief of the ASME Journal of Mechanical Design and President of the International Society for Structural and Multidisciplinary Optimization (ISSMO). Chen was the recipient of the 2015 ASME Design Automation Award and the 1996 NSF Faculty Career Award. She received her Ph.D. from the Georgia Institute of Technology in 1995.

Michael Groeber

Michael Groeber, The Ohio State University
"A Survey of Validation Exercises for Additive Manufacturing Models"

This presentation will provide an overview of a series of validation experiments used to assess additive manufacturing models. It will discuss choices made in experiment design, data collection and processing, and methods for comparing experiments and simulations. Examples will cover process-to-structure and structure-to-property predictions, at both ‘macro’ and ‘micro’ scales.

Bio
Dr. Michael A. Groeber is the Faculty Director of the Artificially Intelligent Manufacturing Systems (AIMS) Lab and an Associate Professor of Integrated Systems Engineering and Mechanical and Aerospace Engineering at The Ohio State University. His research projects focus on the quantification and representation of microstructure for improving process and property modeling, and his recent work aims to develop understanding of local processing conditions and their effects on material structure in metallic AM. His passion and vision is the integration of data analytics and optimization with manufacturing processes to advance the U.S. manufacturing community. Dr. Groeber is also a principal developer/inventor of SIMPL and DREAM.3D, a unique software library and application package that integrates a flexible, hierarchical data structure with numerous digital microstructure, image processing, and data analytics tools to facilitate the advancement of Integrated Computational Materials Engineering (ICME). He has also developed autonomous, multi-modal data collection systems that integrate real-time analysis into the collection process. Prior to joining OSU, Dr. Groeber worked for 10 years in the metals processing and characterization, sensing, and analytics research groups within the Air Force Research Laboratory. He received his B.S. and Ph.D. degrees in Materials Science and Engineering from The Ohio State University in 2003 and 2007, respectively.