The course will include six half-day virtual modules with supporting materials. Registrants receive access to all course materials as well as to the TMS science and technology accelerator study report, Accelerating the Broad Implementation of Verification & Validation in Computational Models of the Mechanics of Materials and Structures. Recordings and materials remain available to registrants online until September 26, 2022.
The schedule is in Eastern Daylight Time (UTC-4:00). Use the Time Zone Converter to translate event times into your local time zone.
Mon., Aug. 22, 9 a.m.–Noon: Michael Tonks (lead)
Mon., Aug. 22, 1 p.m.–4 p.m.: Michael Tonks (lead)
Tues., Aug. 23, 9 a.m.–Noon: Sankaran Mahadevan (lead)
Tues., Aug. 23, 1 p.m.–4 p.m.: Brandon M. Wilson (lead)
Wed., Aug. 24, 9 a.m.–Noon: Jacob Hochhalter (lead), Anthony D. Rollett, Zachary Harris
Wed., Aug. 24, 1 p.m.–4 p.m.: David Moorcroft (lead), Kenneth Aycock, Joshua Kaizer
Go to the Instructors section to view bios and learn more about their research and professional experience.
Module 1: How to Design and Implement Robust Verification and Validation Practices
Instructor: Michael Tonks
This module will instruct attendees on why V&V is critical for computational modeling of materials and structures and how to design and implement V&V practices. We will discuss the six process steps common to any V&V approach, as outlined in the TMS V&V study report, and work through planning those six steps for materials applications.
- Understand why a robust V&V approach is essential for computational modeling of materials and structures
- Learn the basics of a V&V approach, including the six V&V process steps and how they interact
- Gain practical experience in planning the six V&V process steps
Format: Live instruction. Breakout sessions for hands-on planning of the six process steps. Survey.
Module 2: Code and Solution Verification
Instructor: Michael Tonks
This module will instruct attendees on how to carry out code and solution verification. We will begin with code verification, in which simulation results are compared to analytical and manufactured solutions. We will then learn how to carry out solution verification to estimate the discretization error in a simulation that will be validated.
- Learn the theory behind code verification, including the method of manufactured solutions
- Gain practical experience carrying out code verification
- Learn the theory behind solution verification
- Gain practical experience carrying out solution verification
Format: Live instruction and hands-on activities.
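One common solution-verification technique is Richardson extrapolation, which uses a quantity of interest computed on systematically refined grids to estimate the observed order of convergence and the remaining discretization error. The sketch below is a generic illustration, not taken from the course materials; the function names and the three grid values are hypothetical.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of convergence from a quantity of interest
    computed on three grids with constant refinement ratio r."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_error(f_medium, f_fine, r, p):
    """Richardson estimate of the discretization error remaining
    in the fine-grid solution, given observed order p."""
    return (f_medium - f_fine) / (r**p - 1)

# Hypothetical quantity of interest on grids of spacing h, h/2, h/4
f1, f2, f3 = 1.120, 1.030, 1.0075   # coarse, medium, fine
p = observed_order(f1, f2, f3, r=2.0)      # about 2, i.e. second order
err = richardson_error(f2, f3, r=2.0, p=p) # about 0.0075
```

The estimated error can then be reported as a numerical uncertainty on the fine-grid result when that simulation is compared against validation data.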
Module 3: Computational Uncertainty Quantification
Instructor: Sankaran Mahadevan
This module will cover the quantification of uncertainty in computational models. Sources of model uncertainty can be considered in three groups:
- Model form assumptions and simplifications that arise in deriving the mathematical model form;
- Uncertainty regarding model parameters, which are typically unmeasurable and are inferred from measured quantities; and
- Solution approximation errors that arise in numerical solutions of the model equations, such as finite element discretization, reduced-order models, etc.
The module will discuss computational techniques to quantify these sources of uncertainty, and relate these techniques to model verification, validation, and calibration. Next, the module will discuss methods to aggregate and propagate the different uncertainty sources through the computational model in order to quantify the uncertainty in the model prediction. The module will also discuss uncertainty sensitivity analysis, i.e., how to quantify the relative contributions of different uncertainty sources to the overall uncertainty in the model output. Several engineering examples will be used to illustrate the uncertainty quantification techniques.
- Learn about the sources of uncertainty in computational models.
- Learn about uncertainty quantification methods for different uncertainty sources in computational models.
- Learn about aggregating the uncertainty resulting from multiple sources in computational models.
Format: Live instruction
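The simplest way to propagate parameter uncertainty through a model is Monte Carlo sampling: draw the uncertain inputs from their assumed distributions, evaluate the model for each draw, and summarize the spread of the outputs. The toy model, distributions, and numbers below are illustrative assumptions, not from the course.

```python
import random
import statistics

def model(E, load):
    # Toy surrogate for a computational model: response
    # proportional to load and inversely proportional to
    # elastic modulus E (illustrative only).
    return load / E

random.seed(0)  # reproducible sampling
outputs = []
for _ in range(10_000):
    E = random.gauss(200e9, 10e9)   # uncertain modulus (Pa)
    load = random.gauss(1e4, 5e2)   # uncertain load (N)
    outputs.append(model(E, load))

mean = statistics.mean(outputs)     # near 5e-8
stdev = statistics.stdev(outputs)   # spread induced by input uncertainty
```

In practice each full model evaluation may be expensive, which is why the module's topics of surrogate and reduced-order models, and sensitivity analysis to rank which inputs dominate the output uncertainty, matter.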
Module 4: Quantifying Experimental Uncertainties for Validation Assessments
Instructor: Brandon M. Wilson
This module will cover the quantification of uncertainty in experimental data. Sources of experimental measurement uncertainty relate to:
- Physical differences among nominally identical members of a manufactured ensemble;
- Non-repeatability of measurements that arises from changes in temperature, pressure, humidity, bolt-preload, boundary conditions, initial conditions, and nonlinearities in system behavior;
- Sensor imperfections and miscalibrations that yield noisy outputs;
- Data limitations that lead to variability and uncertainty in system statistics;
- Failure to measure all factors that influence a behavior;
- Errors and poor parameter choices in data analysis.
The module will provide examples on how to quantify the contributions of different factors towards the overall uncertainty in experimental data.
- Learn about sources of uncertainty in experimental data.
- Learn about uncertainty quantification methods for different uncertainty sources and measures of system behavior.
- Learn about aggregating the uncertainty resulting from multiple sources in experiments.
Format: Live instruction
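When the individual uncertainty sources listed above can be treated as independent, a common simplified treatment (in the spirit of GUM-style uncertainty analysis) combines their standard uncertainties in root-sum-square fashion. This sketch is a generic illustration under that independence assumption, not the course's prescribed method.

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard
    uncertainty components (simplified GUM-style treatment)."""
    return math.sqrt(sum(u**2 for u in components))

# Hypothetical components: sensor noise, repeatability, calibration drift
u_c = combined_standard_uncertainty([0.3, 0.4, 0.0])  # gives 0.5
```

Correlated sources, systematic biases, and sparse data require more careful treatment, which is part of what this module addresses.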
Module 5: Designing Validation Experiments: Combining Modeler and Experimentalist Perspectives
Instructors: Jacob Hochhalter, Anthony D. Rollett, Zachary Harris
This module will discuss the preliminary calculations and validation-experiment design aspects of the Verification & Validation in Computational Modeling of Materials and Structures study. This experiment-design stage comes early in the VV&UQ process and requires close collaboration between modelers and experimentalists. The primary objective at this stage is to design an experiment in which the quantities of interest (QoIs) that will serve as the validation basis are defined. The discussion of the main concepts is complemented with examples from current in-practice cases to provide context. The module will conclude with a presentation on transitioning from validation to materials qualification.
- Understand the difference between verification, calibration and validation and their importance and place in the overall VV&UQ process.
- Understand how to scope the objectives of a validation task to address the desired application.
- Practice hands-on experiment design using open-source computational tools.
Format: Live instruction and hands-on activities. The hands-on activities will include running open-source computational tools (at no cost) from provided step-by-step instructions.
Module 6: Regulatory Agency Perspectives, Examples and Lessons Learned
Instructors: David Moorcroft, Kenneth Aycock, Joshua Kaizer
This module will present perspectives on the importance and utility of VVUQ, with examples from certain commercial sectors, from the point of view of three regulatory agencies: the Federal Aviation Administration (FAA), the U.S. Nuclear Regulatory Commission (NRC), and the U.S. Food and Drug Administration (FDA). The instructors will discuss their experiences reviewing modeling and simulation, regulations regarding modeling and simulation, and approaches commonly used to assess the credibility of simulations.
- Learn the importance of credible simulation data for use in regulatory decision making.
- Learn how VVUQ supports regulatory decision making when submissions include computational models.
- Participate in an open discussion on credibility, VVUQ, and regulatory perspectives.
Format: Live instruction
For More Information
For more information about this course, please contact:
TMS Meeting Services
5700 Corporate Drive Suite 750
Pittsburgh, PA 15237
U.S. and Canada Only: 1-800-759-4867
Other Countries: 1-724-776-9000, ext. 241