Distributed Workflows for Modeling Experimental Data
- ORNL
- University of Southern California, Information Sciences Institute
Modeling helps explain the fundamental physics hidden behind experimental data. In materials modeling, a single simulation rarely produces output that reproduces the experimental data: one or more force field parameters are usually not precisely known and must be optimized until the simulated output matches the experiment. Because the simulations require high performance computing (HPC) resources and there are typically many of them to run, a workflow is very useful for preventing errors and ensuring that the simulations are identical except for the parameters being varied. These workflows are usually distributed: the simulations run on HPC systems, while the optimization and the comparison of simulation results against experimental data are done on a local workstation. We will present results from force field refinement of data collected at the Spallation Neutron Source using Kepler, Pegasus, and Beam workflows and discuss what we have learned from using these workflows.
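The refine-compare loop described above can be sketched as follows. This is a minimal illustration with hypothetical names (`run_simulation`, `misfit`, `refine`) and a mocked simulation; in the actual workflows, each simulation would be submitted to an HPC system by Kepler, Pegasus, or Beam, and the comparison and optimization would run on the local workstation.

```python
def run_simulation(epsilon):
    # Stand-in for an HPC simulation run: returns a synthetic "spectrum"
    # whose shape depends on the force field parameter epsilon.
    return [epsilon * x * x for x in range(10)]

def misfit(simulated, experimental):
    # Sum-of-squares disagreement between simulation and experiment.
    return sum((s - e) ** 2 for s, e in zip(simulated, experimental))

def refine(experimental, candidates):
    # Simple grid search: run one simulation per candidate parameter
    # value and keep the one whose output best matches the experiment.
    return min(candidates,
               key=lambda eps: misfit(run_simulation(eps), experimental))

# "Experimental" data generated with a known parameter (0.7) so the
# loop has a recoverable optimum.
experimental = [0.7 * x * x for x in range(10)]
best = refine(experimental, [i / 10 for i in range(1, 11)])
print(best)  # → 0.7
```

A real refinement would replace the grid search with a proper optimizer and the mock with workflow-managed simulation jobs, but the control structure — vary parameters, simulate, compare, repeat — is the same.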
- Research Organization:
- Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
- Sponsoring Organization:
- USDOE
- DOE Contract Number:
- AC05-00OR22725
- OSTI ID:
- 1410941
- Resource Relation:
- Conference: 2017 IEEE High Performance Extreme Computing Conference (HPEC), Waltham, Massachusetts, United States of America, September 12, 2017
- Country of Publication:
- United States
- Language:
- English