DOI: 10.1145/3299815.3314444
Research Article

A Case Study of the Use of Design of Experiments Methods to Calibrate a Semi-automated Forces System

Published: 18 April 2019

Abstract

Semi-automated forces systems are computer software programs used to generate and model simulated entities, such as tanks and aircraft, in combat simulations. Calibration is an iterative process of executing a simulation model, comparing its results to data describing the modeled system, and adjusting the model to increase its accuracy. Semi-automated forces systems are often calibrated using retrodiction, a method that involves simulating a historical battle and comparing the simulation results to the historical battle's outcome. The Battle of 73 Easting took place during the Gulf War between U.S. and Iraqi forces; its outcome was unexpectedly one-sided. That outcome is not well replicated by most semi-automated forces systems, which typically produce unrealistically high U.S. losses, making calibration of a semi-automated forces system using retrodiction of that battle problematic. To overcome that difficulty, formal design of experiments methods were used to structure a calibration of a commercial semi-automated forces software system using retrodiction of the Battle of 73 Easting. Four factors were identified as likely to affect the outcome of the simulated battles, and two levels were set for each factor. A full factorial experiment design with two replicates per level combination specified 32 trials, i.e., 32 simulations of the Battle of 73 Easting. Analysis of those trials identified a specific set of factor levels that produced simulation results very consistent with the battle's historical outcome.
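The experiment design described above can be sketched in a few lines of Python: a 2^4 full factorial (four factors, two levels each) with two replicates per level combination yields the 32 trials the abstract reports. The factor names below are hypothetical placeholders, since the abstract does not name the four calibration factors.

```python
from itertools import product

# Placeholder factor names; the paper's actual calibration factors
# are not listed in the abstract.
factors = ["factor_A", "factor_B", "factor_C", "factor_D"]
levels = ("low", "high")
replicates = 2

# Every combination of factor levels, each repeated once per replicate.
design = [
    dict(zip(factors, combo))
    for combo in product(levels, repeat=len(factors))
    for _ in range(replicates)
]

print(len(design))  # 2^4 combinations x 2 replicates = 32 trials
```

Each of the 16 level combinations appears exactly twice; in practice the run order would also be randomized before executing the trials, which is standard design of experiments procedure.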

Cited By

  • (2023) A multi-objective optimization problem research for amphibious operational mission of shipboard helicopters. Chinese Journal of Aeronautics 36(9), 256-279. DOI: 10.1016/j.cja.2023.03.029. Online publication date: Sep 2023.

Published In

ACMSE '19: Proceedings of the 2019 ACM Southeast Conference
April 2019, 295 pages
ISBN: 9781450362511
DOI: 10.1145/3299815
Conference Chair: Dan Lo; Program Chair: Donghyun Kim; Publications Chair: Eric Gamess

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. Accreditation
  2. Design of Experiments
  3. Semi-Automated Forces
  4. Validation
  5. Verification

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ACM SE '19: 2019 ACM Southeast Conference
April 18-20, 2019
Kennesaw, GA, USA

Acceptance Rates

Overall acceptance rate: 502 of 1,023 submissions (49%)
