Abstract:
|
In December 2009, Dr. J. Michael Gilmore, the Director of Operational Test and Evaluation (DOT&E), promoted design of experiments (DOE) in his initiatives as a preferred methodology for developing rigorous tests of military systems. While it may seem obvious to statisticians that one would want to take a statistical approach to testing new systems, this was far from the case in the DoD in 2009. This presentation will discuss lessons learned from seven years of implementing experimental design policy in Defense Operational Test and Evaluation. The office of DOT&E is responsible for approving all operational and live fire testing within the Department of Defense. Everything ranging from the next strike fighter aircraft, to submarines, to computer systems, must go through an operational test before the program can go into full rate production. Prior to conducting those tests, DOT&E must approve the adequacy of each test. Following the test, DOT&E provides a report to Congress discussing the effectiveness and suitability of the system. Dr. Gilmore made it one of the primary initiatives of his office to insist on a statistical basis for determining test adequacy and evaluating the results.
|