34th International Test and Evaluation Symposium (2017)
T&E in a Time of Risk and Change
Oct 2 & 5 - Symposium Tutorials
Oct 3-5 - Symposium Plenary and Technical Sessions
Jointly Hosted by the ITEA George Washington,
Hampton Roads, Francis Scott Key, and Southern Maryland Chapters.
Hyatt Regency ~ 1800 Presidents Street ~ Reston, VA 20190
Please go HERE to book your room no later than September 18.
ITEA has a room block available to all attendees, and we are pleased to offer the FY18 prevailing government per diem rate of $250/night (exclusive of taxes).
NOTE: These tutorials require a separate fee from the Symposium.
Single 4-hour Tutorial - $205
Two 4-hour Tutorials - $385 (use discount code "TWO-Tutorials" at check out)
Three 4-hour Tutorials - $545 (use discount code "THREE-Tutorials" at check out)
When registering ONLINE for the Tutorials, be aware that there are tutorials on BOTH Monday and Thursday, and that they are listed on TWO (2) PAGES.
PLEASE MAKE SURE THAT YOU CLICK ON BOTH PAGE 1 AND PAGE 2 OF THE TUTORIALS.
REGISTER ONLINE NOW
MONDAY - October 2, 2017
- 8:00 a.m. – 12:00 p.m. Morning Tutorials
- 1:00 p.m. – 5:00 p.m. Afternoon Tutorials
- The Art of Planning Preview T&E: Australian Techniques for Early Test Strategies for Technical Maturation and Risk Reduction - Group Captain Keith F. Joiner, PhD, (Royal Australian Air Force, Ret'd), CSC, University of New South Wales, Australia
- Data Science and Its Relationship to Test & Evaluation - Mark J. Kiemele, PhD, Air Academy Associates
- Identifying Requirements and Vulnerabilities for Cybersecurity; or How We Learned to Stop Worrying and Love the Six-Phase Cybersecurity T&E Process - Michael Lilienthal, PhD, CTEP, Director of Cyber and Navy Programs, Electronic Warfare Associates, and Mr. Patrick Lardieri, Lockheed Martin Corporation
- Software Assurance - Bob Martin, Senior Secure Software & Technology Principal Engineer, MITRE
- Test and Evaluation Across the Acquisition Lifecycle - Michael Flynn, PhD, CTEP, Defense Acquisition University
- Using TENA and JMETC to Reduce Risk, Saving Time and Money – Gene Hudgins, KBRWyle
THURSDAY - October 5, 2017
- 1:00 p.m. – 5:00 p.m. Afternoon Tutorials
- Planning and Executing Cyber Table Tops, Facilitator Training - Sarah Standard, Cybersecurity/Interoperability Technical Director, Office of the Secretary of Defense, AT&L, DASD (DT&E).
- Real-World DOE and Modern Design and Analysis Methods - Thomas A. Donnelly, PhD, CAP, SAS Institute Inc.
- Test and Evaluation Science and Technology - George Rumford, Deputy Director, Major Initiatives and Technical Analyses, DoD TRMC, AT&L, and Program Manager for the T&E/S&T Program
Cybersecurity Test & Evaluation
Instructor: Pete Christensen, CTEP – Cyber Support to OSD Programs, The MITRE Corporation
Now more than ever, Program Managers (PMs) must ensure that cybersecurity is given careful consideration throughout the system lifecycle. Specifically, this includes identifying cybersecurity requirements early in the acquisition and systems engineering lifecycle. An early focus on cybersecurity gives PMs the opportunity to consider, up front, related cybersecurity testing activities that can be integrated into the engineering planning and design phases. Results of informal cybersecurity testing can then be applied to influence design and development efforts and to posture programs for success in Developmental Test (DT) and Operational Test (OT). The Deputy Assistant Secretary of Defense (DASD) for Developmental Test and Engineering (DT&E) has collaborated with key systems engineering stakeholders to develop disciplined processes that help PMs implement an incremental, iterative, phased approach to developing cyber-secure systems. The National Cyber Range (NCR), under the purview of the Test Resource Management Center (TRMC), is a resource PMs can leverage to support cybersecurity testing. This tutorial provides an overview of the cybersecurity test and evaluation phased approach and the NCR.
Data Science and Its Relationship to Test & Evaluation
Instructor: Mark Kiemele, Ph.D. – President, Air Academy Associates
In a data-driven economy, industry and government leaders rely increasingly on skilled professionals who can see the significance in data and use data analytic techniques to properly collect data, solve problems, create new opportunities, and shape change. Data science can be defined as the art and science of solving problems and shaping decisions through the precise collection and analysis of data. This tutorial is intended for executives, leaders, managers, and practitioners who need to know how their critical thinking can be impacted by such things as Big Data, Predictive Analytics, Design of Experiments (DOE), and other tools in the Data Science toolkit. This tutorial will cover the need for critical thinking as well as a high-level view of a variety of data analytic tools that can be used to enhance critical thinking. Even if one never designs a test or evaluates its results, participants will be able to explain the uniqueness of DOE and why big data and predictive analytics are needed to generate the analytical capability every organization needs.
How to Build a Reliability Growth Program
Instructors: Shawn Brady and Wayne Martin, AMSAA, Center for Reliability Growth
Reliable systems are more likely to be fielded sooner, more likely to be available when the Soldiers need them, and more likely to reduce maintenance costs over the system's life cycle. Unfortunately, many programs in the Department of Defense (DoD) fail to produce reliable systems. As a member of the defense acquisition community, are you armed with the knowledge and tools needed to help the DoD develop, test, and field more reliable systems? This tutorial will help you answer "yes".
For the past decade, the Army's Center for Reliability Growth at the Army Materiel Systems Analysis Activity (AMSAA) has been working to improve Army and DoD reliability by providing policy, guidance, standards, methods, tools, and training. This tutorial provides an overview of the latest methods and tools that DoD analysts should consider when managing or supporting a reliability program. Specific topics include adequately contracting and designing for reliability, identifying and mitigating reliability risks early using AMSAA's Reliability Scorecards, determining appropriate reliability test durations, building a realistic reliability growth plan using AMSAA's Planning Model Based on Projection Methodology (PM2), and projecting the anticipated improvement in reliability using the AMSAA Maturity Projection Model (AMPM).
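The growth-tracking idea behind tools like AMPM can be illustrated with the standard Crow-AMSAA (power-law NHPP) model, which underlies much DoD reliability growth practice. The sketch below is a generic illustration with made-up failure times, not an implementation of AMSAA's PM2 or AMPM:

```python
import math

def crow_amsaa_mle(failure_times, test_end):
    """MLE for the power-law NHPP (Crow-AMSAA) reliability growth model.

    Expected failures by time t: E[N(t)] = lam * t**beta.
    beta < 1 indicates reliability growth (decreasing failure intensity).
    Time-truncated test: failures at `failure_times`, test ends at `test_end`.
    """
    n = len(failure_times)
    beta = n / sum(math.log(test_end / t) for t in failure_times)
    lam = n / test_end ** beta
    return lam, beta

def demonstrated_mtbf(lam, beta, test_end):
    """Instantaneous (demonstrated) MTBF at the end of the test."""
    return 1.0 / (lam * beta * test_end ** (beta - 1))

# Hypothetical failure times (hours) from a 500-hour growth test
times = [12.0, 40.0, 95.0, 180.0, 320.0, 410.0]
lam, beta = crow_amsaa_mle(times, 500.0)
print(f"beta = {beta:.2f}")   # beta < 1, so reliability is growing
print(f"demonstrated MTBF = {demonstrated_mtbf(lam, beta, 500.0):.0f} h")
```

A planning model such as PM2 works the other direction: it lays out the MTBF trajectory a program must follow, against which estimates like the one above are tracked.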
Identifying Requirements and Vulnerabilities for Cybersecurity; or How We Learned to Stop Worrying and Love the Six-Phase Cybersecurity T&E Process
Instructors: Mike Lilienthal, Electronic Warfare Associates and Patrick “Preacher” Lardieri, Lockheed Martin
Many Service acquisition, Systems Engineering (SE), and Test and Evaluation (T&E) teams are starting to move their programs from a "checklist" information-assurance or compliance approach to cybersecurity toward a proactive, iterative risk management process, with the goal of ensuring personnel can still carry out their duties in a cyber-contested environment. Many are struggling to formulate a practical and effective approach to developing requirements and a plan to incorporate cybersecurity into their SE and T&E activities using the recent spate of cybersecurity policies and guidelines released by the Office of the Secretary of Defense (OSD). This tutorial will step through the Navy's Cyber Table Top (CTT) wargaming process and the National Cyber Range's (NCR) cybersecurity evaluation testing process as an approach to gaining an actionable understanding of the cyber threat. The tutorial will also show how the CTT and the NCR support execution of DOT&E's six-phase cybersecurity T&E process.
The CTT (which has been adopted by the Navy) is a rigorous, intellectually intensive and interactive data collection and analysis process that introduces and explores the potential effects of cyber offensive operations on the capability of a system to carry out its designed functions. It produces a prioritized list of actionable recommendations to support more informed decisions and tradeoffs in a fiscally constrained environment. Personnel using the process are better able to identify threat vectors, understand the vulnerabilities and mission risks of their system under development, and understand cyber threat consequences categorized by their impact and their likelihood of successful attacks. This helps scope the cyber security testing done at the NCR and other places. The tutorial will also show how the use of the cyber wargaming process in conjunction with the NCR will inform systems engineers on tradeoffs and potential workarounds to prevent or minimize cyber effects. The tutorial is based on the lessons learned from using the process and the NCR to support NAVAIR and SPAWAR acquisition programs. It is intended for use by Acquisition Program Management Offices, Systems Engineers, Chief Developmental Testers, and Lead Developmental Test and Evaluation (DT&E) Organizations. In short, this tutorial will introduce how the cyber wargame and the NCR iteratively support the development of systems that will be more resilient and survivable in hostile cyber threat environments.
Planning and Executing Cyber Table Tops, Facilitator Training
Instructor: Sarah Standard, Cybersecurity/Interoperability Technical Director, OSD AT&L, DASD DT&E
The primary objective of the Cyber Table Top (CTT) Facilitator Training Workshop is to build the knowledge, skills, and abilities that will allow trainees to successfully construct, coordinate, organize, and execute a CTT exercise. The primary audience for this training is personnel who will facilitate and moderate CTTs for their program or command. The training will include tips, tools, and resources for CTT facilitators as well as a practical example of the process and outputs.
Processes for Testing with International Partners
Instructors: Gloria Deane and Mitchell Dossett, DOT&E International Programs
Defense budgets are shrinking; requirements for complex systems and systems–of–systems are increasing; and interoperability with allies is becoming the norm by necessity. These are challenges all nations are facing. Duplicative testing is inefficient for all nations, so sharing of “test resources” is highly desirable. "Test resources" includes test facilities, open air ranges and operating areas, laboratories, equipment, expertise, methods, data, and funds. Upon making the decision to test, participants must complete certain administrative actions to implement a test program. To test with an international partner an international agreement must be in force. To test under such an agreement, the partnering nations must negotiate and approve a project arrangement. The laws of sovereign nations govern such activity and DOD has developed administrative processes to ensure statutory compliance. The Office of the Director, Operational Test and Evaluation (DOT&E) will offer a tutorial to inform members of the test community of the capabilities and limitations of the international Test and Evaluation Program and how to develop project arrangements with an individual and with multiple partnering nations. Speakers will be representatives from the Office of the Director, International Cooperation in the Office of the Undersecretary of Defense for Acquisition, Technology, and Logistics, the International Test and Evaluation team within DOT&E, and international partners with whom the DOD test community has worked for many years.
Real-World DOE and Modern Design and Analysis Methods
Instructor: Thomas A. Donnelly, PhD, CAP, SAS Institute, Inc.
Part 1: Custom DOE – Making Your Design Fit the Problem
This tutorial will present solutions to real-world Design of Experiments (DOE) problems. You will learn how to treat, in combination, factors of the following types: continuous/quantitative, categorical/qualitative, discrete numeric, mixture, covariate, blocking, and hard-to-change. It will demonstrate how to constrain design regions. Algorithmic custom DOE is the most efficient way to develop accurate and useful models of complex real-world processes.
Part 2: Using Definitive Screening Designs to Get More Information from Fewer Trials
Learn to use the new Definitive Screening Design (DSD) method of Design of Experiments. DSDs not only efficiently identify important factors but can often support second-order predictive models. For the same number of factors, three-level DSDs are often smaller than popular two-level fractional-factorial (FF) designs, yet yield more information, especially about curvature for each factor. A case study will be shown in which a 10-factor process is optimized in just 24 trials. In cases where too many factors are significant and the design cannot collapse into a one-shot design, existing trials can be economically augmented to support a response-surface model in the important factors.
Part 3: Strategies for Analyzing Modern Screening Design of Experiments
The new Definitive Screening Designs (DSDs) provide clean estimates of all main effects and squared effects for the design factors. This leads to saturated or nearly saturated models and the potential to falsely identify low-power squared terms as important. Effective strategies for analyzing these designs are reviewed to build a consensus model from the data, and a newly developed (2015) method for robustly determining the most likely model will be featured. In this tutorial, we examine several strategies for analyzing DOE data sets; Actual-vs-Predicted plots with checkpoints can be used to help choose models.
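The two-stage screening strategy described above (screen main effects first, then add squared terms only for the active factors) can be sketched in a few lines. For simplicity the design below is a generic three-level full factorial rather than a true DSD, and the response is synthetic; the model-selection logic is what the sketch illustrates:

```python
import itertools
import numpy as np

# Generic three-level factorial in 3 factors (coded -1, 0, +1).
# NOTE: this is NOT a Definitive Screening Design -- just a stand-in
# design that, like a DSD, supports main effects and squared terms.
X = np.array(list(itertools.product([-1, 0, 1], repeat=3)), dtype=float)

# Hypothetical response: strong A and C main effects plus curvature in C,
# with a little noise. In practice y comes from the actual experiment.
rng = np.random.default_rng(7)
y = 5.0 + 3.0 * X[:, 0] + 2.0 * X[:, 2] + 4.0 * X[:, 2] ** 2 \
    + rng.normal(0, 0.3, len(X))

def fit(columns):
    """Least-squares fit of y on an intercept plus the given term columns."""
    M = np.column_stack([np.ones(len(X))] + columns)
    coef, *_ = np.linalg.lstsq(M, y, rcond=None)
    return coef

# Stage 1: main-effects-only model to screen for active factors.
main = fit([X[:, j] for j in range(3)])
active = [j for j in range(3) if abs(main[1 + j]) > 1.0]  # crude cutoff

# Stage 2: add squared terms only for the active factors,
# avoiding a saturated model that over-fits weak curvature.
full = fit([X[:, j] for j in range(3)] + [X[:, j] ** 2 for j in active])
print("active factors:", active)
print("coefficients:", np.round(full, 2))
```

The threshold used here is a deliberately crude stand-in; the tutorial's methods (and the 2015 approach it features) replace it with principled model selection.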
Software Assurance
Instructor: Bob Martin – Senior Secure Software & Technology Principal Engineer, MITRE
This introductory tutorial provides an overview of the Cyber domain while providing the DoD and industry T&E practitioner the necessary information, perspectives, understanding, and tools to work effectively in this space. This tutorial provides an introduction to the domain of Cyber, including its concepts, relevant systems (defensive and offensive), and testing considerations. Discussions will cover key differences in testing cyber systems versus conventional military systems and test methodologies. Also, we will identify current and future implications for the T&E community. Additional topics include a general overview of cyber warfare, vulnerability analysis, malware, and associated threat vectors pertaining to system testing.
T&E 1–2–3, The Fundamentals
Instructor: Matt Reynolds – Test and Evaluation Consulting
This tutorial is designed to describe the evolution of T&E over the last several decades, as well as to explain the timeless concepts and precepts that apply to all testing. The literature on T&E is replete with policies and practices that have served the needs of specific generations of systems, technologies, and acquisition strategies, but little has been published that describes the universal principles underlying those policies. An understanding of these principles and smart implementation of them are critical to the success of complex T&E programs. The primacy of thorough planning, contingency strategies, statistics-based test design, enterprise-level thinking, and a thorough understanding of customer requirements (both stated and unstated) will be addressed, reinforced by lessons learned from past programs.
Test and Evaluation Across the Acquisition Lifecycle
Instructor: Michael Flynn, PhD, CTEP - Defense Acquisition University
This tutorial focuses on the latest DoDI 5000.02 guidance for the defense acquisition process from a Test and Evaluation perspective, with emphasis on T&E's involvement in the systems acquisition lifecycle and its relationship to the systems engineering processes used throughout the lifecycle of major acquisition programs, from requirements generation through Post-Milestone C. Coverage will include the relationship between the Test and Evaluation Master Plan (TEMP) and the Systems Engineering Plan (SEP) as they proceed through each of the major milestone phases. Focus will be on the major events that occur during each phase of acquisition, required documentation, and expected entrance and exit criteria for successfully achieving approval. The intended audience is engineers, program managers, and industry personnel seeking an understanding of DoD acquisition in relation to T&E's involvement.
Test and Evaluation Science and Technology
Instructor: Mr. George Rumford, Deputy Director, Major Initiatives and Technical Analyses, DoD TRMC, AT&L, and Program Manager for the T&E/S&T Program
The T&E/S&T Program develops test technologies that will enable future test capabilities to characterize and optimize the performance of emerging warfighting systems being developed to advance the third offset strategy. Technology areas of focus include autonomy, electronic warfare, cyber warfare, future computing, micro-electronics, hypersonics, and directed energy, among others. This tutorial presents the key attributes of a successful test technology development project. Attending this session will help those unfamiliar with the T&E/S&T Program develop test technology solutions that satisfy T&E needs. The course will also discuss how to structure a test technology project to assess technology maturation from concept exploration, through engineering, integration, and experimentation, and ultimately to technology transition.
The Art of Planning Preview T&E: Australian Techniques for Early Test Strategies for Technical Maturation and Risk Reduction
Instructor: Group Captain Keith F. Joiner, PhD, CSC (Royal Australian Air Force, Ret'd), University of New South Wales, Australia
This four-hour tutorial will benefit anyone involved in planning or conducting early T&E to de-risk and shape more successful projects. Participants are likely to have been part of such planning processes before, but this workshop is an opportunity to examine a fresh, systematic approach and see where their previous processes and personal master test planning skills might be made more robust. Western governments continue to find that an unacceptable proportion of projects fail to deliver the capability sought, and that inadequate early T&E or trialing is a significant factor in risks not being identified early enough to be mitigated. An Australian Senate inquiry into Defence procurement (2012, especially Ch. 2 & 12) found this to be some ten percent of projects by value. A more recent report on broader Australian government public project failings (Shergold Report, 2015) found a systemic inability to identify and plan early trialing as part of scoping projects. New Defence T&E policy was implemented in Australia from 2013-14 to systematically plan and conduct de-risk or preview T&E (see Dr Joiner's article, ITEA Journal, Dec 2015). Focused workshops ensure preview T&E is driven by significant technical and operational risk into a program of key confirmatory demonstrations, configuration audits, and user trials. Within the U.S. DoD, such early T&E would typically occur during the Technology Maturation and Risk Reduction (TMRR) lifecycle phase and thus would be planned and funded in the Analysis of Alternatives (AoA) phase at Milestone A. The Australian planning technique has since been confirmed in Defence T&E policy updates (2016) and is taught at the leading Defence university in Australia, University of New South Wales Canberra, as part of all postgraduate master programs in systems engineering and project management.
Workshop participants will be given an overview of the workshop process and will use a hypothetical capability requirement to role-play the workshops, determining indicative outcomes for each phase of the hypothetical project. Two Australian examples will then be covered, one where such planning was used to advantage and another where it was comparatively ignored, to contrast the benefits of de-risking projects through such early T&E. At the end of the workshop, students will have a chance to reflect back to the group on opportunities where they might previously have applied such processes.
Using TENA and JMETC to Reduce Risk, Saving Time and Money
Instructor: Gene Hudgins – TENA and JMETC User Support Lead, KBRWyle
Together, TENA and JMETC enable interoperability among ranges, facilities, and simulations in a timely and cost-efficient manner. TENA provides for real-time system interoperability and interfaces existing range assets, C4ISR systems, and simulations, fostering reuse of range assets and future software systems. JMETC is a distributed LVC capability that uses a hybrid network architecture: the JMETC Secret Network (JSN), based on the SDREN, is used for secret testing, and the JMETC Multiple Independent Levels of Security (MILS) Network (JMN) is the T&E enterprise network solution for all classifications and cyber testing. JMETC provides readily available connectivity to the Services' distributed test and training capabilities and simulations, as well as industry resources. This tutorial addresses using the well-established TENA and JMETC tools and capabilities to reduce risk in an often uncertain environment, regularly saving ranges time and money in the process.