NATO Code of Best Practice (COBP) for C2 Assessment


NATO Code of Best Practice (COBP) for C2 Assessment
Graham L. Mathieson, UK


Notes for Slide 2
The revised Code of Best Practice introduces a new section to deal explicitly with Risk and Uncertainty issues. Risk and uncertainty are increasingly important for all assessment studies because of the nature of the Information Age security environment and the fact that we are in a period of transition, giving rise to an increasing breadth of the mission spectrum and to uncertainty about Information Age concepts and technologies, and their impacts. The most radical changes are to be seen in C2 concepts and capabilities. This makes Risk and Uncertainty a particularly critical feature of study design and implementation for C2 assessment.

Risk and Uncertainty
"We may at once admit that any inference from the particular to the general must be attended with some degree of uncertainty, but this is not the same as to admit that such inference cannot be absolutely rigorous, for the nature and degree of the uncertainty may itself be capable of rigorous expression."
- R.A. Fisher, The Design of Experiments, 1942

Notes for Slide 4
This classic quote from R.A. Fisher expresses a central theme of the guidance given in the revised Code.

Risks in C2 Assessment
- Risk is the possibility of suffering harm or loss.
- Risks inherent in the decision supported by assessment, e.g. the risk of deciding on a harmful course of action.
- Risks to the safe delivery of the assessment, e.g. the risk of delivering misleading advice.
- Adopting the NATO COBP will help to minimise risks.

Notes for Slide 6
There are a number of technical and colloquial definitions of risk, but in general it can be described as the possibility of suffering harm or loss. For assessment studies, two sorts of risk are important: risks inherent in the decision being supported by the assessment (the risk of choosing a less favourable course of action, or even a damaging one) and risks whose impact would mean that the assessment delivers unsafe advice, resulting in a risk of the first type. If assessments do not control this second type of risk, decision-makers may well control the first kind by ignoring the assessment altogether. Adopting the NATO COBP will help to minimise the risk of delivering misleading advice and, hence, will reduce decision-maker risks.

Risk in Decision Making
Before the decision/action:
- Possibility of loss (or gain!)
- Uncertainty about outcome
After the decision/action:
- Actuality of loss (or gain!)
- Certainty about course of action

Notes for Slide 8
It is important to understand that risks exist before a decision event (or, more strictly, the action resulting from it). They describe the future possibility of loss (or gain!) and are based upon uncertainty about the outcome or consequence of a decision event. After the event, risks turn into actual loss or gain and there is certainty about the course of action. So, whenever risk is considered, uncertainty is an inherent part of it.

Uncertainty in C2 Assessment
We can be uncertain about:
- Which of a set of known outcomes will arise (= known risk)
- Probabilities of known outcomes
- What outcomes are possible
- Value of outcomes (risk impacts)
- Current state
- Perceptions of other actors
- ...

Notes for Slide 10
There are many dimensions of uncertainty in C2 assessment. Even where all possible outcomes of action are known in advance, there is still uncertainty over which outcome will actually arise. This is termed a known risk. In complex problems, such as those presented by C2 assessments, there is often uncertainty over what outcomes are possible and over the absolute or relative value of those outcomes, i.e. risk impacts. Part of this arises from uncertainty about the current state of the system of interest to the study, and it must also be remembered that different problem stakeholders may have different perceptions of the problem and the risks inherent in it. Other areas of uncertainty include...
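To make the notion of a known risk concrete, the following minimal sketch represents it as a discrete outcome distribution. The outcome names, probabilities and impact values are purely hypothetical illustrations; the COBP prescribes no particular representation.

```python
# A "known risk": all outcomes and their probabilities are known in advance,
# yet which outcome will actually arise remains uncertain.
# Outcome names, probabilities and impacts here are hypothetical.
known_risk = {
    # outcome: (probability, impact; negative = loss, positive = gain)
    "mission success": (0.60, +10.0),
    "partial success": (0.30, +2.0),
    "mission failure": (0.10, -25.0),
}

total_p = sum(p for p, _ in known_risk.values())
assert abs(total_p - 1.0) < 1e-9, "probabilities of known outcomes must sum to 1"

expected_impact = sum(p * v for p, v in known_risk.values())
worst_case = min(v for _, v in known_risk.values())
print(f"expected impact {expected_impact:+.1f}, worst case {worst_case:+.1f}")
# expected impact +4.1, worst case -25.0
```

Uncertainty over the probabilities or over the outcome set itself (the next bullets on the slide) is exactly what this simple structure cannot capture.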

Uncertainty in C2 Assessment - 2
Other areas of uncertainty include:
- Parameter value uncertainty;
- Model-based uncertainty;
- Uncertainty of focus (including uncertainty of scenario);
- Complexity of uncertain factors (i.e. their dimensionality).

Notes for Slide 12
First, there is uncertainty over the values of parameters and factors of the problem: C2 problems typically contain difficult-to-quantify concepts for which analysis tries to define practical approximations. Second, there can be uncertainty over the accuracy or validity of the representations included in models used to formulate and solve the assessment problem; this type of uncertainty is often hidden and needs special care to deal with effectively. Third, C2 studies typically have a rich context, and there can be uncertainty over whether the assessment has accounted for all the important factors and issues (including an appropriately broad selection of scenarios). Finally, the factors involved in C2 assessments are often complex and multi-dimensional. This can make it impractical to cover all possible outcomes within the scope of the assessment, leading to uncertainties over the correctness of inferences drawn from the assessment conclusions.

Uncertainty in C2 Assessment - 3
OOTW studies have less well-formed quantitative factors and more qualitative factors, including:
- social and political activity impacting the tactical level;
- negotiation and persuasion as opposed to coercion;
- non-optimal performance of military capabilities, from a technical perspective, due to their poor fit to the problem;
- severe Rules of Engagement constraints;
- unclear or evolving goals and objectives.
The nature of these factors makes assessment more difficult.

Notes for Slide 14
OOTW studies typically have less well-formed factors, which leads to a higher incidence of problems being formulated on the basis of qualitative factors. Areas where quantitative assessment can prove difficult are listed here.

Variables relevant to decision-making
(Diagram: an IDEF view of Decision Making, with the following variables.)
- INPUTS: Sensor Information (with variability)
- OUTPUTS: Action Instructions (with variability)
- CONSTRAINTS: Personal (culture, style, etc.); Institutional (culture, doctrine, etc.)
- RESOURCES: Personal (memory, experience/knowledge, skill/expertise, etc.); Institutional (memory, information technology, administrative support, etc.)

Notes for Slide 16
This slide illustrates the wide range of variables that might need to be considered in an assessment problem involving the assessment or representation of decision-making. These are shown using a standard IDEF formulation comprising inputs, outputs, constraints and resources. Constraint and resource variables can be categorised into personal and institutional. It may be noted that many of these variables are not practically controllable within an assessment or experimental context and therefore become a source of uncertainty.

Uncertainty in C2 Assessment - 4
- It is impossible to know everything about a problem.
- Adequately complete knowledge can be better assured by explicit use of checklists to highlight the breadth of factors involved in C2 assessments.
- The revised COBP provides a variety of useful lists, but cautions that they are no substitute for critical thinking.

Notes for Slide 18
Philosophically, it is impossible to know everything about a problem or to have perfectly precise and unambiguous knowledge of all factors. Nevertheless, the Code recommends the use of checklists to help ensure that adequate coverage is achieved, and it offers a variety of checklists that have proved useful to the contributing nations. The Code cautions, however, that checklists are no substitute for critical thinking about the problem and should only be used as complementary aids.

Dealing with risk
- Reduce uncertainty
- Mitigate impacts
- Communicate risks

Notes for Slide 20
As mentioned previously, uncertainty is inherent in risk, and dealing with uncertainty is a key part of dealing with risk. In essence, there are three ways to counter risks. First, one can reduce the uncertainty underlying the risk, particularly uncertainty over the likelihood of a risk arising. Second, one can mitigate the impacts of risks, thus rendering them less damaging. Finally, when all is said and done, some residual risks will remain, and it is vital to communicate these clearly and sensitively to the decision-maker. Taking each topic in turn...

Reducing uncertainty and risk
- Risk and uncertainty can never be eliminated.
- Assessments can be judged by how well they reduce uncertainty and decision-maker risk.
- Teams need to learn about the robustness (or lack thereof) of the study conclusions.
- Sensitivity analysis is a key tool for this.

Notes for Slide 22
Uncertainty, and hence risk, can never be completely eliminated in any real study. It is unhelpful and unnecessary to seek to produce totally certain conclusions, because doing so may lead to false confidence and actually increase decision-maker risk. Instead, assessments should explicitly accept that their conclusions will be uncertain and should judge themselves on whether the issues are less uncertain after the assessment than before, in which case the decision-maker's risk has been reduced. Having accepted the uncertainty in their outputs, study teams need to learn about how robust their advice is in the face of those uncertainties. Sensitivity analysis is a key tool for this.
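As a concrete illustration of sensitivity analysis used as a robustness probe, the sketch below varies one factor at a time around a baseline and reports the swing in the result. The model, parameter names and ranges are hypothetical stand-ins, not anything prescribed by the Code; and, as a later slide cautions, single-factor excursions can be unsafe where factors interact strongly.

```python
# One-factor-at-a-time sensitivity sketch: perturb each factor across its
# plausible range while holding the others at baseline, and report how far
# the model output swings from the baseline result.
def assessment_model(params: dict) -> float:
    """Hypothetical measure of C2 effectiveness (illustrative only)."""
    return (params["sensor_quality"] * params["network_reliability"]
            - 0.5 * params["decision_delay"])

baseline = {"sensor_quality": 0.8, "network_reliability": 0.9, "decision_delay": 0.3}
ranges = {
    "sensor_quality":      (0.6, 1.0),
    "network_reliability": (0.7, 1.0),
    "decision_delay":      (0.1, 0.6),
}

base_score = assessment_model(baseline)
for name, (lo, hi) in ranges.items():
    swings = []
    for value in (lo, hi):
        perturbed = dict(baseline, **{name: value})  # vary one factor only
        swings.append(assessment_model(perturbed) - base_score)
    print(f"{name:20s} swing: {min(swings):+.3f} .. {max(swings):+.3f}")
```

A conclusion whose sign or ranking survives all such swings is robust to those parameter uncertainties; one that flips within a plausible range is fragile and should be reported as such.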

Reducing uncertainty and risk - 2
- Treat uncertainty consistently and explicitly.
- This allows information from two sources to be fused.
- Otherwise it is more difficult for a study to add value for a decision-maker.

Notes for Slide 24
A necessary condition for reducing uncertainty is that the assessment explicitly and consistently expresses the uncertainties at all stages. This provides the necessary raw material for managing the uncertainty and hence reducing risk. For example, an explicit treatment of uncertainty allows a rational basis for fusing knowledge from multiple sources and getting maximum leverage from it. Without an explicit treatment of uncertainty, the analyst must end up selecting between different sources rather than merging them, and this makes it more difficult to add value.
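One standard way to fuse two sources once their uncertainties are stated explicitly is inverse-variance weighting; the technique and all figures below are illustrative assumptions, not drawn from the Code. The point is that stating each source's uncertainty lets them be merged, with a combined uncertainty smaller than either input.

```python
# Inverse-variance weighted fusion of two estimates of the same quantity,
# each reported with an explicit standard deviation.
def fuse(est_a: float, sd_a: float, est_b: float, sd_b: float):
    w_a, w_b = 1 / sd_a**2, 1 / sd_b**2          # weight = inverse variance
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_sd = (w_a + w_b) ** -0.5               # always < min(sd_a, sd_b)
    return fused, fused_sd

# Hypothetical example: a wargame and a field trial estimating the same
# effectiveness measure, each with a stated uncertainty.
fused, sd = fuse(est_a=0.62, sd_a=0.10, est_b=0.70, sd_b=0.05)
print(f"fused estimate {fused:.3f} +/- {sd:.3f}")  # 0.684 +/- 0.045
```

Had either source reported only a point value with no uncertainty, no principled merge would be possible and the analyst would be forced to pick one source over the other, exactly the difficulty the notes describe.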

Mitigating risk impacts
- It is difficult to keep C2 assessment rigorous and robust in the face of uncertainty and complexity.
- The need to use a rich combination of methods adds to the difficulty.
- Checklists are useful to improve the rigour of assessment.
- Multi-factorial experimental design methods are recommended.

Notes for Slide 26
Mitigating risk impacts involves strategies to limit the knock-on consequences of individual risks. It is difficult to keep C2 assessment rigorous and robust in the face of the many uncertainties and complexities inherent in the subject, and the need to use multiple methods in concert to solve many C2 assessment problems only exacerbates the difficulty. Again, the use of checklists and risk management tools can improve the rigour, and hence the reliability, of assessment. One of the key risks for C2 assessment arises from the fact that C2 problems, particularly in OOTW contexts, typically have many interacting factors, many of which are poorly understood. This makes it unsafe to rely upon simple, single-factor sensitivity analysis as the basis for testing robustness; the Code recommends multi-factorial experimental design methods in these circumstances. Another key mitigation against risks is good problem formulation.
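The sketch below shows a multi-factorial design in its simplest form, a full factorial, where every combination of factor levels is run so that interactions between factors are exposed rather than hidden. The factor names and levels are hypothetical, and the Code does not mandate any particular design.

```python
# Full factorial design: enumerate every combination of factor levels.
# Single-factor excursions from a baseline would visit only a handful of
# these points and could miss interaction effects entirely.
import itertools

factors = {
    "c2_structure":  ["hierarchical", "networked"],
    "tempo":         ["low", "high"],
    "comms_quality": ["degraded", "nominal"],
}

design = [dict(zip(factors, combo))
          for combo in itertools.product(*factors.values())]

print(f"{len(design)} runs")   # 2 x 2 x 2 = 8 runs
for run in design:
    print(run)
```

For many factors a full factorial explodes combinatorially; fractional factorial or space-filling designs are then the usual way to keep the run count practical while still sampling interactions.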

Mitigating risk impacts - 2
- In C2 assessments, analysts need to be particularly alert to the possibility of chaotic behaviours arising from dynamic interactions.
- Human and organizational factors are particularly prone to this type of instability.
- A sound and explicit treatment of boundaries and system definitions during problem formulation is essential to managing this aspect of the assessment.
- Holistic systems thinking and complexity-based analysis may be needed for this purpose.

Notes for Slide 28
The complex nature of many C2 problems means that analysts need to be particularly alert to the possibility that complex systems behaviour, including chaotic behaviour, may be present. This is particularly true where human and organisational factors play a large part in the problem being studied. As mentioned yesterday, a sound and explicit treatment of boundaries and system definitions during problem formulation is a key element in managing the impact of such complexities. Holistic systems thinking, and analysis exploiting the emerging understanding of complexity, may be needed in this area.

Risk-based analysis
- Solving problems using single expected values leads to fragile solutions, which don't allow decision-makers to deal with inherent uncertainty and risk.
- A risk-based approach can overcome some major pitfalls:
  - focus on the multiplicity of possible outcomes;
  - opening up the possibility of richer solutions;
  - portfolios of action;
  - robustness vs. narrow optimality.

Notes for Slide 30
It is common for assessments to formulate their solutions in terms of single, expected values for problem parameters. This typically leads to fragile results which do not allow decision-makers to understand or deal with the inherent uncertainties of the problem. A risk-based approach is recommended to overcome some of the major pitfalls of expected-value solutions. Risk-based analysis puts a focus on the multiplicity of possible outcomes and opens up the possibility of richer solutions involving portfolios of actions and a robustness of approach rather than narrow optimisation.
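The toy comparison below illustrates the fragility of expected-value solutions: the option that is best on average can be disastrous in an unfavourable outcome, while a slightly "suboptimal" option is robust across all outcomes. The options, scenarios, probabilities and payoffs are all invented for illustration.

```python
# Expected-value optimality vs. robustness, on hypothetical payoffs.
payoffs = {  # option -> payoff in each of three possible scenarios
    "option A": [12.0, 9.0, -20.0],  # best on average, disastrous in scenario 3
    "option B": [5.0, 4.0, 3.0],     # slightly worse on average, never bad
}
probs = [0.5, 0.3, 0.2]  # assumed scenario probabilities

for name, vals in payoffs.items():
    expected = sum(p * v for p, v in zip(probs, vals))
    worst = min(vals)
    print(f"{name}: expected {expected:+.1f}, worst case {worst:+.1f}")
# option A: expected +4.7, worst case -20.0
# option B: expected +4.3, worst case +3.0
```

A pure expected-value analysis would report only "+4.7 vs +4.3" and recommend option A; laying out the full multiplicity of outcomes lets the decision-maker weigh that marginal average gain against a severe downside, which is the essence of the risk-based view.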

Risk-based analysis - 2
- Different people have different worldviews and different approaches to risk taking.
- Risk-based analysis needs metrics for risks and failure as well as success and benefits.
- Portfolio-based solutions can be associated with cost-benefit approaches, but this has not been common in practice.

Notes for Slide 32
In adopting a risk-based analysis approach it is important to recognise that people differ, both in their worldviews and in their approach to risk taking. Also, risk-based analysis requires the development of metrics for risk and failure, as well as the more conventional measures of success. Portfolio-based solutions can be linked to conventional cost-benefit analyses, but this is not common in practice in the NATO nations.

Managing study risk
- C2 assessments involve inherently complex, often poorly understood study problems.
- C2 problems are weakly bounded.
- Particular risk is associated with problem formulation.
- These factors raise the level of risk in the design and conduct of the assessment.
- It is therefore advisable not to skip risk analysis, even when time and resources are limited.

Notes for Slide 34
The inherent complexity of C2 assessments, combined with the fact that C2 problems are often poorly understood and weakly bounded, makes such problems difficult to formulate. Together, these factors raise the level of risk associated with designing and managing C2 assessments. The Code therefore stresses that risk analysis of the assessment itself is too important to skip, even where time and resources are limited. A Generic Risk Register for C2 Assessment has been developed to aid in this task...

The Generic Risk Register
- A companion tool to the COBP, expressing best practice guidance as mitigation for study risks.
- Available for the existing Code and in development for the revised one.
- Illustrative example of use, from a case study undertaken by the SAS-026 study group:
  - A lack of planned iterations caused a risk of an inefficient and unfocused study with possibly misleading results.
  - The relatively narrow selection of methodological approaches entailed a risk of misleading conclusions.
  - The study failed to reflect important consequences of varying the C2 system; the possibly biased representation would represent a hidden flaw in the conclusions.

Notes for Slide 36
The Generic Risk Register is a companion tool to the COBP, expressing best practice guidance as the mitigation for study risks. A version of the risk register based on the first edition of the COBP is currently available, and a revised version is in development. The slide shows an illustrative example of use from a case study undertaken by the SAS-026 study group. A brief journey of only one hour through the generic risk register proved very useful, identifying the following risks: the low number of planned iterations in the case study design had the potential to lead to an inefficient and unfocused study with possibly misleading results; and the relatively narrow selection of methodological approaches entailed a risk of misleading conclusions. There could be important consequences of varying the C2 system that were not reflected in the study, and the possibly biased representation would result in a hidden flaw in the conclusions. It is worth noting that the case study from which these design flaws were identified was designed by people with an intimate knowledge of the COBP who were explicitly trying to apply it. This demonstrates the critical importance of review and checking, even for expert assessment teams.
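As a sketch of the idea behind a study risk register (not the GRR's actual format), the structure below pairs each study risk with a likelihood, an impact and a best-practice mitigation. The field names, the 1-5 scales and the scoring are illustrative assumptions; the two entries paraphrase the SAS-026 findings above.

```python
# Minimal study-risk register entry in the spirit of the Generic Risk Register.
from dataclasses import dataclass

@dataclass
class StudyRisk:
    description: str
    likelihood: int   # 1 (rare) .. 5 (almost certain) -- illustrative scale
    impact: int       # 1 (minor) .. 5 (severe)        -- illustrative scale
    mitigation: str   # the best-practice guidance addressing this risk

    @property
    def exposure(self) -> int:
        return self.likelihood * self.impact

register = [
    StudyRisk("Too few planned iterations -> inefficient, unfocused study",
              likelihood=4, impact=4,
              mitigation="Plan explicit iteration points in the study design"),
    StudyRisk("Narrow selection of methods -> misleading conclusions",
              likelihood=3, impact=5,
              mitigation="Use a rich combination of complementary methods"),
]

# Review risks in order of exposure, worst first.
for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
    print(f"[exposure {risk.exposure:2d}] {risk.description} | {risk.mitigation}")
```

Even a structure this simple forces the team to state each risk, judge it, and tie it to a mitigation, which is the review-and-checking discipline the notes argue for.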

Communicating risk & uncertainty
- The high level of uncertainty (and risk) in C2 problems makes communication of risk and uncertainty to study customers, sponsors and stakeholders particularly important.
- Many areas of doubt and uncertainty cannot be resolved by analysis.
- Open, honest communication to decision-makers is needed to avoid misinterpretation of conclusions.

Notes for Slide 38
The high level of uncertainty (and hence risk) in C2 problems and their assessment means that the communication of risk and uncertainty to study customers, sponsors and stakeholders is of particular importance. The value of a high-quality assessment is that it provides decision-makers with the evidence they need to make better decisions. The nature and quality of evidence required depend upon the decision-maker's approach to and tolerance for risk-taking and his level of prior knowledge of the problem area being assessed. C2 assessments often present many areas of uncertainty which cannot be resolved by analysis and must be presented to decision-makers. An open and honest communication of these residual uncertainties is critical to avoid misinterpretation of conclusions, including overconfidence in the advice given.

Communicating risk & uncertainty - 2
- Human ability to understand and reason about uncertainty is limited.
- Different ways of framing results and uncertainties may strongly influence the way results are perceived.
- Be careful not to overwhelm an audience with details of uncertainties and possible shortcomings.
- A continuing dialogue about uncertainty will facilitate a common understanding.
- Residual uncertainties may make it impossible to draw robust conclusions.

Notes for Slide 40
In presenting uncertainty it is vital to remember that the typical human ability to understand and reason about uncertainty is limited. Different ways of framing results and uncertainties may strongly influence the way results are perceived; this should be considered thoroughly to assure compliance with ethical standards. One should be careful not to overwhelm an audience with details of uncertainties and possible shortcomings; however, a continuing dialogue about uncertainty will facilitate a common understanding. Also, the analyst team should be aware of the possibility that residual uncertainties may make it impossible to draw robust conclusions.

Communicating risk & uncertainty - 3
Support to decision-making under uncertainty is a vital complementary activity to C2 assessment.

Notes for Slide 42
All of this means that support to decision-making under uncertainty is a vital complementary activity to C2 assessment. C2 assessment teams need to include facilitation and consultancy skills as well as sound analysis.

Summary
- Explicit treatment of risk and uncertainty is best practice in all studies, especially C2 assessment.
- Even when study resources are limited, it is best practice to do sensitivity analyses and to take a risk-based approach.
- The use of checklists is recommended to ensure a rigorous treatment.
- The Generic Risk Register (GRR) has proved useful.

Notes for Slide 44
The explicit treatment of risk and uncertainty is best practice in all studies, and is of particular importance in C2 assessment. Even when study resources are limited, it is best practice not only to assess the most likely outcome, but also to do sensitivity analyses looking for other likely outcomes and to take a risk-based approach looking for the more extreme possible outcomes (in particular, failures). The use of checklists is recommended to ensure a rigorous treatment of risk and uncertainty; a number of examples are presented, but these are not a substitute for critical thinking. The Generic Risk Register has proved useful in managing study risk.

Risk and Uncertainty
G.L. Mathieson
DSTL C134, East Court, Portchester Hill Road, Fareham, Hampshire PO17 6AD, UNITED KINGDOM
Glmathieson@dstl.gov.uk

This paper was received as a PowerPoint presentation without supporting text. It was presented at the RTO SAS Symposium on "Analysis of the Military Effectiveness of Future C2 Concepts and Systems", held at NC3A, The Hague, The Netherlands, 23-25 April 2002, and published in RTO-MP-117.
