INSPECTIONS FOR SYSTEMS AND SOFTWARE


Manuel Mastrofini, Madeline Diep, Forrest Shull, Carolyn Seaman*, and Sally Godfrey**
Fraunhofer CESE, College Park, MD
* Fraunhofer CESE and University of Maryland, Baltimore County
** NASA Goddard Space Flight Center
Sponsored by the NASA Software Assurance Research Program

What is Inspection?
A structured process for finding and fixing defects, used to remove defects as early in development as possible.
A simplified model (roles, activities, products):
- Planning: the organizer plans the inspection of the artifact (Planning Form).
- Detection: inspectors examine the artifact for defects (Defect Report Form).
- Collection: the moderator, inspectors, and author consolidate the findings (Defect Collection Form).
- Correction: the author corrects the defects (Defect Correction Form, Corrected Artifact).
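As a rough sketch (not any official NASA process asset), the simplified model above can be captured as data: each phase has a driving role and a standard output form. The names below are placeholders chosen for illustration only.

```python
from dataclasses import dataclass

# Illustrative encoding of the simplified inspection model:
# each phase has a responsible role and produces a standard form.
# Names are placeholders, not part of any NASA process asset.

@dataclass
class Phase:
    name: str
    responsible: str   # role driving the phase
    output: str        # product of the phase

INSPECTION_PHASES = [
    Phase("Planning",   "organizer",  "Planning Form"),
    Phase("Detection",  "inspectors", "Defect Report Form"),
    Phase("Collection", "moderator",  "Defect Collection Form"),
    Phase("Correction", "author",     "Corrected Artifact"),
]

def walk_inspection(artifact: str) -> None:
    """Print the path an artifact takes through the simplified model."""
    print(f"Inspecting: {artifact}")
    for i, phase in enumerate(INSPECTION_PHASES, start=1):
        print(f"  {i}. {phase.name} ({phase.responsible}) -> {phase.output}")

if __name__ == "__main__":
    walk_inspection("Software Requirements Document")
```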

Why Inspection?
A long history of research and application shows that structured human inspection is one of the most cost-effective practices for achieving quality software.
Cost savings rule: for certain types of defects, the cost to find and fix software defects is about 100x higher after delivery than in early lifecycle phases.
- IBM: 117:1 between code and use
- Toshiba: 137:1 between pre- and post-shipment
- Data Analysis Center for Software: 100:1
Inspection effectiveness rule: reviews and inspections find over 50% of the defects in an artifact, regardless of the lifecycle phase in which they are applied.
- 50-70% across many companies (Laitenberger)
- 64% on large projects at Harris GCSD (Elliott)
- 60% in PSP design/code reviews (Roy)
- 50-95%, rising with increased discipline (O'Neill)
- many others
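A back-of-envelope calculation shows how the two rules combine. Only the ~100:1 cost ratio and the 50% effectiveness figure come from the slide; the defect count and the early-fix cost are assumed inputs chosen for illustration.

```python
# Back-of-envelope illustration of the two rules above.
# The defect count and early-fix cost are made-up inputs; only the
# ~100:1 cost ratio and the 50% detection rate come from the slide.

defects_in_artifact = 40          # assumed number of latent defects
cost_fix_early = 1.0              # relative cost to fix during inspection
cost_fix_after_delivery = 100.0   # ~100:1 rule (IBM 117:1, Toshiba 137:1)
inspection_effectiveness = 0.50   # reviews find >= 50% of defects

found_early = defects_in_artifact * inspection_effectiveness
savings = found_early * (cost_fix_after_delivery - cost_fix_early)
print(f"Defects caught by inspection: {found_early:.0f}")
print(f"Relative cost avoided:        {savings:.0f} early-fix units")
# -> catching 20 of 40 defects early avoids ~1980 early-fix-cost units
```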

Problem Statement
System development is often decomposed (system, subsystems, software, hardware, complex electronics) to handle complexity.
Software increasingly plays a larger role in the system: research on system hazards in NASA's Constellation Program revealed that 51% of the hazards contained at least one software cause [Basili et al., 2010]. Yet software is still just one part of the system.
Assurance activities are often conducted independently, and domain knowledge may affect the quality of those activities.
We need a more integrated approach to inspection across the system: for each inspection, consider a holistic view of the system.

Our Proposed Approach
Research goal: provide guidance for teams on planning and conducting inspections across a system that is non-intrusive, cost-effective, and adaptable.
Philosophy: package best practices, including adapting principles from software engineering. Our context is inspections of highly critical systems, but the approach should be generalizable to other domains.
Our contribution: the Health Check, an inspection process assessment methodology.

The Process Health Check
- Assess the current inspection process standards and policies against practice.
- Provide best practices and guidelines for defining an inspection process.
- Identify areas that could benefit from recommendations.

The Process Health Check
[Annotated version of the inspection model diagram (roles, activities, products).] The health check assists with integrating an inspection into the larger system or CE lifecycle, is used during project planning, and has implications for how inspection preparation is carried out.

Methodology Overview
1. Create a baseline of best practices.
2. Package best practices in a framework.
3. Continuously refine the framework: proof-of-concept study, pilot study, deployment of the approach.

Building a Baseline: Sources
Understand current practices for system inspections.
Sources:
- NASA, DoD, and ESA standards and handbooks
- Systems engineering literature
- Well-known software best practices (NASA, ESA, DoD, RUP, literature)
Source re-elaboration:
- Understanding the real issues and needs: a system is different from software.
- Definition of a common taxonomy: different standards can use different taxonomies.
- Gathering and merging best practices: different standards and practices can propose different solutions.

Building a Baseline: Triggering Questions
- What techniques do people use to review system/software quality issues during development?
- Which artifacts serve as input to these techniques?
- Which techniques account for both systems and software?
- How do system engineers and software engineers participate in each other's activities? Should they? How? When?
- Is there any similarity between software inspections and system reviews?
- How can our knowledge and experience in software inspection help to improve the system review process?

Exploring Interactions between Software and System
Reviews are key decision points in both system and software development. Reference models allow us to define system and software reviews that:
- Reason about the types of information and how they are encapsulated in documentation at various phases (what is available as input?).
- Understand issues of timing, coordination, and communication across subsystems (how do we assure that future activities can be done correctly?).

Formulating Recommendations
For each review type, reference models allow us to reason about:
- Structure of the review: team composition and expertise, amount of material to inspect, meeting length.
- Artifacts to be inspected: type and notation of documents.
- Quality attributes: mandatory and optional attributes, which expertise should be checking which qualities, and which artifacts are appropriate for checking various qualities.
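One hypothetical way to encode such a reference-model entry is sketched below. The class, its field names, and the SRR example values are illustrative assumptions, not the project's actual reference models; the perspectives and page-rate figures echo values that appear later in this presentation.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical shape of a reference-model entry for one review type.
# Field names and the SRR example values are illustrative only.

@dataclass
class ReviewReferenceModel:
    review_type: str
    team_perspectives: List[str]        # structure: composition and expertise
    max_pages_per_meeting: int          # structure: amount of material
    max_meeting_hours: float            # structure: meeting length
    input_artifacts: List[str]          # artifacts to be inspected
    quality_attributes: Dict[str, str]  # attribute -> "mandatory" | "optional"

srr_model = ReviewReferenceModel(
    review_type="System Requirements Review (SRR)",
    team_perspectives=["requirements/user",
                       "integration and implementation",
                       "quality and process assurance"],
    max_pages_per_meeting=30,   # assumed, loosely based on the 10-30 page heuristic
    max_meeting_hours=2.0,      # assumed
    input_artifacts=["System Requirements Document", "V&V Matrix"],
    quality_attributes={"completeness": "mandatory",
                        "traceability": "mandatory",
                        "testability": "optional"},
)
```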

Formulating Recommendations
For each review type, reference models allow us to reason about the structure of the review: team composition and expertise, amount of material to inspect, and meeting length.
These parameters have been shown to affect the effectiveness of (software) inspection, and heuristics are available. Did they stand the test of time?

Formulating Recommendations: Inspection Structure
Our recommendations are tested against a database of inspection results from across NASA centers: 2,500+ inspections from 5 centers. We unified, scrubbed, and verified the data.
- Sparseness: not all inspections collected our metrics of interest (e.g., 721 reported the number of inspectors; 627 reported page rate).
- Outliers: we retained extreme values that used the same definition of the metrics, if not of an inspection (e.g., page rates of hundreds of pages per hour, meeting lengths of less than 30 minutes).
- Defect data is sensitive: raw data can be used by us but cannot be shared with other teams.

Formulating Recommendations: Inspection Structure
Work at NASA in the mid-90s by Dr. John Kelly identified heuristics for key parameters (the moderator's control metrics), e.g.:
- Team size: too small and important expertise is missed; too large and costs rise while discussion is dampened. Rule of thumb: 4 to 6 inspectors.
- Page rate: too small and interrelations are missed; too large and a thorough review is impossible. Rule of thumb: 10 to 30 pages for requirements, 20 to 40 pages for test plans, etc.
Our database confirms that these heuristics are still good predictors of the inspections that find the most defects:
- Team size heuristic followed: 14 defects detected on average across all projects; not followed: 7 (significant, p < 0.0005). Yet only 10% of contemporary projects followed it.
- Page rate heuristic followed: 14 defects detected on average; not followed: 6.5 (significant, p < 0.0005). Only 15% of contemporary projects followed it.
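The heuristic check against the database could look roughly like the sketch below. The thresholds are the rules of thumb quoted above; the pandas column names and the toy records are assumptions standing in for the real (sensitive) NASA data.

```python
import pandas as pd

# Illustrative re-creation of the heuristic check applied to the NASA
# inspection database. Column names and the toy records are assumptions;
# the thresholds (team size 4-6, page rate 10-30 pages/hour for
# requirements) are the Kelly-era rules of thumb quoted above.

def tag_heuristics(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["team_ok"] = out["num_inspectors"].between(4, 6)
    out["rate_ok"] = out["pages_per_hour"].between(10, 30)
    return out

def compare(df: pd.DataFrame, flag: str) -> None:
    """Compare mean defects found when a heuristic was / was not followed."""
    means = df.groupby(flag)["defects_found"].mean()
    print(f"{flag}: followed={means.get(True, float('nan')):.1f} "
          f"not followed={means.get(False, float('nan')):.1f}")

# Toy data standing in for the real defect records (not shareable).
inspections = pd.DataFrame({
    "num_inspectors": [3, 5, 6, 8, 4],
    "pages_per_hour": [45, 20, 15, 60, 25],
    "defects_found":  [6, 15, 13, 7, 14],
})
tagged = tag_heuristics(inspections)
compare(tagged, "team_ok")
compare(tagged, "rate_ok")
```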

Formulating Recommendations: Inspection Structure
[Chart for design documents: page rate = 20, the original heuristic (avg = 15.4); page rate = 40, maximize number of defects (avg = 13.1); maximize defects found per hour.]

Packaging Best Practices as an Assessment Process
Assessment questions, with best-practice/recommendation answers, about:
- The development and review process: development model, amount of material to inspect, meeting length.
- The review team: team composition and expertise.
- Artifacts to be inspected and produced: type and notation of documents; inspection metrics.
- Quality attributes: mandatory and optional attributes, which expertise should be checking which qualities, which artifacts are appropriate for checking various qualities.
Context questions: understand the need for tailoring the best practices, tying the recommendations to the project context (development process, etc.).

Health Check Process: An Informal Model
[Diagram of the informal model:]
1. The team provides its process documents.
2. The assessor examines the process documents and consults the 20 sets of questions and answers (covering structures, artifacts, and quality attributes).
3. The assessor asks follow-up questions.
4. The assessor gives recommendations.
5. The assessor examines the feedback.
Red flags (i.e., deviations from expectation) may lead to:
- Recommendations to the inspection process
- Updates to the health-check questions and answers

Health Check Process: Example of an Assessment Question
High-level question: Who are the team members generally required to participate in a review of a particular artifact?
Best-practice recommendation: In most types of reviews, an inspection team should represent at least the following perspectives: requirements/user, integration and implementation, and quality and process assurance.
Detailed/probing questions (if a mismatch occurs):
- If a recommended team member is missing from the actual review team, what is the reason for this omission? Who performs his/her tasks in the actual review team?
- If a member of the actual review team is missing from our recommended team composition, why is this particular member needed? Who performs his/her tasks in the recommended review team?
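The question/answer sets could be encoded along these lines. The class name, fields, and the simple string-matching check are illustrative assumptions that mirror the team-composition example above, not the project's actual health-check format.

```python
from dataclasses import dataclass, field
from typing import List

# Sketch of one health-check question/answer set. The wording mirrors the
# team-composition example above; names and the check logic are illustrative.

@dataclass
class AssessmentItem:
    question: str
    best_practice: str
    probing_questions: List[str] = field(default_factory=list)

    def check(self, actual_answer: str) -> List[str]:
        """Return the probing questions if the actual answer deviates from
        the best practice (a 'red flag'); return an empty list otherwise."""
        if actual_answer.strip().lower() == self.best_practice.strip().lower():
            return []
        return self.probing_questions

team_item = AssessmentItem(
    question="Who is required to participate in a review of this artifact?",
    best_practice=("requirements/user, integration and implementation, "
                   "quality and process assurance"),
    probing_questions=[
        "If a recommended member is missing, what is the reason, and who "
        "covers their tasks in the actual review team?",
        "If an extra member is present, why is this member needed, and who "
        "covers their tasks in the recommended composition?",
    ],
)

# Example: an actual team that lacks the QA perspective triggers the probes.
for q in team_item.check("requirements/user, integration and implementation"):
    print("-", q)
```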

Proof of Concept: Application of the Health Check
Applied with a NASA team developing safety-critical hardware interlocks.
Assessment process:
- Step 1: The team sends us its process documentation (development and assurance process).
- Step 2: We gather answers to the health-check questions and compare them against the expected answers.
- Step 3: We ask follow-up questions and formulate recommendations.
- Step 4: We analyze feedback.

Proof of Concept: Application of the Health Check
Recommendations:
- Issue 1 (process deficiency; forwarded): No inspection is required in the requirements phase. Recommendation: a review should be performed during the requirements phase, perhaps based on our SRR checklists.
- Issue 2 (process error; forwarded): The V&V matrix is only constructed during the design phase. Recommendation: the V&V matrix is based on requirements and is a valuable artifact for SRR; move its development earlier in the lifecycle.
- Issue 3 (documentation error; fixed): The development and evolution of the test plan are not clear. Recommendation: the test plan is a valuable artifact for every type of review and could be created in the early lifecycle phases.
- Issue 4 (process deficiency; fixed): The SRD and SSRD are inputs to the design and implementation phase, but no change or request documents are shown as outputs. Recommendation: it is beneficial to remain open to finding requirement problems even in the later phases of development; note explicitly any constraints that disallow changes to such documents.

Future and Ongoing Work (1)
Further validate and refine our approaches:
- Reach out to teams who would be interested in applying the health check and providing feedback.
- Currently working with a NASA team looking at certification review from both the software and hardware sides.
Further extend our approaches for inspecting complex electronics applications:
- Understand the interface between CE and the system.
- Understand which phase of CE is more closely related to software and which is more closely related to hardware.

Ongoing Work (2)
Expand best-practice recommendations to other V&V technologies:
- Assess the trade-offs of each V&V technique and formulate an assurance strategy based on combinations and/or sequences of techniques.

Acknowledgement
This work was sponsored by a grant from NASA's Software Assurance Research Program (SARP), "Inspections for Systems and Software."
Contact us:
- Madeline Diep, mdiep@fc-md.umd.edu
- Forrest Shull, fshull@fc-md.umd.edu