Exploratory and Experience Based Testing

Exploratory and Experience Based Testing
21.11.2011
Juha Itkonen
Aalto University School of Science, Department of Computer Science and Engineering

Contents
- Intelligent Manual Testing: experience based testing, exploratory testing
- Ways of Exploring: Session Based Test Management, touring testing
- Intelligent Manual Testing Practices: examples of empirically identified testing practices
- Benefits of Experience Based Testing

Manual Testing
Testing that is performed by human testers. Research has shown that:
1. Individual differences in testing are high
2. Test case design techniques alone do not explain the results
The stereotype of manual testing: executing detailed pre-designed test cases, mechanically following step-by-step instructions, treated as work that anybody can do. In practice, it is clear that some testers are better than others at manual testing and more effective at revealing defects.

Traditional emphasis on test documentation
Test case design and documentation are over-emphasized, both in textbooks and in research. Test cases make test designs tangible, reviewable, and easy to plan and track, that is, to manage and control. In many contexts test cases and other test design documentation are needed, but the level and type of test documentation should vary based on context.

Experience is invaluable in software testing
Domain experience: knowledge and skills gained in the application domain area
- How the system is used in practice, and by whom
- What the goals of the users are
- How the system is related to the customer's (business) processes
Technical system experience
- How the system was built
- What the typical problems and defects are
- How the system is used and how all the details work
- How things work together and interact
Testing experience
- Knowledge of testing methods and techniques
- Testing skills grown in practice

Software testing is creative and exploratory work. It requires skills and knowledge of:
- the application domain
- the users
- processes and objectives
- some level of technical detail and the history of the application under test
It also requires a certain kind of attitude.

Tester's Attitude
People tend to see what they want or expect to see. If you want to show that the software works correctly, you will miss defects. The tester's goal is to break the software:
- Reveal all relevant defects
- Find out any problems real users would experience in practice
Testing is all about exceptions, special cases, invalid inputs, error situations, and complicated unexpected combinations.

Tester's Goal
Explore, investigate, and measure. Provide quality-related information for other stakeholders in a useful form. The tester's attitude is destructive towards the software under test, but highly constructive towards people.

My viewpoint: Experience Based Intelligent Manual Testing
Manual testing that builds on the tester's experience, knowledge, and skills:
- Some aspects of testing rely on the tester's skills during testing, e.g., input values, expected results, or interactions
- Testers are assumed to know what they are doing
- Testing does not mean executing detailed scripts
Focus on the actual testing work in practice:
- What happens during testing activities?
- How are defects actually found?
- Experience-based and exploratory aspects of software testing

Exploratory Testing is creative testing without predefined test cases, based on the knowledge and skills of the tester.
1. Tests are not defined in advance: exploring with a general mission, without specific step-by-step instructions on how to accomplish it.
2. Testing is guided by the results of previously performed tests and the knowledge gained from them; testers can apply deductive reasoning to the test outcomes.
3. The focus is on finding defects by exploration, instead of demonstrating systematic coverage.
4. Learning of the system under test, test design, and test execution happen in parallel.
5. The experience and skills of an individual tester strongly affect effectiveness and results.

Document-driven vs. exploratory testing (figure contrasting the two flows of tests)

Exploratory Testing is an approach
Most testing techniques can be used in an exploratory way. Exploratory testing and (automated) scripted testing are the ends of a continuum:
Freestyle exploratory bug hunting | Chartered exploratory testing | High-level test cases | Manual scripts | Pure scripted (automated) testing

Lateral thinking
- You are allowed to be distracted
- Find side paths and explore interesting areas
- Periodically check your status against your mission

Scripted vs. Exploratory Tests: the minefield analogy (figure showing bugs and fixes)

Two views of agile testing

eXtreme Testing
- Automated unit testing: developers write tests, test-first development, daily builds with unit tests always passing 100%
- Functional (acceptance) testing: customer-owned, comprehensive, repeatable, automatic, timely, public
- Focus on automated verification, enabling agile software development

Exploratory Testing
- Utilizes professional testers' skills and experience
- Optimized to find bugs
- Minimizes time spent on documentation
- Continually adjusts plans, refocusing on the most promising risk areas
- Follows hunches
- Freedom, flexibility, and fun for testers
- Focus on manual validation, making testing activities agile


Some ways of exploring in practice
- Freestyle exploratory testing: unmanaged ET
- Functional testing of individual features
- Exploring high-level test cases
- Exploratory regression testing by verifying fixes or changes
- Session-based exploratory testing
- Exploring like a tourist
- Outsourced exploratory testing: advanced users, strong domain knowledge, beta testing

Session Based Test Management (SBTM)
Bach, J. "Session-Based Test Management", STQE, vol. 2, no. 6, 2000. http://www.satisfice.com/articles/sbtm.pdf
Key elements: Charter, Time Box, Reviewable Result, Debriefing

Session-Based Testing: a way to manage ET
- Enables planning and tracking exploratory testing without detailed test (case) designs
- Divides testing work into small chunks
- Tracks testing work in time-boxed sessions
- Efficient: no unnecessary documentation
- Agile: it is easy to focus testing on the most important areas based on test results and other information (changes in requirements, increasing understanding, revealed problems, identified risks, ...)
- Explicit, scheduled sessions can help get testing done when resources are scarce, e.g., when testers are not full-time testers
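The SBTM elements named above (charter, time box, reviewable result, debriefing) can be sketched as a simple session record. This is only an illustration of the idea; the class and field names are invented here, not part of any SBTM tool:

```python
from dataclasses import dataclass, field

@dataclass
class TestSession:
    """One time-boxed SBTM session (illustrative sketch, not a standard)."""
    charter: str                # the mission, a short sentence, not step-by-step instructions
    time_box_minutes: int       # sessions are typically around 60-120 minutes
    notes: list = field(default_factory=list)   # reviewable result: what was actually tested
    bugs: list = field(default_factory=list)    # defects found during the session
    debriefed: bool = False     # set to True after the debriefing with the test lead

    def log(self, note: str) -> None:
        self.notes.append(note)

    def report_bug(self, summary: str) -> None:
        self.bugs.append(summary)

# A hypothetical session: the charter gives direction, the tester explores freely.
session = TestSession(
    charter="Explore the customer search with unusual name inputs",
    time_box_minutes=90,
)
session.log("Tried names with umlauts, apostrophes, and leading spaces")
session.report_bug("Search crashes on a name containing only whitespace")
print(len(session.bugs))  # 1
```

The point of the structure is that planning and tracking happen at the level of charters and sessions, not individual test cases.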

Exploring like a tourist: a way to guide ET sessions
Touring tests use the tourist metaphor to guide testers' actions. The focus is on intent rather than separate features; this intent is communicated as tours in different districts of the software.
James A. Whittaker. Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley, 2009.

Districts and Tours
- Business district: Guidebook tour, Money tour, Landmark tour, Intellectual tour, FedEx tour, After-hours tour, Garbage collector's tour
- Historical district: Bad-Neighborhood tour, Museum tour, Prior version tour
- Entertainment district: Supporting actor tour, Back alley tour, All-nighter tour
- Tourist district: Collector's tour, Lonely businessman tour, Supermodel tour, TOGOF tour, Scottish pub tour
- Hotel district: Rained-out tour, Couch potato tour
- Seedy district: Saboteur tour, Antisocial tour, Obsessive-compulsive tour
(James A. Whittaker. Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley, 2009.)

Examples of exploratory testing tours

The Guidebook Tour
- Use the user manual or other documentation as a guide; test rigorously by the guide
- Tests the details of important features, and also tests the guide itself
- Variations: Blogger's tour (third-party advice as the guide), Pundit's tour (product reviews as the guide), Competitor's tour

The Garbage Collector's Tour
- Choose a goal and then visit each item by the shortest path: screen by screen, dialog by dialog, feature by feature
- Test every corner of the software, but not very deeply in the details

The All-Nighter Tour
- Never close the app; use the features continuously
- Keep the software running, keep files open, connect and don't disconnect, don't save, move data around, add and remove, use sleep and hibernation modes, ...


Empirically observed practices from industry
- Testing, not test case pre-design
- Practices work on different levels of abstraction
- Many practices are similar to traditional test case design techniques
- Many practices are similar to more general testing strategies, heuristics, or rules of thumb

Overall strategies
- Structuring testing work
- Guiding a tester through features
Detailed techniques
- Low-level test design
- Defect hypotheses
- Checking the test results

Overall strategies
- Exploratory: Exploring weak areas, Aspect-oriented testing, User interface exploring, Top-down functional exploring, Simulating a real usage scenario
- Documentation based: Data as test cases, Exploring high-level test cases, Checking new and changed features, Smoke testing by intuition and experience

Detailed techniques
- Input: Testing input alternatives, Testing boundaries and restrictions, Covering input combinations, Testing to-and-from the feature
- Exploratory: Testing alternative ways, Exploring against old functionality, Simulating abnormal and extreme situations, Persistence testing, Feature interaction testing, Defect-based exploring
- Comparison: Comparing with another application or version, Comparing within the software, Checking all the effects, End-to-end data check
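One of the techniques listed above, covering input combinations, has a mechanical core that is easy to sketch: enumerate the combinations of a few input dimensions and treat each one as a test idea. The feature and its input values below are invented for the example:

```python
import itertools

# Hypothetical input dimensions for a file-export feature.
formats = ["csv", "xml", "pdf"]
encodings = ["utf-8", "latin-1"]
sizes = ["empty", "typical", "huge"]

# Full cartesian coverage: every combination is one test idea for
# the tester to explore (3 * 2 * 3 = 18 combinations).
combinations = list(itertools.product(formats, encodings, sizes))
print(len(combinations))  # 18
```

In exploratory use the list is a checklist of ideas, not a script: the tester still decides how each combination is exercised and what to look at.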

Basic objectives in testing activities
- Exploring: guiding the tester through the functionality
- Coverage: selecting what gets tested and what does not
- Oracle: deciding if the results are correct
- Risks: detecting specific types of defects
- Prioritization: selecting what to test first

<exploratory strategy> Exploring weak areas
Description: Exploring areas of the software that are weak or risky based on the experience and knowledge of the tester.
Goal: Reveal defects in areas that are somehow known to be risky; focus testing on risky areas.
Indicators of a weak area: complicated, coded in a hurry, lots of changes, the coders' opinion, the testers' opinion, who implemented it, a hunch, ...

<exploratory strategy> Top-down functional exploring
Description: Proceed in testing by first going through typical cases and simple checks, then gradually deeper into the details of the tested functionality, applying more complicated tests.
Goal: First get a high-level understanding of the function, then build deeper confidence in its quality step by step.
- Is this function implemented?
- Does it do the right thing?
- Is there missing functionality?
- Does it handle the exceptions and special cases?
- Does it work together with the rest of the system?
- How about error handling and recovery?

<documentation based strategy> Using data as test cases
Description: A pre-defined test data set includes all relevant cases and combinations of different data and situations. Covering all cases in the pre-defined data set provides the required coverage. Testing is exploratory, but the pre-defined data set is used to achieve systematic coverage. Suitable for situations where the data is complex but the operations are simple, or when creating the data requires much effort.
Goal: Manage exploratory testing based on pre-defined test data; achieve and measure coverage in exploratory testing.
Example: Different types of customers in a CRM system: user privileges; situation, services, relationships; history, data.
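A minimal sketch of this strategy: the pre-defined data set drives the sessions and defines coverage, while what the tester does with each record stays exploratory. The CRM customer records and the session function below are invented for the example:

```python
# Hypothetical pre-defined data set for a CRM: each record is one
# coverage item that an exploratory session must visit.
CUSTOMER_DATA_SET = [
    {"type": "private",  "privileges": "basic", "has_history": False},
    {"type": "business", "privileges": "admin", "has_history": True},
    {"type": "reseller", "privileges": "basic", "has_history": True},
]

def run_exploratory_session(customer: dict) -> dict:
    """Placeholder for one exploratory session around a data record.
    In practice the tester opens the CRM with this customer loaded and
    explores freely; here we only mark the record as covered."""
    return {"customer": customer["type"], "covered": True}

# Coverage is measured against the data set, not against test cases.
results = [run_exploratory_session(c) for c in CUSTOMER_DATA_SET]
coverage = sum(r["covered"] for r in results) / len(CUSTOMER_DATA_SET)
print(f"data-set coverage: {coverage:.0%}")  # data-set coverage: 100%
```

This gives exploratory testing a measurable notion of "done" (all records visited) without prescribing the steps of any session.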

<comparison technique> Comparing within the software
Description: Comparing similar features in different places of the same system and testing their consistency.
Goal: Investigate and reveal problems in the consistency of functionality inside the software; help decide whether a feature works correctly or not.

<input technique> Testing to-and-from the feature
Description: Test all things that affect the feature, and test all things that are affected by the feature.
Goal: Systematically cover the feature's interactions. Reveal defects that are caused by a not-the-most-obvious relationship between the tested feature and other features or the environment.

Ways of utilizing IMT practices
- Training testers
- Guiding test execution
- Test documentation and tracking
- Test patterns for different situations


Strengths of experience based testing: testers' skills
- Utilizes the skills and experience of the tester:
  - Testers know how the software is used and for what purpose
  - Testers know what functionality and features are critical
  - Testers know what problems are relevant
  - Testers know how the software was built (risks, tacit knowledge)
- Enables creative exploring
- Enables fast learning and improving testing: investigating, searching, finding, combining, reasoning, deducing, ...
- Supports testing intangible properties: look and feel and other user perceptions

Strengths of experience based testing: process
- Agility and flexibility: easy and fast to focus on critical areas, fast reaction to changes, ability to work with missing or weak documentation
- Effectiveness: reveals a large number of relevant defects
- Efficiency: low documentation overhead, fast feedback

Challenges of experience based testing
- Planning and tracking: How much testing is needed, and how long does it take? What is the status of testing? How should testing work be shared between testers?
- Managing test coverage: What has been tested? When are we done?
- Logging and reporting: visibility outside the testing team or outside individual testing sessions
- Quality of testing: how to assure the quality of the tester's work (detailed test cases can at least be reviewed)

Reasons for documenting test cases
- Optimizing: selecting an optimal test set, avoiding redundancy
- Organization: tests organized so that they can be reviewed and used effectively; selecting and prioritizing
- Repeatability: knowing which test cases were run and how, so that the same tests can be repeated
- Tracking: which requirements, features, or components are tested; what the coverage of testing is; how testing proceeds; whether we are going to make the deadline
- Proof of testing: evaluating the level of confidence; how do we know what has been tested?

Detail level of test cases
Experienced testers need less detailed test cases: more experienced as testers, and more familiar with the software and the application domain.
Input conditions: the needed detail depends on the testing technique and the goals of testing. For example, if the goal is to cover all pairs of certain input conditions, the test cases have to be more detailed than in scenario testing.
Expected results: more detail is required if the result is not obvious, requires a complicated comparison, etc. An inexperienced tester needs more guidance on what to pay attention to.

Why should we document the expected outcome?
The expected values explicitly define what the correct result is. This is important if the correct result is not obvious.
If expected values are not defined:
- The tester does not know the correct behavior and assumes it
- The tester wants and expects to see the correct behavior, and has less work if the software works correctly
- Many defects are not found: "Looks OK to me!"
On the other hand, if the tester is provided with detailed expected results, the tester just looks for the exact details pointed out and ignores everything else, so many unexpected defects are missed.
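The difference between an explicit expected value and a vague check can be shown with a toy oracle. The function and its rounding rule are invented for the example:

```python
def invoice_total(amounts):
    """Toy function under test (invented for the example):
    sums line amounts and rounds to two decimals."""
    return round(sum(amounts), 2)

# With a documented expected value the oracle is explicit: the test
# fails unless the result is exactly the defined correct value.
assert invoice_total([19.99, 0.011]) == 20.0

# Without a documented expected value, a tester falls back on a vague
# check that would pass for many wrong results too ("looks OK to me").
assert invoice_total([19.99, 0.011]) > 0
```

Both assertions pass here, but only the first one would catch, say, a broken rounding rule; the second illustrates how undefined expected values let defects slip through.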

Experimental comparison of ET and test case based testing (TCBT)
Itkonen, J., Mäntylä, M.V. & Lassenius, C. "Defect Detection Efficiency: Test Case Based vs. Exploratory Testing", in Proceedings of the International Symposium on Empirical Software Engineering and Measurement, pp. 61-70, 2007.
- Effectiveness was measured in terms of revealed defects; test execution time was fixed
- No difference in effectiveness: ET revealed more defects, but the difference was not statistically significant
- TCBT required much more effort, due to test case design before the test execution
- TCBT produced twice as many false defect reports as ET

Who tested my software?
Mäntylä, M.V., Itkonen, J. & Iivonen, J. "Who Tested My Software? Testing as an Organizationally Cross-Cutting Activity", Software Quality Journal, 2011.
- Testing is not an activity performed solely by specialists. In all our cases, people in roles varying from sales to software development found a substantial number of defects.
- Validation from the viewpoint of end users was found more valuable than verification aiming at zero-defect software.

Who tested my software? (continued)
- Developers' defects had the highest fix rate, and specialized testers' defects had the lowest fix rate.
- People with a personal stake in the product (e.g., sales and consulting personnel) tend to place more importance on their defects, but this does not seem to improve their fix ratios.

The role of knowledge in failure detection
Itkonen, J., Mäntylä, M.V. & Lassenius, C. "The Role of Knowledge in Failure Detection During Exploratory Software Testing", in review for IEEE Transactions on Software Engineering, 2011.
- Detailed analysis of 91 defect detection incidents from video-recorded exploratory testing sessions
- Analyzed what type of knowledge is required for detecting failures
- Analyzed failure detection difficulty

The role of knowledge in failure detection: findings
- Knowledge utilization: testers are able to utilize their personal knowledge of the application domain, the users' needs, and the tested system for defect detection.
- Side-effect bugs: in exploratory testing, testers frequently recognize relevant failures in a wider set of features than the actual target features of the testing activity.
- Obvious failures: a large number of the failures in software applications and systems can be detected without detailed test design or descriptions.
- Application domain related failures are simple to reveal: the majority of failures detected by domain knowledge are straightforward to recognize.

The role of knowledge in failure detection: conclusions
- The ET approach could be effective even when less experienced testers are used.
- Not all testing needs to be scripted or rigorously (pre)designed; a lot of benefit can be achieved by efficient exploring.
- The ET approach is an effective way of involving in testing activities the knowledge of domain experts who do not have testing expertise.

Questions and more discussion
Contact information:
Juha Itkonen
juha.itkonen@aalto.fi
+358 50 577 1688
http://www.soberit.hut.fi/jitkonen

References (primary)
- Bach, J., 2000. Session-Based Test Management. Software Testing and Quality Engineering, 2(6). http://www.satisfice.com/articles/sbtm.pdf
- Bach, J., 2004. Exploratory Testing. In E. van Veenendaal, ed. The Testing Practitioner. Den Bosch: UTN Publishers, pp. 253-265. http://www.satisfice.com/articles/et-article.pdf
- Itkonen, J. & Rautiainen, K., 2005. Exploratory testing: a multiple case study. In Proceedings of the International Symposium on Empirical Software Engineering, pp. 84-93.
- Itkonen, J., Mäntylä, M.V. & Lassenius, C., 2007. Defect Detection Efficiency: Test Case Based vs. Exploratory Testing. In Proceedings of the International Symposium on Empirical Software Engineering and Measurement, pp. 61-70.
- Itkonen, J., Mäntylä, M.V. & Lassenius, C., 2009. How do testers do it? An exploratory study on manual testing practices. In Proceedings of the 3rd International Symposium on Empirical Software Engineering and Measurement (ESEM 2009), pp. 494-497.
- Itkonen, J., 2011. Empirical Studies on Exploratory Software Testing. Doctoral dissertation, Aalto University School of Science. http://lib.tkk.fi/diss/2011/isbn9789526043395/
- Lyndsay, J. & van Eeden, N., 2003. Adventures in Session-Based Testing. http://www.workroom-productions.com/papers/AiSBTv1.2.pdf
- Martin, D. et al., 2007. 'Good' Organisational Reasons for 'Bad' Software Testing: An Ethnographic Study of Testing in a Small Software Company. In Proceedings of the International Conference on Software Engineering, pp. 602-611.
- Whittaker, J.A., 2009. Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley Professional.

References (secondary)
- Agruss, C. & Johnson, B., 2005. Ad Hoc Software Testing.
- Naseer, A. & Zulfiqar, M., 2010. Investigating Exploratory Testing in Industrial Practice. Master's thesis, Blekinge Institute of Technology, Rönneby, Sweden. http://www.bth.se/fou/cuppsats.nsf/all/8147b5e26911adb2c125778f003d6320/$file/mse-2010-15.pdf
- Armour, P.G., 2005. The unconscious art of software testing. Communications of the ACM, 48(1), pp. 15-18.
- Beer, A. & Ramler, R., 2008. The Role of Experience in Software Testing Practice. In Proceedings of the Euromicro Conference on Software Engineering and Advanced Applications, pp. 258-265.
- Houdek, F., Schwinn, T. & Ernst, D., 2002. Defect Detection for Executable Specifications: An Experiment. International Journal of Software Engineering & Knowledge Engineering, 12(6), p. 637.
- Kaner, C., Bach, J. & Pettichord, B., 2002. Lessons Learned in Software Testing. New York: John Wiley & Sons.
- Martin, D. et al., 2007. 'Good' Organisational Reasons for 'Bad' Software Testing: An Ethnographic Study of Testing in a Small Software Company. In Proceedings of the International Conference on Software Engineering, pp. 602-611.
- Tinkham, A. & Kaner, C., 2003. Learning Styles and Exploratory Testing. In Pacific Northwest Software Quality Conference (PNSQC).
- Wood, B. & James, D., 2003. Applying Session-Based Testing to Medical Software. Medical Device & Diagnostic Industry, p. 90.
- Våga, J. & Amland, S., 2002. Managing High-Speed Web Testing. In D. Meyerhoff et al., eds. Software Quality and Software Testing in Internet Times. Berlin: Springer-Verlag, pp. 23-30.