Testing Safety-Related Software
Springer
London
Berlin
Heidelberg
New York
Barcelona
Hong Kong
Milan
Paris
Santa Clara
Singapore
Tokyo
Stewart Gardiner (Ed.)
Testing
Safety-Related
Software
A Practical Handbook
With 39 Figures
Springer
Stewart N. Gardiner, BSc, PhD, CEng
Ernst and Young, Management Consultants, George House,
50 George Square, Glasgow, G2 1RR, UK
No representation or warranty, express or implied, is made or given by or on behalf of the editor
and the contributors or any of their respective directors or affiliates or any other person as to the
accuracy, completeness or fairness of the information, opinions or feasibility contained in this book
and this book should not be relied upon by any third parties who should conduct their own
investigations of this book and the matters set out herein and furthermore no responsibility or
liability in negligence or otherwise is accepted for any such information, opinions or feasibility and
the editor and contributors shall not be liable for any indirect or consequential loss caused by or
arising from any such information, opinions or feasibility being loss of profit and/or loss of
production. The contributors are: BAeSEMA Limited, G P Elliot Electronic Systems Limited,
Lloyd's Register, Lucas Aerospace Limited, Rolls Royce Industrial Controls Limited, Nuclear
Electric Limited, Rolls Royce plc, Scottish Nuclear Limited and The University of Warwick.
ISBN-13: 978-1-85233-034-7 e-ISBN-13: 978-1-4471-3277-6
DOI: 10.1007/978-1-4471-3277-6
British Library Cataloguing in Publication Data
Testing safety-related software : a practical handbook
1. Computer software - Testing 2. Computer software -
Reliability 3. Industrial safety - Computer programs
I. Gardiner, Stewart
005.1'4
ISBN-13: 978-1-85233-034-7
Library of Congress Cataloging-in-Publication Data
Testing safety-related software : a practical handbook / Stewart
Gardiner, ed.
p. cm.
Includes bibliographical references and index.
ISBN-13: 978-1-85233-034-7
1. Computer software - Testing. 2. Industrial safety - Software
- Testing. I. Gardiner, Stewart, 1945-
QA76.76.T48T476 1998 98-34276
620.8'6'028514-dc21 CIP
Apart from any fair dealing for the purposes of research or private study, or criticism or review, as
permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced,
stored or transmitted, in any form or by any means, with the prior permission in writing of the
publishers, or in the case of reprographic reproduction in accordance with the terms of licences issued by
the Copyright Licensing Agency. Enquiries concerning reproduction outside those terms should be sent
to the publishers.
© Springer-Verlag London Limited 1999
The use of registered names, trademarks etc. in this publication does not imply, even in the absence of a
specific statement, that such names are exempt from the relevant laws and regulations and therefore free
for general use.
The publisher makes no representation, express or implied, with regard to the accuracy of the
information contained in this book and cannot accept any legal responsibility or liability for any errors or
omissions that may be made.
Typesetting: Camera ready by editor
34/3830-543210 Printed on acid-free paper
Acknowledgements
This book is based upon technical reports produced by a number of authors
as part of the collaborative research project CONTESSE, to which many
people contributed. The project was partly funded by the UK Department of
Trade and Industry (DTI) and the Engineering and Physical Sciences
Research Council (EPSRC) and was carried out between 1992 and 1995.
The editor gratefully acknowledges the following contributions to the
book:
Preparation of the final book:
K. Czachur (Lloyd's Register)
K. Khondar (University of Warwick)
R. Lowe, M. Mills and A. MacKenzie
(BAeSEMA)
Authors of the technical reports that form the basis of the book:
BAeSEMA Dr. S. Gardiner
R. Lowe
A. MacKenzie
H. Morton
S. Thomson
G P Elliot Electronic Systems M. Ashman
K. Homewood
D. Marshall
A. Smith
J. Tennis
Lloyd's Register S. Allen
K. Czachur
K. Lee
Prof. C. MacFarlane
(University of Strathclyde)
B. McMillan (University of Strathclyde)
R. Melville (Brown Brothers)
Lucas Aerospace A. Ashdown
D. Hodgson
NEI Control Systems Dr. J. Parkinson
A. Rounding
Nuclear Electric Dr. I. Andrews
G. Hughes
D. Pavey
Prof. P. Hall (The Open University)
Dr. J. May (The Open University)
Dr. H. Zhu (The Open University)
Dr. A. D. Lunn (The Open University)
Rolls-Royce M. Beeby
T. Cockram
S. Dootson
N. Hayes
Dr. J. Kelly
P. Summers
Dr. E. Williams
Prof. A. Burns (University of York)
Dr. D. Jackson (Formal Systems (Europe))
Dr. M. Richardson (University of York)
Scottish Nuclear I. O'Neill
P. Pymm
The University of Warwick Dr. F. Craine
Prof. J. Cullyer
K. Khondkar
Dr. N. Storey
The helpful advice of Tony Levene, the Project Monitoring Officer for the DTI,
is acknowledged.
Crown Copyright is reproduced with the permission of the Controller of
Her Majesty's Stationery Office.
Extracts from IEC standards are reproduced with permission under licence
no. BSI\PD\19981028. Complete editions of the standards can be obtained
by post through national standards bodies.
Camera-ready copy was created by Syntagma, Falmouth.
Stewart Gardiner - CONTESSE project manager
(Now with Ernst and Young, Management Consultancy
Services.)
July 1998
Contents
1 Introduction ................................................................................................... 1
1.1 Context.................................................................................................. 1
1.2 Audience............................................................................................... 2
1.3 Structure ............................................................................................... 3
1.4 Applicable Systems............................................................................. 4
1.5 Integrity Levels.................................................................................... 5
1.6 Typical Architectures.......................................................................... 5
1.7 The Safety Lifecycle and the Safety Case......................... 17
1.8 Testing Issues across the Development Lifecycle........................... 18
1.9 Tool Support ........................................................................................ 22
1.10 Current Industrial Practice ................................................................ 23
1.11 The Significance Placed upon Testing
by Standards and Guidelines ............................................................ 29
1.12 Guidance............................................................................................... 31
2 Testing and the Safety Case ......................................................................... 33
2.1 Introduction ......................................................................................... 33
2.2 Safety and Risk Assessment .............................................................. 34
2.3 Hazard Analysis .................................................................................. 35
2.4 The System Safety Case ...................................................................... 41
2.5 Lifecycle Issues .................................................................................... 45
2.6 Guidance............................................................................................... 54
3 Designing for Testability ............................................................................. 59
3.1 Introduction ......................................................................................... 59
3.2 Architectural Considerations ............................................................ 61
3.3 PES Interface Considerations ............................................................ 62
3.4 Implementation Options and Testing Attributes ........................... 64
3.5 Software Features................................................................................ 76
3.6 Guidance............................................................................................... 82
4 Testing of Timing Aspects ........................................................................... 83
4.1 Introduction ......................................................................................... 83
4.2 Correctness of Timing Requirements............................................... 84
4.3 Scheduling Issues ................................................................................ 86
4.4 Scheduling Strategies.......................................................................... 89
4.5 Calculating Worst Case Execution Times........................................ 94
4.6 Guidance ............................................................................................... 100
5 The Test Environment................................................................................... 101
5.1 Introduction ......................................................................................... 101
5.2 Test Activities Related to the Development of a Safety Case ....... 102
5.3 A Generic Test Toolset........................................................................ 104
5.4 Safety and Quality Requirements for Test Tools ............................ 107
5.5 Statemate .............................................................................................. 110
5.6 Requirements and Traceability Management (RTM) .................... 113
5.7 AdaTEST ............................................................................................... 116
5.8 Integrated Tool Support ..................................................................... 121
5.9 Tool Selection Criteria ........................................................................ 121
5.10 Guidance............................................................................................... 123
6 The Use of Simulators .................................................................................. 125
6.1 Introduction ......................................................................................... 125
6.2 Types of Environment Simulators .................................................... 126
6.3 Use of Software Environment Simulation
in Testing Safety-Related Systems .................................................... 128
6.4 Environment Simulation Accuracy
and its Assessment Based on the Set Theory Model...................... 132
6.5 Justification of Safety from Environment Simulation.................... 139
6.6 Guidance............................................................................................... 141
7 Test Adequacy ................................................................................................ 143
7.1 Introduction ......................................................................................... 143
7.2 The Notion of Test Adequacy ........................................................... 143
7.3 The Role of Test Data Adequacy Criteria ........................................ 144
7.4 Approaches to Measurement of Software Test Adequacy............ 147
7.5 The Use of Test Data Adequacy........................................................ 152
7.6 Guidance............................................................................................... 154
8 Statistical Software Testing ......................................................................... 155
8.1 Introduction ......................................................................................... 155
8.2 Statistical Software Testing and Related Work ............................... 156
8.3 Test Adequacy and Statistical Software Testing............................. 157
8.4 Environment Simulations in Dynamic Software Testing.............. 159
8.5 Performing Statistical Software Testing........................................... 160
8.6 The Notion of Confidence in Statistical Software Testing ............ 166
8.7 Criticisms of Statistical Software Testing ........................................ 167
8.8 The Future of Statistical Software Testing....................................... 168
8.9 Guidance............................................................................................... 170
9 Empirical Quantifiable Measures of Testing ........................................... 171
9.1 Introduction ......................................................................................... 171
9.2 Test Cost Assessment ......................................................................... 171
9.3 Test Regime Assessment .................................................................... 177
9.4 Discussion of Test Regime Assessment Model ............................. 188
9.5 Evidence to Support the Test Regime Assessment Model .......... 191
9.6 Guidance............................................................................................... 194
References ........................................................................................................... 195
Appendix A Summary of Advice from the Standards ............................ 201
Bibliography ....................................................................................................... 213
Index .................................................................................................................... 219
Chapter 1
Introduction
As software is very complex, we can only test a limited range of the possible states of
the software in a reasonable time frame.
In 1972, Dijkstra [1] claimed that 'program testing can be used to show the presence
of bugs, but never their absence' to persuade us that a testing approach alone is
not acceptable. This frequently quoted statement represented our knowledge about
software testing at that time, and after more than 25 years of intensive practice,
experiment and research, although software testing has developed into a validation
and verification technique indispensable to the software engineering discipline,
Dijkstra's statement is still valid.
To gain confidence in the safety of software based systems we must therefore assess
both the product and the process of its development. Testing is one of the main ways
of assessing the product, but it must be seen, together with process assessment, in the
context of an overall safety case. This book provides guidance on how to make best use
of the limited resources available for testing and to maximise the contribution that
testing of the product makes to the safety case.
1.1 Context
The safety assurance of software based systems is a complex task, as most
failures stem from design errors committed by humans. To provide safety
assurance, evidence needs to be gathered on the integrity of the system and put
forward as an argued case (the safety case) that the system is adequately safe.
It is effectively impossible to produce software based systems that are
completely safe, as there will always be residual errors in the software which
have the potential to cause failures that lead to hazards. It is also important to
build systems that are affordable. A judgement needs to be made on the cost
incurred in ensuring that the system is safe.
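The scale of the problem can be illustrated with a back-of-envelope calculation. This is a hypothetical sketch: the interface size (two 32-bit inputs) and the test throughput (one million cases per second) are illustrative assumptions, not figures from the text.

```python
# Back-of-envelope: exhaustive testing of even a small interface is infeasible.
# Assumptions (illustrative only): a function taking two 32-bit inputs,
# exercised at one million test cases per second.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_test_exhaustively(input_bits: int, tests_per_second: float) -> float:
    """Years needed to enumerate every input combination once."""
    total_cases = 2 ** input_bits
    return total_cases / tests_per_second / SECONDS_PER_YEAR

# Two 32-bit inputs give 64 bits of input state: 2^64 combinations.
years = years_to_test_exhaustively(64, 1e6)
print(f"{years:.3g} years")  # roughly half a million years
```

Even this ignores internal state and timing, which multiply the state space further; hence the emphasis throughout this book on directing limited testing effort where it contributes most to the safety case.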
Testing is one of the main ways of assessing the integrity (safety) of a
software based system. Exhaustive testing of the software is all but impossible, as
the time taken to gain a credible estimate of its failure rate is excessive except
for systems with the lower levels of safety integrity requirement. To gain
confidence in the safety of a software based system, both the product (the system)
and the process of its development need to be assessed. Littlewood provides