Engineering a sustainable built environment
Tests for software accreditation
and verification
CIBSE TM33: 2006
Chartered Institution of Building Services Engineers
222 Balham High Road, London SW12 9BS
The rights of publication or translation are reserved.
No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any
means without prior permission.
© April 2006 The Chartered Institution of Building Services Engineers,
London SW12 9BS
Registered Charity Number 278104
ISBN-10: 1-903287-69-3
ISBN-13: 978-1-903287-69-9
This document is based on the best knowledge available at the time of publication. However no responsibility
of any kind for any injury, death, loss, damage or delay however caused resulting from the use of these
recommendations can be accepted by the Chartered Institution of Building Services Engineers, the authors or
others involved in its publication.
In adopting these recommendations for use each adopter by doing so agrees to accept full responsibility for
any personal injury, death, loss, damage or delay arising out of or in connection with their use by or on behalf
of such adopter irrespective of the cause or reason therefore and agrees to defend, indemnify and hold
harmless the above named bodies, the authors and others involved in their publication from any and all liability
arising out of or in connection with such use as aforesaid and irrespective of any negligence on the part of
those indemnified.
Typeset by CIBSE Publications
Note from the publisher
This publication is primarily intended to provide guidance to those responsible for the design,
installation, commissioning, operation and maintenance of building services. It is not intended to be
exhaustive or definitive and it will be necessary for users of the guidance given to exercise their own
professional judgement when deciding whether to abide by or depart from it.
Amendments
This version (issue 3) of TM33 incorporates the following amendments.
Date Page Amendment
11/04/06 24 Table 2.29a added
13/04/06 25 Amendment to Table 2.31
13/04/06 33 Note added regarding listing of constructions
13/04/06 34 Amendments to Table 3.8
13/04/06 35 Amendment to Table 3.12
13/04/06 36 Amendment to Table 3.17
13/04/06 36 Amendments to Tables 3.18 and 3.19
5/05/06 4 Addition to listed standards for accreditation; references renumbered; note added
on incorrect results given in BS EN ISO 13791
5/05/06 6 Amendment to Table 2.1
5/05/06 8 Amendments to Tables 2.5, 2.6 and 2.7
5/05/06 24 Amendment to Table 2.29a
5/05/06 26 Amendment to Table 2.32
5/05/06 28 Amendments to input data for AHU3
5/05/06 29 Amendments to dates for test AHU3
5/05/06 29 Amendments to Tables 2.34 and 2.35
5/05/06 33 Amendments to Tables 3.4 and 3.5
5/05/06 34 Amendments to Tables 3.7, 3.8 and 3.9
5/05/06 35 Amendments to Tables 3.11 and 3.13
Foreword
In recent years there has been a significant growth in the market for software to support building services
design. These tools are now widely used for a range of tasks in the building services industry by
consultants and contractors. In 2002 the Building Services Research and Information Association,
supported by CIBSE and the Association of Consulting Engineers, published Design checks for HVAC*.
This quality control framework for building services design specifically addressed issues of verification
and validation of design data. It particularly noted the need for software packages to be validated prior to
use and warned of the possible professional indemnity implications of using software packages without
validation. In one example a consulting engineer was held to be 90% liable for design failures that occurred
due to errors in the design software used, which had not been checked or validated by the engineers.
This new edition arises from a need for the UK regulators to have a mechanism for the technical
accreditation of detailed thermal models as part of their formal approval for use in the National
Calculation Methodology. To do this it has been necessary to extend the range of the tests described in the
previous edition, in particular to include tests to predict annual heating and cooling demand and
overheating risk. The conversion of demands to energy consumption has also been taken into
consideration with one of the tests requiring the prediction of the performance of an air handling unit.
These tests do not however provide a ‘truth model’ and so to demonstrate that the models can give
credible results a test using the experimentally measured performance of a simple test cell has been added.
Further changes have been necessary to ensure that, where appropriate, calculation methods meet the
relevant British (European) Standards.
The preparation of software tests is not a trivial task and the ground work carried out by the University of
Strathclyde in their development of the original set of tests has been invaluable. Similarly this document
could not have been written without the aid and patience of the software houses who ‘tested the tests’.
During this development period it became clear that to succeed with the simplest of tests required great
care and therefore the CIBSE recommends that the tests be used as part of the user training that is
necessary to satisfy the CIBSE QAprocedures described in chapter 5 of the 2006 edition of CIBSE Guide A.
Such a QAsystem is part of the requirement for using a calculation tool as part of the National Calculation
Methodology.
CIBSE market research on publications and design software suggests that many members and non-
members believe that the software they use accords with CIBSE methods. Some also believe the software
they use is accredited by CIBSE, although this is not the case. Section 4 is therefore devoted to tests
associated with CIBSE calculation methods. These are intended to provide a means by which members
can test for themselves that the software they use is producing results consistent with those produced by
CIBSE methods, and with good practice.
Software users will be able to test their software to assure themselves that it is consistent with published
CIBSE methods and practices. The tests will enable software users to carry out a range of basic checks on
the software they use, and to demonstrate that they have undertaken basic initial validation of the software
to quality assurance and professional indemnity insurance practitioners. This set of simple tests is
intended to develop a culture of software testing and validation in the industry. CIBSE intends to expand
and update the range of tests in the future.
Initial validation alone is not sufficient to demonstrate that use of a particular software package was
appropriate to the needs of a specific project. Accurate software is a prerequisite of, but does not guarantee,
design quality. Design quality is also a function of, amongst other things, the input data and assumptions
used, and of the way in which outputs from the software are used. It is always the responsibility of the
designer to ensure that whatever software design tools are adopted, they reflect and are appropriate to the
contractual obligations accepted in the appointment by the client. Further guidance on design quality is
given in Design checks for HVAC*.
* Lawrence Race G Design checks for HVAC — A quality control framework for building services engineers BSRIA AG1/2002 (Bracknell:
Building Services Research and Information Association) (2002)
TM33 Task Group
Mike Holmes (Arup) (Chairman)
Chris Britton (Hoare Lee and Partners)
Ron De Caux (Roger Preston and Partners)
Gavin Davies (Arup)
Tim Dwyer (South Bank University)
Christopher Morbitzer (HLM Architects)
Caitriona Ni Riain (Max Fordham)
Foroutan Parand
Brian Spires (HLM Architects)
Contributors
Matthew Collin (Arup)
Gavin Davies (Arup)
Acknowledgements
Tony Baxter (Hevacomp Ltd.)
Martin Gough (Integrated Environmental Solutions Ltd. (IES))
Ian Highton (Environmental Design Solutions Ltd. (EDSL))
Alan Jones (Environmental Design Solutions Ltd. (EDSL))
The Institution is grateful to Dr Chris Martin (Energy Monitoring Company) for
permission to use test data for the empirical validation test (section 3).
This publication relies on material provided for the previous edition. The Institution
acknowledges the material provided by previous authors and contributors, including:
Iain Macdonald (Energy Systems Research Unit, University of Strathclyde), Paul
Strachan (Energy Systems Research Unit, University of Strathclyde) and Jon Hand
(Energy Systems Research Unit, University of Strathclyde).
Editor
Ken Butcher
CIBSE Editorial Manager
Ken Butcher
CIBSE Research Manager
Hywel Davies
CIBSE Publishing Manager
Jacqueline Balian
Tests for software accreditation and
verification
0 Introduction
The use of computer modelling for designing comfortable and energy efficient buildings is growing at
an ever increasing pace. From their origins in research and development 25 or more years ago, software
tools are now becoming commonplace. One of the most important questions that arises with the use of
software is: ‘How can users be certain of the quality of the tools and ensure an appropriate level of trust in
the results?’
This document attempts to address this issue by providing an approach that users can apply with their
software tools. The approach consists of a series of standard tests for building services design programs for
the purposes of:
— technical accreditation of detailed thermal models as part of obtaining formal approval for their
use in the National Calculation Methodology(1), which describes the additional steps needed for a
tool to become approved for use in demonstrating compliance with the Building Regulations in
England and Wales*
— verification that such programs produce results consistent with good practice as set out in the
methods in the CIBSE Guides.
The tests have been developed with the intention of finding a balance between comprehensiveness and
ease of application. The primary reason for the tests is to instil confidence in program users, rather than
providing a comprehensive validation of a program. The specific topics for which tests have been
developed were agreed with CIBSE’s Software Accreditation Group, the Office of the Deputy Prime
Minister (ODPM) and representatives of the software developer community. The tests are designed to
confirm that the programs give the correct output for a range of conditions and are not meant to be
exhaustive. It should be noted that the buildings described within the tests do not necessarily conform to
the standards set out in the current Approved Documents, Technical Handbooks or Technical Booklets,
which give guidance on the requirements of the building regulations.
Some of the tests cover program data; the rest cover the calculation of specific performance metrics such as
heating loads and interstitial condensation risk. Some of the tests are specific to CIBSE calculation
methods and consequently are not appropriate for testing detailed thermal models; these tests are presented
in a separate section of this document. A test based upon monitored data is also included in a separate
section. It is intended that this will be a living document and that the number and extent of the tests may
be expanded and updated in the future.
The primary target audience for the tests is program users. They are also relevant to software vendors,
whose products are required to undergo the tests described in sections 1 and 2 as part of the accreditation
process required by the National Calculation Methodology. This methodology is defined by the Building
Regulations for England and Wales, and implements the provisions of Article 3 of the Energy
Performance of Buildings Directive. Section 4 describes tests for CIBSE-specific methods. Compliance
with the tests described in that section is not required as part of the third party software accreditation
process. It is expected that program developers will choose to embed the tests in their programs, and
provide the data files used for the tests to increase confidence in program use. The tests presented in this
document have been subjected to detailed third party review with a range of calculation programs.
The overall strategy has been to adopt the minimum level of complexity necessary. Several of the tests are
based on a simple one-zone building model; in particular, a single zone space with dimensions and
constructions specified in the forthcoming CEN cooling load standard(2). The model is then modified or
extended as necessary for specific tests. To ease the burden for users applying the tests, similar
constructions and climate sequences are used for several of the tests.
* In Northern Ireland Part F (Conservation of fuel and power) of the Building Regulations (Northern Ireland) applies. For Scotland,
at the time of publication, the Scottish Executive is considering proposals for amending the energy standards in the Building
(Scotland) Regulations 2004 and the supporting guidance provided in section 6 of the Technical Handbooks.
For each test, there is a statement of the purpose of the test, the categories of data covered, sources for the
test, a test description, expected results, and an indication of acceptable tolerances. The tolerances
specified for each test are dependent on the particular test. In some cases, the tests simply request the
reporting of fixed values, for which there should be no error. In other cases, a specified level of deviation
from the stated value is acceptable, to account for reasonable variations within the tool in question (a
sketch of such a check follows the list below). These tolerances have been defined following:
— sensitivity studies
— feedback from third parties implementing the tests, and
— discussion with the CIBSE Software Accreditation Assessment Panel.
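The following short sketch (Python) illustrates how a user might automate such a pass/fail check. It is illustrative only and not part of the test specification; the function and argument names are assumptions.

    # Illustrative sketch only (not part of TM33): a generic pass/fail
    # comparison of a program output against a stated test value.
    def within_tolerance(result, expected, absolute_tol=None, relative_tol=None):
        # Fixed-value tests expect exact reporting (absolute_tol = 0);
        # other tests quote an absolute or a fractional tolerance.
        if absolute_tol is not None and abs(result - expected) > absolute_tol:
            return False
        if relative_tol is not None and abs(result - expected) > relative_tol * abs(expected):
            return False
        return True

    # Example: a condensation rate checked against the 0.01 g/(m2.h)
    # tolerance quoted in section 4.7.6.
    assert within_tolerance(0.125, 0.12, absolute_tol=0.01)

Automating checks in this way makes it straightforward to re-run the full set of tests whenever a program is updated, in keeping with the quality assurance procedures referred to above.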
In some cases, the tests require the use of prescribed inputs (e.g. for surface heat transfer coefficients).
These may not always be appropriate in a design context (e.g. some simulation programs may calculate
more appropriate time-varying coefficients). Guidance is given on this topic in the 2006 edition of CIBSE
Guide A(3).
For those wishing to undertake more detailed program validation, there is also a large number of tests and
benchmarks available from CEN and the International Energy Agency (IEA).
The IEA Solar Heating and Cooling (SHC) and Energy Conservation in Buildings and Community
Systems (ECBCS) programmes have been developing and applying building energy analysis program test
methods since 1977. The framework for these tests has three main elements:
— Analytical verification tests: These involve analytical solutions for specific heat transfer processes
under prescribed boundary conditions.
— Inter-program comparative tests: These involve a series of diagnostic test cases applied to a number of
energy analysis programs.
— Empirical validation tests: These involve comparing program predictions with data from highly
instrumented test rooms or buildings. Although such tests offer a definitive ‘truth’ model, in
practice they are time consuming to apply and require very high quality monitored data.
The IEA tests available to date are set out in Table 0.1(4). ASHRAE Standing Standard Project Committee
140, the Netherlands Energy Diagnostic Reference, and Australia’s home energy rating and greenhouse
gas emission rating programs are all based on the IEA test cases.
Table 0.1 IEA validation tests

Program evaluation   Test focus
test type            Building envelope                     Building equipment

Analytical tests     Working document of                   HVAC BESTEST (E100–E200);
                     IEA Task 22 Subtask A1                HVAC BESTEST (IEA fuel-fired furnace)

Comparative tests    IEA BESTEST;                          HVAC BESTEST (E300–E545);
                     expanded ground-coupling test cases   RADTEST (radiant heating)

Empirical tests      ETNA/GENEC tests;                     Iowa ERS: VAV;
                     BRE/DMU tests                         Iowa: daylighting, HVAC;
                                                           Iowa: economiser control
Table 0.2 below sets out selected other existing and forthcoming tests available for validating programs.
Note: the fact that a particular piece of software meets the requirements of this document implies only that
the software meets a minimum standard. CIBSE recommends that users have appropriate quality
management systems in place, as described in section 5.3 of CIBSE Guide A(3) and CIBSE AM11(5). Such a
quality management system is part of the requirement for using a calculation tool as part of the National
Calculation Methodology.
Table 0.2 Other validation test sets

Source                      Topic
ASHRAE RP-1052(6)           A comprehensive test suite of analytical tests
ASHRAE Standard 140(7)      BESTEST inter-program comparison tests
BS EN ISO 13791: 2004(8)    Calculation of internal temperatures of a room in summer
                            without mechanical cooling; includes validation tests
CEN EN xxxx: 2006           Calculation of sensible room cooling load; includes tests
                            and example results
CEN EN xxxx: 2006           Calculation of energy use for space heating and cooling;
                            includes tests and example results
Table 4.26 Test C7: Internal and external environmental conditions

Month       Internal                   External
            DBT     RH     psat        DBT     WBT(sling)  WBT(scrn)  RH     psat
            /°C     /%     /%          /°C     /°C         /°C        /%     /%
January     20.0    57.0   56.4        –1.0    –1.7        –1.8       85.0   85.0
February    20.0    58.0   57.4        0.0     –0.8        –0.9       84.0   83.9
March       20.0    54.0   53.4        4.0     2.7         2.5        78.0   77.9
April       20.0    51.0   50.4        9.0     6.9         6.7        72.0   71.8
May         20.0    51.0   50.4        14.0    11.1        10.8       68.0   67.7
June        20.0    50.0   49.4        18.0    14.8        14.6       69.0   68.5
July        20.0    56.0   55.4        19.0    16.1        15.9       73.0   72.6
August      20.0    52.0   51.4        19.0    16.4        16.2       75.0   74.6
September   20.0    56.0   55.4        15.0    13.1        12.9       79.0   78.7
October     20.0    57.0   56.4        10.0    8.7         8.6        83.0   82.6
November    20.0    57.0   56.4        5.0     4.3         4.2        88.0   87.9
December    20.0    59.0   58.5        1.0     0.4         0.3        88.0   88.1

Note: DBT = dry bulb temperature; RH = relative humidity; psat = percentage saturation; WBT(sling) = wet bulb
temperature (sling); WBT(scrn) = wet bulb temperature (screen)
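Table 4.26 quotes humidity both as relative humidity and as percentage saturation; the two measures are related through the saturation vapour pressure at the dry bulb temperature. The short sketch below (Python) shows the conversion. It is illustrative only and not part of the test specification: the Magnus saturation-pressure correlation and the standard atmospheric pressure used here are assumptions, so the last digit may differ slightly from tabulated CIBSE psychrometric data.

    # Illustrative sketch only (not part of TM33): converts relative
    # humidity to percentage saturation. The Magnus correlation and the
    # standard pressure of 101 325 Pa are assumptions of this sketch.
    import math

    P_ATM = 101325.0  # atmospheric pressure / Pa (assumed standard value)

    def saturation_pressure(t_db):
        # Saturation vapour pressure over water / Pa (Magnus form)
        return 610.94 * math.exp(17.625 * t_db / (t_db + 243.04))

    def percentage_saturation(t_db, rh):
        # Percentage saturation = 100 g / g_sat, where moisture content
        # g = 0.622 p_v / (P - p_v); this reduces to RH (P - p_s)/(P - p_v)
        p_s = saturation_pressure(t_db)
        p_v = (rh / 100.0) * p_s
        return rh * (P_ATM - p_s) / (P_ATM - p_v)

    # Example: internal air in January (20.0 degC, 57.0% RH) gives 56.4%
    # saturation, matching the first row of Table 4.26.
    print(round(percentage_saturation(20.0, 57.0), 1))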
4.7.5 Results
Results for the first test are displayed in Table 4.27 and for the second test in Table 4.28. Note that the
output format follows the recommendation in the ISO standard that the first month reported is the one in
which condensation first appears.
Table 4.27 Test C7: Condensation test results without vapour barrier

Month       Rate            Monthly total   Accumulation
            / g·m⁻²·h⁻¹     / g·m⁻²         / g·m⁻²
November    0.02            14.6            14.6
December    0.12            93.3            107.9
January     0.13            100.0           207.9
February    0.12            81.9            289.8
March       –0.03           –22.5           267.3
April       –0.25           –180.5          86.7
May         –0.50           –375.0          0.0
June        0.00            0.0             0.0
July        0.00            0.0             0.0
August      0.00            0.0             0.0
September   0.00            0.0             0.0
October     0.00            0.0             0.0

Table 4.28 Test C7: Condensation test results with vapour barrier

Month       Rate            Monthly total   Accumulation
            / g·m⁻²·h⁻¹     / g·m⁻²         / g·m⁻²
December    0.06            41.0            41.0
January     0.06            44.1            85.1
February    0.05            33.0            118.1
March       –0.07           –52.8           65.3
April       –0.25           –178.8          0.0
May         0.00            0.0             0.0
June        0.00            0.0             0.0
July        0.00            0.0             0.0
August      0.00            0.0             0.0
September   0.00            0.0             0.0
October     0.00            0.0             0.0
November    0.00            0.0             0.0
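The bookkeeping behind the 'monthly total' and 'accumulation' columns can be expressed compactly, as in the sketch below (Python). It is illustrative only: the monthly condensation and evaporation rates are outputs of the BS EN ISO 13788 interstitial condensation (Glaser) calculation, which is not reproduced here, and the hour counts per month are assumptions of the sketch. Note also that the tabulated monthly totals derive from unrounded rates, so recomputing them from the two-decimal rates shifts the figures slightly.

    # Illustrative sketch only (not part of TM33): reproduces the running
    # accumulation logic of Tables 4.27 and 4.28. The monthly rates are
    # inputs from a BS EN ISO 13788 (Glaser) calculation, which this
    # sketch does not perform.
    HOURS = {"January": 744, "February": 672, "March": 744, "April": 720,
             "May": 744, "June": 720, "July": 744, "August": 744,
             "September": 720, "October": 744, "November": 720,
             "December": 744}

    def accumulate(rates_by_month):
        # rates_by_month: list of (month, rate) pairs in g/(m2.h),
        # beginning with the first month in which condensation appears
        # (the ISO reporting convention noted in section 4.7.5).
        deposit = 0.0
        rows = []
        for month, rate in rates_by_month:
            monthly_total = rate * HOURS[month]
            # The deposit can evaporate away but never becomes negative.
            deposit = max(0.0, deposit + monthly_total)
            rows.append((month, monthly_total, deposit))
        return rows

    # Example (Table 4.28, April): a carried-over deposit of 118.1 g/m2
    # and an evaporation rate of -0.25 g/(m2.h) would take the deposit
    # negative, so it is floored at zero, as shown in the table.
    print(max(0.0, 118.1 + (-0.25) * HOURS["April"]))  # -> 0.0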
4.7.6 Acceptable tolerances
The calculation is well specified. Therefore, any differences in results will be due to rounding errors.
Results should be within 0.01 g·m⁻²·h⁻¹ for the condensation rate and within 5 g·m⁻² for the totals.
References
1 National Calculation Methodology for the energy performance of buildings: The UK implementation of the requirements of the Energy
Performance of Buildings Directive (London: Office of the Deputy Prime Minister) (2006)
2 EN xxxx: 2006: Energy performance of buildings. Calculation of energy use for heating and cooling — General criteria and validation
procedures (Brussels: Comité Européen de Normalisation) (to be published)
3 Environmental design CIBSE Guide A (London: Chartered Institution of Building Services Engineers) (2006)
4 Judkoff R D and Neymark J S Adaptation of the BESTEST Intermodel Comparison Method for Proposed ASHRAE Standard
140P: Method of Test for Building Energy Simulation Programs ASHRAE Trans. 105(2) 721–736 (1999)
5 Building energy and environmental modelling CIBSE Applications Manual AM11 (London: Chartered Institution of Building
Services Engineers) (1998)
6 Development of an analytical verification test suite for whole building energy simulation programs — building fabric ASHRAE
Report RP-1052 (Atlanta, GA: American Society of Heating, Refrigerating and Air-Conditioning Engineers) (2001)
7 Standard method of test for the evaluation of building energy analysis computer programs ASHRAE Standard 140-2001 (Atlanta, GA:
American Society of Heating, Refrigerating and Air-Conditioning Engineers) (2001)
8 BS EN ISO 13791: 2004: Thermal performance of buildings. Calculation of internal temperatures of a room in summer without
mechanical cooling. General criteria and calculation procedures (London: British Standards Institution) (2004)
9 BS EN ISO 13792: 2004: Thermal performance of buildings. Calculation of internal temperatures of a room in summer without
mechanical cooling. Simplified methods (London: British Standards Institution) (2004)
10 prEN 15255: 2005: Thermal performance of buildings. Sensible room cooling load calculation. General criteria and validation
procedures (draft) (Brussels: Comité Européen de Normalisation) (2005)
11 prEN 15265: 2005: Thermal performance of buildings. Calculation of energy use for space heating and cooling. General criteria and
validation procedures (draft) (Brussels: Comité Européen de Normalisation) (2005)
12 BS EN 1745: 2002: Masonry and masonry products. Methods for determining design thermal values (London: British Standards
Institution) (2002)
13 BS EN 12524: 2000: Building materials and products. Hygrothermal properties. Tabulated design values (London: British
Standards Institution) (2000)
14 CIBSE/Met Office weather data sets (London: Chartered Institution of Building Services Engineers) (2002) (Note: these data
sets have been superseded but, for the purposes of testing compliance with TM33, the earlier Test Reference Year and Design
Summer Year for London are available from CIBSE)
15 Environmental design CIBSE Guide A (London: Chartered Institution of Building Services Engineers) (1999)
16 Weather, solar and illuminance data CIBSE Guide J (London: Chartered Institution of Building Services Engineers) (2001)
17 Duffie J A and Beckman W A Solar Engineering of Thermal Processes (New York, NY: Wiley) (1991)
18 BS EN ISO 6946: 1997: Building components and building elements. Thermal resistance and thermal transmittance. Calculation
method (London: British Standards Institution) (1997)
19 BS EN 410: 1998: Glass in building. Determination of luminous and solar characteristics of glazing (London: British Standards
Institution) (1998)
20 BS EN 673: 1998: Glass in building. Determination of thermal transmittance (U-value). Calculation method (London: British
Standards Institution) (1998)
21 Calculation of Energy and Environmental Performance of Buildings. Subtask B: Appropriate use of models International Energy
Agency Annex 21 — IEA Energy Conservation in Buildings and Community Systems and IEA Solar Heating and Cooling
Programme Task 12 (Paris: International Energy Agency) (1994)
22 Holmes M J The simulation of heating and cooling coils for performance analysis Proc. Conf. System Simulation in Buildings,
Liège (Belgium), 6–8 December 1982 (1982)
23 Reference data CIBSE Guide C (London: Chartered Institution of Building Services Engineers) (2001)
24 Calculation of Energy and Environmental Performance of Buildings. Subtask C: Empirical validation of thermal building simulation
programs using test cell data International Energy Agency Annex 21 — IEA Energy Conservation in Buildings and
Community Systems and IEA Solar Heating and Cooling Programme Task 12 (Paris: International Energy Agency) (1994)
25 BS EN ISO 13788: 2002: Hygrothermal performance of building components and building elements. Internal surface temperature to
avoid critical surface humidity and interstitial condensation. Calculation methods (London: British Standards Institution) (2002)