FUNCTIONAL VERIFICATION COVERAGE MEASUREMENT AND ANALYSIS

by Andrew Piziali
Verisity Design, Inc.

KLUWER ACADEMIC PUBLISHERS
New York, Boston, Dordrecht, London, Moscow

eBook ISBN: 1-4020-8026-3
Print ISBN: 1-4020-8025-5

© 2004 Kluwer Academic Publishers, New York, Boston, Dordrecht, London, Moscow
Print © 2004 Kluwer Academic Publishers, Boston

All rights reserved. No part of this eBook may be reproduced or transmitted in any form or by any means, electronic, mechanical, recording, or otherwise, without written consent from the Publisher.

Created in the United States of America

Visit Kluwer Online at http://kluweronline.com and Kluwer's eBookstore at http://ebooks.kluweronline.com

Table of Contents

Foreword
Preface
Introduction
1. The Language of Coverage
2. Functional Verification
   2.1. Design Intent Diagram
   2.2. Functional Verification
   2.3. Testing versus Verification
   2.4. Functional Verification Process
        2.4.1. Functional Verification Plan
        2.4.2. Verification Environment Implementation
        2.4.3. Device Bring-up
        2.4.4. Device Regression
   2.5. Summary
3. Measuring Verification Coverage
   3.1. Coverage Metrics
        3.1.1. Implicit Metrics
        3.1.2. Explicit Metrics
        3.1.3. Specification Metrics
        3.1.4. Implementation Metrics
   3.2. Coverage Spaces
        3.2.1. Implicit Implementation Coverage Space
        3.2.2. Implicit Specification Coverage Space
        3.2.3. Explicit Implementation Coverage Space
        3.2.4. Explicit Specification Coverage Space
   3.3. Summary
4. Functional Coverage
   4.1. Coverage Modeling
   4.2. Coverage Model Example
   4.3. Top-Level Design
        4.3.1. Attribute Identification
        4.3.2. Attribute Relationships
   4.4. Detailed Design
        4.4.1. What to Sample
        4.4.2. Where to Sample
        4.4.3. When to Sample and Correlate Attributes
   4.5. Model Implementation
   4.6. Related Functional Coverage
        4.6.1. Finite State Machine Coverage
        4.6.2. Temporal Coverage
        4.6.3. Static Verification Coverage
   4.7. Summary
5. Code Coverage
   5.1. Instance and Module Coverage
   5.2. Code Coverage Metrics
        5.2.1. Line Coverage
        5.2.2. Statement Coverage
        5.2.3. Branch Coverage
        5.2.4. Condition Coverage
        5.2.5. Event Coverage
        5.2.6. Toggle Coverage
        5.2.7. Finite State Machine Coverage
        5.2.8. Controlled and Observed Coverage
   5.3. Use Model
        5.3.1. Instrument Code
        5.3.2. Record Metrics
        5.3.3. Analyze Measurements
   5.4. Summary
6. Assertion Coverage
   6.1. What Are Assertions?
   6.2. Measuring Assertion Coverage
   6.3. Open Verification Library Coverage
   6.4. Static Assertion Coverage
   6.5. Analyzing Assertion Coverage
        6.5.1. Checker Assertions
        6.5.2. Coverage Assertions
   6.6. Summary
7. Coverage-Driven Verification
   7.1. Objections to Coverage-Driven Verification
   7.2. Stimulus Generation
        7.2.1. Generation Constraints
        7.2.2. Coverage-Directed Generation
   7.3. Response Checking
   7.4. Coverage Measurement
        7.4.1. Functional Coverage
        7.4.2. Code Coverage
        7.4.3. Assertion Coverage
        7.4.4. Maximizing Verification Efficiency
   7.5. Coverage Analysis
        7.5.1. Generation Feedback
        7.5.2. Coverage Model Feedback
        7.5.3. Hole Analysis
   7.6. Summary
8. Improving Coverage Fidelity With Hybrid Models
   8.1. Sample Hybrid Coverage Model
   8.2. Coverage Overlap
   8.3. Static Verification Coverage
   8.4. Summary
Appendix A: e Language BNF
Index

Foreword

As the complexity of today's ASIC and SoC designs continues to increase, the challenge of verifying these designs intensifies at an even greater rate. Advances in this discipline have resulted in many sophisticated tools and approaches that aid engineers in verifying complex designs. However, the age-old question of when the verification job is done remains one of the most difficult to answer, and the process of measuring verification progress is poorly understood.

For example, consider automatic random stimulus generators, model-based test generators, or even the general-purpose constraint solvers used by high-level verification languages (such as e). At issue is knowing which portions of a design are repeatedly exercised by the generated stimulus, which portions of the design are not touched at all, and, more fundamentally, exactly what functionality has been exercised by these techniques. Historically, answering these questions (particularly for automatically generated stimulus) has been problematic. This challenge has led to the development of various coverage metrics to aid in measuring progress, ranging from code coverage (used to identify unexercised lines of code) to contemporary functional coverage (used to identify unexercised functionality). Yet, even with the development of various forms of coverage and new tools that support coverage measurement, the use of these metrics within the verification flow tends to be ad hoc, predominantly because of the lack of well-defined, coverage-driven verification methodologies.

Before introducing a coverage-driven verification methodology, Functional Verification Coverage Measurement and Analysis establishes a sound foundation for its readers by reviewing an excellent and comprehensive list of terms common to the language of coverage. Building on this knowledge, the author details the various forms of measuring progress that have historically been applicable to a traditional verification flow, as well as new forms applicable to a contemporary verification flow.