
ARC-VM: An Architecture Real Options Complexity-Based Valuation Methodology for Military Systems-of-Systems Acquisitions

289 pages · 2011 · English


ARC-VM: AN ARCHITECTURE REAL OPTIONS COMPLEXITY-BASED VALUATION METHODOLOGY FOR MILITARY SYSTEMS-OF-SYSTEMS ACQUISITIONS

A Thesis
Presented to
The Academic Faculty

by

Jean Charles Domerçant

In Partial Fulfillment
of the Requirements for the Degree
Doctor of Philosophy in the
School of Aerospace Engineering

Georgia Institute of Technology
December 2011

Copyright © 2011 by Jean Charles Domerçant

Approved by:

Dimitri N. Mavris, Professor
Advisor and Committee Chair
School of Aerospace Engineering
Georgia Institute of Technology

Brian J. German, Assistant Professor
School of Aerospace Engineering
Georgia Institute of Technology

Vitali Volovoi, Assistant Professor
School of Aerospace Engineering
Georgia Institute of Technology

Mrs. Kelly Cooper
Office of Naval Research

Santiago Balestrini-Robinson, Ph.D.
School of Aerospace Engineering
Georgia Institute of Technology

Date Approved: 14 November 2011

To my family, my friends, and to Doc.

ACKNOWLEDGEMENTS

The development of this thesis has been both rewarding and challenging. Without the aid of my family and friends, this effort would not have been as fulfilling. First and foremost, I would like to thank my mother, my brother, and my aunt for their steadfast dedication and support. Their belief in me was sometimes the only thing that sustained me through the rough patches. I would also like to thank all of my friends near and far for their continued encouragement and enthusiasm. This includes all of my friends and colleagues at ASDL, both past and present, with special mention to the ARCHITECT team. Special thanks to Joseph Iacobucci, Kelly Griendling, Burak Bagdatli, Annie Jones, and Daniel Cooksey for their valuable feedback, much needed criticism, and positive support. It has been both an honor and a pleasure being a part of the ARCHITECT team.

I would also like to express gratitude to my committee, Professors Vitali Volovoi and Brian German, Dr. Santiago Balestrini-Robinson, and Mrs. Kelly Cooper. Mrs. Cooper's sponsorship and perspective, in particular, proved invaluable. I also owe much gratitude to Dr. Balestrini-Robinson for his wise counsel over the many months it took for this thesis to take shape, and for providing an important sounding board for my ideas. Last, but by no means least, I would like to thank my committee chairman and advisor, Dr. Dimitri Mavris. Over the past five years I have come to admire not only his vast engineering knowledge, but also his passion and dedication to developing scholars. Though he always kept it challenging, he also always managed to remind us that this process can be as fun as it is important. I will always be grateful for the chances you have given me, and thank you again for helping me to develop both professionally and personally.

TABLE OF CONTENTS

DEDICATION
ACKNOWLEDGEMENTS
LIST OF TABLES
LIST OF FIGURES
LIST OF ABBREVIATIONS
SUMMARY

I    INTRODUCTION

II   BACKGROUND
     2.1 Defense Acquisition Decision Support Systems
         2.1.1 Defense Acquisition Overview
         2.1.2 JCIDS: Joint Capabilities Integration Development System
         2.1.3 PPBE: Planning, Programming, Budgeting & Execution
         2.1.4 Acquisition Life Cycle
     2.2 Department of Defense Architecture Framework
         2.2.1 DoDAF Development Timeline
         2.2.2 DoDAF V2.0
     2.3 Ongoing Challenges to Successful Acquisitions
         2.3.1 Cost, Schedule & Performance Tradeoffs
         2.3.2 Past Acquisition Reform Efforts
         2.3.3 Weapons Systems Acquisition Reform Act of 2009
     2.4 Analysis of Alternatives
         2.4.1 AoA & Evolutionary Acquisition Strategy
         2.4.2 Effectiveness & Cost Analyses
         2.4.3 Cost-Effectiveness Comparisons

III  RESEARCH FOCUS
     3.1 The Impact of Pre-Milestone A Decisions
     3.2 Seeds of Failure Planted During Pre-Milestone A
     3.3 Research Objective

IV   RESEARCH QUESTIONS
     4.1 Research Question #1: Measuring SoS Architecture Complexity
     4.2 Research Question #2: Developing a Valuation Framework

V    MEASURING ARCHITECTURE COMPLEXITY
     5.1 Defining a Complex System
         5.1.1 Complex System Classification
         5.1.2 Overview of Elementary Cellular Automata
         5.1.3 Complex System Definition
     5.2 Complexity in Relation to Parsimony & Perception
     5.3 Approaches to Measuring Complexity
     5.4 Measurement Criteria
     5.5 Existing System Complexity Measures
         5.5.1 Abstraction Based Complexity Management
         5.5.2 Object-Process Model Based Complexity Measures
         5.5.3 Suh's Axiomatic Design
     5.6 Architecture Complexity Sub-Measures
         5.6.1 Functional Distribution Complexity
         5.6.2 Functional Process Complexity
         5.6.3 Resource State Complexity
         5.6.4 Resource Processing Complexity
     5.7 Defining the Measurement Framework
         5.7.1 Measurement Theory
         5.7.2 Multiattribute Utility Functions
         5.7.3 Sub-measure Independence
         5.7.4 Architecture Complexity Measurement Framework

VI   DETERMINING ACQUISITION VALUE
     6.1 Overview of Financial Valuation Methods
     6.2 Real Options for Strategic Decision Making
     6.3 The Tomato Garden: Luehrman's Real Option Space
     6.4 Developing the Acquisition Option Space
         6.4.1 Mapping Option Variables to Acquisition Projects
         6.4.2 Effectiveness & Time-Valued Capability
         6.4.3 Cumulative Variance
         6.4.4 Risk & Probability of Program Success
         6.4.5 Scaling & Discounting Architectural Complexity
         6.4.6 The Acquisition Option Space
     6.5 High Level ARC-VM Summary & Overview

VII  VALUATION OF SEAD ARCHITECTURES USING ARC-VM
     7.1 Step 1: Define Capability Requirements
     7.2 Step 2: Define Alternative System Portfolios
     7.3 Step 3: Generate Feasible Architecture Alternatives
         7.3.1 SoS Collaboration Categories
         7.3.2 Design Space of Alternative SoS Architectures
     7.4 Step 4: Develop M&S Environments and Perform Cost Analyses
         7.4.1 SEAD Mission Scenario Development
         7.4.2 Phase I M&S: Modeling the Effects of Collaboration
         7.4.3 Phase II M&S: Simplified Engagement Model Development
         7.4.4 M&S Results
         7.4.5 Life Cycle Cost Analysis
     7.5 Step 5: Calculate Architecture Complexity & Specify ROA Inputs
     7.6 Step 6: Conduct Analysis of Alternatives

VIII SUMMARY & CONCLUSIONS
     8.1 Recommendations for Future Work

APPENDIX A: SEAD M&S OUTPUT DATA
REFERENCES
VITA

LIST OF TABLES

1  COSYSMO Cost Drivers.
2  Resource State Specifier Examples.
3  Top Secret Homogeneous Resource State Space Needlines.
4  Homogeneous Resource State Space Needlines.
5  Summary of Network Measures for a Simple Network.
6  Computer Network Example Resource State Complexities.
7  Military SoS Interoperability Hierarchy Levels.
8  Classification of Scales of Measurement.
9  SEAD Activities.
10 SEAD Systems & Number of Supported Functions.
11 Available System to Function Mapping.
12 Characterization of SEAD Needlines Using Resource State Specifiers.
13 Interoperability Levels for Blue SEAD Force.
14 Alternative SEAD System Portfolios.
15 Aircraft & Satellite Independent Sensor Coverage Factors.
16 IOL to Reliability Constant Mappings.
17 M&S Inputs for Red IADS.
18 M&S Inputs for Blue SEAD Force (Portfolios 1 & 2).
19 M&S Inputs for Blue SEAD Force (Portfolio 3).
20 Red Force Structure DOE.
21 SEM Run Execution Summary.
22 Example Force Structure Legend Interpretation.
23 AOR/JOA Engagement Performance Summary for Blue SEAD SoS Alternatives.
24 Estimated Life Cycle Costs.
25 SEAD FDC & FPC Scores.
26 ARC-VM Data & Calculations for SEAD AOR/JOA Alternatives.
27 Output Data for Portfolio 1 Alternatives.
28 Output Data for Portfolio 2 Alternatives.
29 Output Data for Portfolio 3 Alternatives.
30 Output Data for Portfolio 3 Alternatives (continued).
31 Output Data for Portfolio 3 Alternatives (continued).
32 Output Data for Portfolio 3 Alternatives (continued).
33 Output Data for Portfolio 3 Alternatives (continued).
34 Output Data for Portfolio 3 Alternatives (continued).
35 Alternative 1 Resource Processing Matrix.
36 Alternative 2 Resource Processing Matrix.
37 Alternative 3 Resource Processing Matrix.
38 Alternative 4 Resource Processing Matrix.
39 Alternative 5 Resource Processing Matrix.
40 Alternative 6 Resource Processing Matrix.

