DTIC ADA510557: Cost Effective VV&A: Get Credit for What You're Already Doing
COST EFFECTIVE VV&A: GET CREDIT FOR WHAT YOU'RE ALREADY DOING

Michelle L. Kilikauskas
Joint Accreditation Support Activity
Naval Air Warfare Center Weapons Division
1 Administration Circle
China Lake, CA 93555
[email protected]

David H. Hall
SURVICE Engineering Company
900E N. Heritage Drive, Suite 1
Ridgecrest, CA 93555
[email protected]

KEYWORDS

Accreditation, Documentation, Verification, Validation

ABSTRACT

The vast majority of software engineers are conscientious professionals who work hard to convince themselves that their software does the right thing and works correctly. But they almost never write down what they do so that they can convince someone else. The work they do is, in essence, verification and validation. Part of the problem is that developers often don't think of what they do as "V&V", they may or may not be focused on the M&S application requirements, and if it's not documented then they can't get credit for it anyway.

This paper describes a cost-effective VV&A approach centered on the capability, accuracy and usability of M&S. This approach focuses V&V activities on accreditation requirements by formalizing an intended use statement by the ultimate M&S user. The approach also facilitates making the best use of activities that are already ongoing during the development of the software. We identify some useful tips for low-cost informal documentation of the V&V information required to support an accreditation.

INTRODUCTION

Many program managers in the defense acquisition community think that verification, validation and accreditation (VV&A) of models and simulations (M&S) is too hard, costs too much and takes too long. Most of the time they are thinking of extensive Independent Verification and Validation (IV&V) efforts; the IV&V practitioner rightly wants to make sure that the M&S he or she supports have a well documented, disciplined process of V&V applied to them.

However, in our experience most M&S developers are already doing a lot of the right things in V&V, but: (1) those same developers don't see much of what they're doing as "V&V", so they under-report what they do. (2) Most developers don't write down what they did in any retrievable fashion, or they leave out key pieces of information. This is exacerbated by the extensive use of personal computers, which has nearly eliminated the use of engineering notebooks, and creative people often detest documenting anyway. (3) Pressure to produce a product on time and within budget often means that if program managers fund any V&V activities, funding for documenting them falls off the table.
We have helped a number of acquisition programs to develop a cost-effective VV&A approach, using the results of work that they were already doing. All of these efforts did require some additional effort, but we were in every case able to leverage previous work extensively. Many of these customers were doing considerable V&V, but it was not focused on their application and it wasn't being documented. We have helped them to focus their efforts on the requirements of their application, and to document their ongoing V&V efforts in ways that minimized any additional effort while capturing their critical V&V results.

ACCREDITATION PROCESS

Figure 1 illustrates the process that the Joint Accreditation Support Activity (JASA) uses to support a program's accreditation of models and simulations (and sometimes, physical simulators).

[Figure 1. Accreditation Process. The figure shows the process steps: Focus on M&S Use; Analyze Application; Develop M&S Requirements, Acceptance Criteria and Info Requirements; Research Model Characteristics; Evaluate M&S against Requirements; Present Accreditation Case; Make Accreditation Decision; Focus on Accreditation.]

The nature, scope and depth of information necessary to accredit a simulation depends partly on what the simulation is being used for and partly on how much risk is associated with its use for that purpose. Accreditation is a statement of confidence in the credibility of the simulation for a specific application (it is not a statement of general credibility for any application whatsoever).

M&S requirements and acceptance criteria are determined by a combination of application analysis and risk analysis. Application analysis, or use analysis, establishes the specifics of how the simulation will be used, and which simulation functions and outputs are important. Risk analysis establishes the level of credibility required, and which aspects of simulation credibility are essential to accreditation. From those two activities we can determine V&V requirements. By comparing ongoing V&V activities with the requirements for V&V, we can develop a tailored VV&A plan specific to the application. When the planned VV&A activities are completed, an accreditation recommendation can be made to the Accreditation Authority, who makes the final accreditation decision. A key element of the process is to determine any limitations on the use of the simulation based on the documented results that are available, and any proposed workarounds in areas where the simulation may not be acceptable. Ultimately, the accreditation provides an assessment of the risks of using the simulation for the application.
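To make the comparison step concrete, here is a minimal sketch (in Python; it is not part of the original paper, and the requirement names and record entries are hypothetical) of how a program might tabulate application-driven V&V requirements against the evidence already on hand, so that the tailored VV&A plan covers only the genuine gaps.

# Hypothetical illustration: compare application-driven V&V requirements
# against the V&V evidence a program already has in its records.

# Requirements derived from application analysis and risk analysis
# (names and priorities are invented for illustration).
vv_requirements = {
    "seeker detection range vs. field data": "high",
    "IMU drift model verified against vendor spec": "medium",
    "code review of guidance module": "medium",
}

# Evidence already in the program's informal records.
existing_records = {
    "seeker detection range vs. field data": "plot in 2001-06 design review package, Sim v3.2",
    "code review of guidance module": "initialed viewgraphs, team meeting 2001-09",
}

# Anything required but not yet evidenced is new work; everything else
# is credit for work already done.
for requirement, priority in vv_requirements.items():
    evidence = existing_records.get(requirement)
    if evidence:
        print(f"CREDIT [{priority}] {requirement}: {evidence}")
    else:
        print(f"GAP    [{priority}] {requirement}: plan V&V activity and documentation")

The particular data structure is unimportant; the point is that the plan is driven by the difference between what the application demands and what the existing record already shows.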
The key to making this process cost-effective is to focus the required activities where they make the most sense. In the first phases of the process, the focus needs to be on the intended use of the simulation; in fact, the entire purpose of the application analysis and Intended Use Statement is to develop M&S requirements and acceptance criteria for the specific use; these will be the criteria against which the M&S will be judged. In the last phases, the focus is on accreditation: that is, does the accreditation authority have enough information to make a decision that the M&S meets those acceptance criteria?

And that is where the rubber meets the road. Too often programs have done considerable work to ensure that their simulation has the capability, accuracy and usability they require, but they have not made sure that their work has been documented. So they did the work, but they don't get the credit, thus running the risk of not being accredited by the ultimate user.

What JASA has done for a number of customers is to review their VV&A plans, results and especially their documentation of VV&A activities. We have consistently seen that programs are conscientiously tracking the credibility of their M&S tools. However, they have almost uniformly not captured that work in documents, or even informally in lab notes, viewgraph presentations, etc.

We have found that it often is not necessary to expend resources formally documenting VV&A results. Informal documentation is often sufficient for the purpose of providing evidence of M&S credibility, as long as it is readily retrievable and traceable back to the M&S version used and the personnel involved. The following are some tips on how to make those informal records meaningful to an accreditation assessment.

INCREASING THE VALUE OF INFORMAL RECORDS

Based on our experience with a number of programs, we have come up with some simple ways to increase the value of any existing information you may have for supporting an accreditation decision, as well as some thoughts on how to better capitalize on planned activities in the future. These ideas have been used successfully by a number of DOD acquisition programs over the last several years.

Ideas for Increasing the Value of the Existing Record

1. As a guiding principle, focus on content rather than aesthetics. With a limited amount of time, it is much more important to write the information down clearly in a place where it can be retrieved easily than it is to get fancy. Write information by hand onto meeting agendas or onto a hardcopy of the viewgraphs from meetings.

2. Gather technical resume sheets for each person on the development team and keep them with the team records. Include information on the technical qualifications of each person (training, experience) as well as their major areas of emphasis or responsibility in developing the simulation. We suggest you also include information on people who are referenced in your files or working notes. The qualifications of the team members help to build confidence in the product.

3. Look through the briefings you have on hand, find incomplete references and fill in the missing information. Write the information on a hardcopy of the viewgraph and keep it in the program files. For example, if there is a reference to an author (Skolnik, for example), write down the title of the book or paper or internal memo and the date. Noting the pages from that reference that contain the relevant formulations would be a good practice. Particularly for internal memos, some hint on where to get a copy of the reference would be helpful. If you have the document, make a copy of the cover or title page and the relevant sections and put it in the file with the viewgraphs.
4. For data plots which contain more than one trace, include a key which indicates which line is which. If your team uses a consistent convention (old version is a solid line and new version is a dotted line, for example), a note at the beginning of the record that describes the plotting conventions would be useful to outside observers.

5. Also, for plots in your existing record, go back and make some comments directly on the plots about what the plot is showing, your analysis of what the plot means, and any inferences you can make about the implications for credibility or usefulness of the simulation. If the plots show good correlation between simulation predictions and field data or expert opinion, or the plot indicates a problem that you followed up and corrected, this is important evidence of the correctness of the simulation that you should get credit for. But you don't get credit for good correlation if people don't know what they're looking at. (A sketch of how keys and provenance notes can be embedded in electronically generated plots follows this list.)

6. Look over the records and jot on the viewgraphs which version of the simulation the information is related to and the approximate date the memo or viewgraph package or whatever was generated.

7. Look over the records and jot on the viewgraphs which version of the system hardware components the module is related to (for example, is a model of the Inertial Measurement Unit (IMU) generic, or is it tied to a particular version of the real IMU?). You should describe the hardware component in enough detail to uniquely identify it: serial number, delivery date, label, etc.

8. Also, jot down enough about test cases (initial conditions, test file name, etc.) that the plot is meaningful to someone else.
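The tips on plot keys and plot annotations above were written with hardcopy viewgraphs in mind; for teams that generate comparison plots electronically, the same information can be carried in the figure itself. The following sketch (Python with matplotlib; it is not part of the original paper, and the version numbers, test case name, data values and analyst name are invented) shows one way to keep the key, the simulation versions and the test case identity with the plot.

# Hypothetical example of embedding the "key" and provenance notes of
# items 4-6 directly in an electronically generated comparison plot.
import matplotlib.pyplot as plt

time = [0, 1, 2, 3, 4]
baseline = [0.0, 0.9, 1.7, 2.2, 2.4]   # old simulation version (illustrative data)
updated  = [0.0, 1.0, 1.9, 2.5, 2.8]   # new simulation version (illustrative data)

fig, ax = plt.subplots()
ax.plot(time, baseline, "-",  label="Sim v3.1 (baseline)")   # solid = old version
ax.plot(time, updated,  "--", label="Sim v3.2 (candidate)")  # dashed = new version
ax.set_xlabel("Time (s)")
ax.set_ylabel("Output parameter of interest")
ax.legend(title="Plot key")

# Provenance note that travels with the plot: test case, input file, analyst, date.
ax.set_title("Test case TC-07, input deck run_07.in - J. Analyst, 2002-03-14")

fig.savefig("tc07_v31_vs_v32.png")

The plotting library is incidental; the point is that the key, the versions and the test-case identity are stored with the plot rather than in someone's memory.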
Almost Painless Ways to Increase the Value of the Records You Generate from Here On Out

1. Put the title, date, point of contact, and contact info (telephone, email address) on the cover of any viewgraph presentations or reports.

2. Document the version of the code to which the presentation applied and the version of any hardware components whose behavior or characteristics are reflected in the code.

3. Take a moment during informal review sessions to jot down the essence of the discussion about the presentation. Were there concerns about any of the charts shown? Questions? Is there anything in the plots that doesn't look right that needs follow-up? Was there consensus that the design looked good, or the plots matched what you expected, or there was good correlation between the simulation predictions and the reference data (test data, intelligence estimates, whatever)? If it looks good, make a comment to that effect and initial and date it. We suggest that the others in the review also initial the hardcopy of the viewgraph. If something looks strange, mark it on the hardcopy and jot down who is going to follow up. Keep a key in your records that has a list of team members' clearly legible names and a sample of each person's initials. For more information on conducting effective reviews, see also (Kilikauskas, 2002)*.

* Kilikauskas, M.L. 2002. "Conducting An Effective Expert Review", Briefing at Mini-MORS Workshop 2: Test & Evaluation, Modeling & Simulation and VV&A, Kirtland AFB, NM (October).

4. Require an assumptions and caveats slide in presentations on design or results. It just takes a moment to put it down on a slide, but it may be impossible to recreate the information later.

5. Keep a logbook of team meetings. Jot down notes of who was there, topics of discussion, activities (peer review of design fixes for the IMU, etc.), major findings, and action items. This helps establish a pattern of informal review over time.

6. Software engineers generally show day-to-day results to each other informally for a sanity check outside of a review meeting. This is a form of peer review. When one person looks over someone else's design or code or output, have them jot their comments on the paper and date and initial it. This may be particularly helpful for work done by engineers with less experience. Outside observers generally feel better about work done by less experienced team members if they see evidence that colleagues with more experience have been giving them a hand.

7. In briefings about module design or results, include a slide which summarizes who has looked at the design or output plots so far, their comments/conclusions, and why their opinion means anything. This helps to build evidence of checks during the development process and preserves any comments or conclusions.

8. Require that software engineers record the source of the algorithms and data they use in developing individual software modules. If the engineer modified the data after it was received from the original source, record how the data were changed and why. This should be standard information in a viewgraph presentation on the design or modification of a software module. Putting this information in the code as comments ensures that the data source and modification and transformation information stays with the code. (A sketch of such a provenance comment follows this list.)

9. In any briefings on changes to a baseline simulation, a slide or two on the implications for simulation use would be a very nice addition to the description of the changes. What do you expect to be the effect of the change on the usefulness or credibility of the simulation? If one were to compare results of the previous version with the new version for the same input conditions, what differences would you expect to see in the major output parameters of interest?

10. In briefings about software design or results, require a slide or two which describes efforts made by the engineer to determine that the conceptual design of the module meets the requirements, that the detailed design is consistent with the conceptual design, that the code actually implements the design, and that there are no coding errors.

11. Keep a file of things presented at team meetings. Include the agenda, attendance list (maybe put all team members' names on the agenda as a standard practice and, on the copy of the agenda you throw in the file, pencil out the names of those who don't attend that particular meeting), and viewgraph packages with notes about any reactions, discussion items, and decisions related to the viewgraphs.

12. Keep a library of the source materials that are referenced in documents about the simulation (design documents and briefings, analyst's or user's manuals, etc.). At the very least keep a list of where to get them so that you can retrieve a copy if you need to.

13. It helps the team to be more consistent in following these practices if you do a couple of things:
a) Develop a template for briefings that includes placeholders for the types of slides mentioned (title with name, date and contract information; assumptions slide; implications for model use; source of algorithms and embedded data; etc.).
b) Choose a meticulous person and put him or her in charge of taking notes during the meetings and keeping a file. Be kind to that person and reward him or her for doing this job for you.
c) The team lead needs to enforce following these practices. Reward people for doing so, and punish them if they don't (make them do it over till they include these things).
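Item 8 above recommends recording the source of algorithms and data as comments in the code itself. The following is a sketch of what such a provenance comment might look like (Python; the module, the cited section, the memo number and the dates are invented for illustration, with Skolnik borrowed only as the kind of reference the earlier tips mention).

# Hypothetical sketch of the provenance comment item 8 suggests keeping
# in the code; the computation below is a placeholder, not a real model.

def atmospheric_attenuation(range_km, frequency_ghz):
    """Two-way atmospheric attenuation used by the radar detection module.

    Algorithm source: Skolnik, Introduction to Radar Systems (section and
        page noted in the program file library, where copied pages are kept).
    Data source: vendor memo XYZ-99-014, "Seeker RF losses", 1999-04-02.
        Loss table was converted from dB per nautical mile to dB per km and
        extrapolated above 35 GHz; see change note in the design review
        package of 2001-08-21 (Sim v3.2).
    """
    loss_db_per_km = 0.01 * frequency_ghz  # placeholder value for illustration
    return 2.0 * loss_db_per_km * range_km

Because the note lives in the module, the data source and any modification or transformation history travel with the code through every later version.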
SUMMARY

JASA has supported a number of acquisition programs with cost-effective accreditation planning, application analysis to develop intended use statements and M&S acceptance criteria, V&V, and documentation. We have demonstrated that you almost never have to start from scratch, but you do have to document the results of the V&V efforts that you're already doing so that an accreditation decision can be well substantiated. Informal documentation can be just as effective as formal reporting standards, as long as the documents are retrievable, accessible, and meet the needs of the accreditation authority. When it comes time to accredit M&S, why not get credit for what you're already doing, especially if you can do so for almost no cost and very little effort?

BIOGRAPHY

Michelle L. Kilikauskas is the Director of the Joint Accreditation Support Activity (JASA) at the Naval Air Warfare Center, Weapons Division (NAWCWD) at China Lake, California. She has served as the accreditation support agent for several international missile and military aircraft acquisition programs and was a participant in the International Test and Evaluation Steering Committee's Working Group of Experts on Verification and Validation. Ms. Kilikauskas is a frequent writer, speaker and teacher on the practical implementation of VV&A policy, principles and methodology.

David H. Hall is the Manager of SURVICE Engineering Company's Ridgecrest Area Operation, under contract to the NAWCWD Survivability Division for analysis support services. Prior to his retirement from the Government in January 2002, he was the Chief Analyst of the NAWCWD Survivability Division, head of the Survivability Analysis Branches, and interim JASA Director. From 1992 through 1996 he was also the Joint Project Manager of the Susceptibility Model Assessment and Range Test (SMART) project, which developed and demonstrated Joint M&S VV&A and configuration management processes for DOD. At SURVICE he continues to participate in JASA VV&A support services to DOD systems acquisition programs. He is a frequent speaker, author and instructor on survivability and M&S VV&A.
