DTIC ADA512448: The Resource. Spring 2003
Report Documentation Page (Standard Form 298, Rev. 8-98; Form Approved OMB No. 0704-0188; Prescribed by ANSI Std Z39-18)

Public reporting burden for the collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to a penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number.

1. Report Date: 2003
3. Dates Covered: 00-00-2003 to 00-00-2003
4. Title and Subtitle: The Resource. Spring 2003
7. Performing Organization Name and Address: U.S. Army Engineer Research and Development Center, ATTN: ERDC MSRC HPC Resource Center, 3909 Halls Ferry Road, Vicksburg, MS 39180-6199
12. Distribution/Availability Statement: Approved for public release; distribution unlimited
16. Security Classification: a. Report - unclassified; b. Abstract - unclassified; c. This Page - unclassified
17. Limitation of Abstract: Same as Report (SAR)
18. Number of Pages: 32

From the Director's chair

While this is only my second newsletter as Director of the Engineer Research and Development Center Major Shared Resource Center (ERDC MSRC), it is my 10th year in the Center, which means I've been here as long as it has. In all that time there has been only one fundamental shift in the way we get the business of the MSRC done - the shift in the mid-90s from a primarily Government-owned, Government-operated organization to an organization staffed largely by an integration contractor led by the Government. This was a big change aimed at bringing flexibility into the way the Centers were operated. The result was the four Centers we have now, each a major force in the provision of computational expertise and resources enabling the Department of Defense (DoD) warfighter mission.

That approach served us - and you, our users - very well. However, the passage of time, change in requirements, and continual evolution of the Government-contractor landscape brought us to the point that we needed to consider another fundamental shift in the way we get the business of the MSRC done. On April 1, 2003, that shift was completed. The ERDC MSRC completely restructured its service contracts, moving from the large umbrella of a single integration contract to a constellation of contracts that provides clusters of expertise and functionality for the Center. The three major areas of these contracts are service delivery (system administration, technology planning, etc.), service support (help desk, computational science, scientific visualization, etc.), and maintenance.
Creating a service contract structure like this affords us a variety of critical abilities as we continue to strive to improve our service model. Most importantly, it will allow the ERDC MSRC to respond more flexibly to the changing demands of our user community, resulting ultimately in a higher level of service for our user community. For us, this is a significant change that will be accompanied by the usual growing pains as we struggle to reshape how we get things done. For you, however, the mechanics of this change will be virtually invisible. It is our goal that the only effects you see are more services provided in the way you want them, when you want them.

As we move through this period of transition within the Center, I encourage you to continue to give us feedback about how we can help you more effectively. The Users Advocacy Group is a very effective way to get your input to us; we welcome and encourage you to respond individually to us as well. The only way to truly match what we provide to what you need is open, honest, continuous communication.

John West
Director, ERDC MSRC

About the Cover: Seismic signature simulations for unattended ground sensor systems and numerical simulation of ground vehicle tracking systems using seismic signatures (see article, page 6). Cover design by the ERDC MSRC Scientific Visualization Center.

Features
An HPC-Enabled Virtual Proving Ground for Seismic Unattended Ground Sensor Networks ............... 6
SC2002 "From Terabytes to Insights" ................................................................ 9
PET Highlights ..................................................................................... 12
Building on Previous Strategies to Create a Synthetic Application of Benchmarking ................. 14
Scientific Visualization Center Technology Update .................................................. 17
Army Science Conference – "Transformational Science & Technology for the Army...a race for speed and precision" ............... 18
Departments
announcements ...................................................................................... 2
upcoming events .................................................................................... 4
off-campus ......................................................................................... 5
technology update ................................................................................. 20
   Technology Enhancements in the ERDC MSRC Computational Environment ............................ 20
   ERDC MSRC Prepares to Assist Users When Cray X1 Arrives ....................................... 21
community outreach ................................................................................ 22
   Job Shadowing in Computer-Related Fields ...................................................... 22
   ERDC MSRC Staffers Share Career Knowledge with Students ....................................... 23
visitors .......................................................................................... 25
acronyms .......................................................................................... 28
training schedule ................................................................................. 28

announcements

"Army Newswatch" to Feature DoD Supercomputer Center at ERDC
By Rose J. Dykes

"Army Newswatch," SRTV's premier television news program, showcases the Army as it conducts its many roles and missions in support of the Nation. It takes a comprehensive look at what is happening throughout the Army and focuses on the issues, the equipment, and the people that make the Army what it is today. This award-winning television newscast is a biweekly production.

Soldiers Radio and Television (SRTV) reporter Hank Heusinkveld visited the MSRC on October 24, 2002, and interviewed John E. West, Director, while walking through the Joint Computing Facility. The news feature on the MSRC will eventually air at the Pentagon, at worldwide Army installations, and on several hundred cable systems in the United States. Other ERDC news to be featured along with that of the Supercomputer Center includes ERDC as the 2002 Army Research and Development Organization of the Year, force protection and antiterrorism research and development, the TeleEngineering Operations Center, and the new Ship-Tow Simulator.

Photo: Soldiers Radio and Television reporter Hank Heusinkveld films John E. West in the ERDC Supercomputer Center.
Photo: (Left to right) Wayne Stroupe, ERDC Public Affairs Office, Hank Heusinkveld, SRTV, and John E. West, ERDC MSRC Director, visit before filming the MSRC news feature.

The ERDC MSRC welcomes comments and suggestions regarding The Resource and invites article submissions. Please send submissions to the following e-mail address: [email protected]

announcements

Users Advocacy Group – A Forum for DoD HPCMP Resource Users
By Rose J. Dykes
Dr. Stacy Howington, ERDC Coastal and Hydraulics Laboratory, is the newest member of the high-performance computing (HPC) Users Advocacy Group (UAG). This group was formerly known as the Shared Resource Center Advisory Panel or SRCAP. The UAG mission has recently been changed as follows:

- Provides a forum for users of the DoD High Performance Computing Modernization Program's (HPCMP) resources to influence policies and practices of the Program.
- Facilitates the exchange of information between the user community and the HPCMP.
- Serves as an advocacy group for all HPCMP users.
- Advises the HPC Program Office on policy and operational matters related to the HPCMP.

The Army, Navy, and Air Force each appoint four people to represent them as their service members; one additional member is selected to represent other DoD agencies. The services are encouraged to appoint members who are active users in the HPCMP to best represent the user community. The members serve 2-year renewable terms. Meetings are scheduled at least twice a year. Representatives from the Shared Resource Centers, although not members, have a standing invitation to attend the meetings.

A user should contact one of the four representatives in his same service. If a user does not work for the Army, Navy, or Air Force, he should contact the representative from other DoD agencies. Users can always contact the Program Office to get the names of their service representatives. Appropriate issues for users to take to UAG members are ones that need to be brought before the entire Program for the good of the whole user community – not things such as a machine not working as it should at a particular site. The full list of HPC UAG members is found below.

Sidebar: Dr. Stacy Howington received his B.S. and M.S. degrees from Mississippi State University in 1983 and 1988, respectively, in civil engineering. In 1997, he received his Ph.D. in civil engineering from the University of Colorado at Boulder. Dr. Howington works in the ERDC Coastal and Hydraulics Laboratory, where he does modeling of fluid flow and constituent transport in groundwater and surface water systems. He has been associated with the Environmental Quality Modeling and Simulation and Climate/Weather/Ocean Modeling and Simulation Computational Technology Areas for several years.

Air Force
- Jerry Boatz (S&T), [email protected], (661) 275-5364
- Bonnie Heikkinen (T&E), [email protected], (931) 454-7885
- John Martel (T&E), [email protected], (850) 882-7898, Extension 3368
- Stephen Scherr (S&T), [email protected], (937) 255-6686

Navy
- Joe Gorski (S&T), [email protected], (301) 227-1930
- Ed Neal (T&E), [email protected], (301) 757-1781
- Jeanie Osburn (S&T), [email protected], (202) 767-3885
- Alan Wallcraft (S&T), [email protected], (228) 688-4813

Army
- Stacy Howington (S&T), [email protected], (601) 634-2939
- Michael J. Reil (T&E), [email protected], (410) 278-9474
- Stephen Schraml (S&T), [email protected], (410) 278-6556
- Jackie Steele (T&E), [email protected], (256) 955-3917

Other
- Steve Finn (DTRA), [email protected], (310) 470-2335

NOTE: S&T (Science and Technology); T&E (Test and Evaluation); DTRA (Defense Threat Reduction Agency)

announcements

Two CSC Employees at ERDC MSRC Honored for HPC Work
By Ginny Miller

Two Computer Sciences Corporation (CSC) employees were recognized by CSC for their contributions to the company's high-performance computing (HPC) efforts at the ERDC MSRC. Robert Scudamore, Vice President of CSC's HPC Center of Excellence, presented the awards November 18, 2002, during the 15th annual Supercomputing Conference (SC2002) in Baltimore, Maryland.

Carrie Mahood received CSC's High Performance Computing Outstanding New Employee Award. The award was in recognition of Mahood's immediate contributions to the Computational Science and Engineering (CS&E) group at the ERDC MSRC. Her work included a lead role involving testing and analysis on a project with the U.S. Army Corps of Engineers New Orleans District, as well as conducting multilevel parallel programming workshops at the Arctic Region Supercomputing Center in Fairbanks, Alaska, and SC2002. Mahood joined CSC in October 2001 as a computational scientist at the ERDC MSRC. She is a 1999 graduate of East Texas Baptist University in Marshall, Texas, where she earned a bachelor's degree in mathematics and computer science. Mahood graduated in August 2001 from Texas Tech University with a master of science in mathematics.

Scudamore also presented a Technical Contribution Award to Robert Alter in recognition of his continued contributions to the CS&E group as the resident expert of Message Passing Interface-Input/Output, for which he provided invaluable support to the Seismic Wave Propagation in Parallel Topography code. Alter received a bachelor of science in mathematics from Boise State University in Idaho in 1977. He also received a bachelor of science in geophysics from Boise State University in 1983. A former high school mathematics teacher, Alter has worked as an exploration geophysicist for Amoco Oil Production Company and spent 14 years as an oceanographer at the Naval Oceanographic Office at Stennis Space Center. He joined CSC in December 2000.

upcoming events

2003 Users Group Conference – June 9-13, 2003, DoubleTree Hotel, Bellevue, Washington, hosted by the DoD High Performance Computing Modernization Program.

SC2003 Conference – "Igniting Innovation," November 15-21, 2003, Phoenix, Arizona.

off-campus

MSRC Team Member Presents Seminar at USM
By Rose J. Dykes

Photo: Dr. Fred Tracy presents a seminar at USM.

Dr. Deborah Dent, Deputy Director of the Information Technology Laboratory (ITL), ERDC, and Dr. Fred Tracy, MSRC Team, ERDC, went to the campus of the University of Southern Mississippi (USM), Hattiesburg, Mississippi, on November 1, 2002, where Dr. Tracy presented a seminar entitled "Multi-Level Parallelism (MLP) – An Alternative Parallel Paradigm." MLP is a new, lightweight approach to expressing the semantics of a parallel program where communication among processors is done through shared variables, as in OpenMP, rather than sends, receives, broadcasts, reductions, etc., as in Message Passing Interface.
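To make that distinction concrete, the short C sketch below computes the same partial sum both ways: with OpenMP, where cooperating workers simply update a shared variable, and with MPI, where combining the private partial sums requires an explicit communication call. This is an assumed, minimal example prepared for this summary, not code from Dr. Tracy's seminar.

```c
/* Minimal illustrative sketch (assumed example, not seminar code):
 * the same partial sum computed two ways. With OpenMP, workers cooperate
 * through shared variables; with MPI, the equivalent exchange is spelled
 * out with explicit communication calls.
 * Build with, e.g.:  mpicc -fopenmp mlp_sketch.c -o mlp_sketch */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

#define N 1000000

int main(int argc, char **argv)
{
    /* Shared-variable style (as in OpenMP/MLP): threads update one
     * shared accumulator; no explicit messages are exchanged. */
    double shared_sum = 0.0;
    #pragma omp parallel for reduction(+:shared_sum)
    for (int i = 0; i < N; i++)
        shared_sum += 1.0 / (double)(i + 1);

    /* Message-passing style (MPI): each process owns a private partial
     * sum, and combining them requires an explicit reduction message. */
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double local = 0.0, global = 0.0;
    for (int i = rank; i < N; i += size)
        local += 1.0 / (double)(i + 1);
    MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("OpenMP sum = %.6f, MPI sum = %.6f\n", shared_sum, global);

    MPI_Finalize();
    return 0;
}
```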
The seminar was part of the class SC 740 Graduate Seminar taken by graduate students in the Program of Scientific Computing and related fields of study at USM. Approximately 20 students and faculty attended. After the seminar, Drs. Dent and Tracy visited with faculty members and discussed plans for possibly working on joint projects and future visits to ITL. The Coordinator of the Program of Scientific Computing at USM is Dr. Joseph Kolibal.

off-campus

IEEE Visualization 2002
By Paul Adams

Three team members of the ERDC MSRC Scientific Visualization Center, Paul Adams, Dr. Michael Stephens, and Richard Walters, attended IEEE Visualization 2002 on October 27 – November 1, 2002, in Boston, Massachusetts.

Dr. Stephen Wolfram, a well-known scientist who received his Ph.D. in theoretical physics in 1979 at the age of 20, was the keynote speaker. In 1986 Dr. Wolfram created Mathematica, a technical computing tool for the scientific research community that is now used worldwide. A New Kind of Science is his latest book. In this book and in his keynote speech at the conference, Dr. Wolfram talked about cellular automata, which produce shaded images on grid patterns according to certain rules. He showed that incredible complexity can arise from simple systems and rules.

Photo: (Left to right) ERDC MSRC SVC Lead Paul Adams and Dr. Michael Stephens and Richard Walters, SVC team members, attended IEEE Visualization 2002.

Other topics of interest at the conference included the following:

- Commodity-Based Cluster Visualization – How to manage and use scalable display walls.
- High-Quality Volume Graphics on Consumer PC Hardware – Using consumer graphics cards for volume rendering.
- State of the Art in Data Representation for Visualization – Using signal processing to take laser-scanned data and reduce it to a manageable size.
- Out-of-Core Algorithms for Scientific Visualization – How to handle data sets that are larger than main memory.
- Interactive Rendering of Large Volume Data Sets – Using wavelet compression and Level-of-Detail to view data sets too large to be stored on a PC.
- Exploring Scalar Fields Using Critical Isovalues – Critical isovalues are those that are a minimum, saddle, or maximum. By creating a critical isovalue locating program, one can ensure not missing any important features. The listing of the critical isovalues can also be used to create transfer functions for volume rendering. However, with too many isovalues, the image can become cluttered. (A brief illustrative sketch follows this list.)
- A New Object-Order Ray-Casting Method – The desire is to have a high-quality, interactive volume-rendering application. The four types of volume-rendering techniques in use today, their advantages, and their drawbacks were addressed. The presenter took the Shear Warp approach and improved its quality while also improving its performance by skipping empty and hidden regions. This method was then compared with the Volume Pro hardware volume-rendering board.
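The following sketch illustrates the critical-isovalue idea from the list above: it scans a small 2-D scalar grid and reports the field values at local minima, maxima, and saddle-like nodes, which are the values an isosurface-browsing tool would not want to miss. The 8-neighbor classification rule and the sample data are simplifications assumed for illustration; this is not the method presented at the conference.

```c
/* Illustrative sketch (assumed formulation, not the conference algorithm):
 * report critical isovalues of a 2-D scalar grid by classifying interior
 * nodes as local minima, local maxima, or saddles from the signs of
 * (neighbor - center) around the 8-neighbor ring. Ties (equal values)
 * are simply ignored in the strict min/max counts. */
#include <stdio.h>

#define NX 5
#define NY 5

static void report_critical_isovalues(const double f[NY][NX])
{
    /* 8-neighbor offsets, listed in ring order around the center node. */
    const int dx[8] = { 1, 1, 0, -1, -1, -1, 0, 1 };
    const int dy[8] = { 0, 1, 1, 1, 0, -1, -1, -1 };

    for (int j = 1; j < NY - 1; j++) {
        for (int i = 1; i < NX - 1; i++) {
            int greater = 0, smaller = 0, sign_changes = 0;
            for (int k = 0; k < 8; k++) {
                double d0 = f[j + dy[k]][i + dx[k]] - f[j][i];
                double d1 = f[j + dy[(k + 1) % 8]][i + dx[(k + 1) % 8]] - f[j][i];
                if (d0 > 0.0) greater++;
                else if (d0 < 0.0) smaller++;
                if ((d0 > 0.0) != (d1 > 0.0)) sign_changes++;
            }
            if (greater == 8)
                printf("critical isovalue %g at (%d,%d): local minimum\n", f[j][i], i, j);
            else if (smaller == 8)
                printf("critical isovalue %g at (%d,%d): local maximum\n", f[j][i], i, j);
            else if (sign_changes >= 4)
                printf("critical isovalue %g at (%d,%d): saddle\n", f[j][i], i, j);
        }
    }
}

int main(void)
{
    /* Hypothetical sample field with a single interior peak at the center. */
    const double f[NY][NX] = {
        { 0.0, 0.0, 0.0, 0.0, 0.0 },
        { 0.0, 1.0, 2.0, 1.0, 0.0 },
        { 0.0, 2.0, 5.0, 2.0, 0.0 },
        { 0.0, 1.0, 2.0, 1.0, 0.0 },
        { 0.0, 0.0, 0.0, 0.0, 0.0 },
    };
    report_critical_isovalues(f);
    return 0;
}
```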
An HPC-Enabled Virtual Proving Ground for Seismic Unattended Ground Sensor Networks
By Dr. Mark L. Moran, Battlefield Seismic-Acoustic Sensors Program, ERDC Cold Regions Research and Engineering Laboratory

Comprehensive, reliable, situation information is imperative for the success of light-armor, maneuver-dominated Future Combat System (FCS) operations. A core thrust of the Army Science and Technology program is directed toward developing an interlocking and overlapping network of ground, air, and national asset sensor systems with the aim of delivering a timely and detailed battlefield operational picture to commanders at all echelons. The ERDC Cold Regions Research and Engineering Laboratory's seismic signature simulation Challenge grant is supporting the development of a family of intelligent unattended ground sensors (UGS), including the UGS sensors in the U.S. Army Communications-Electronics Command Night Vision Laboratory's Networked Sensors for the Objective Force Advance Technology Demonstration, the FCS Intelligent Munition System, and the FCS Tactical Unattended Ground Sensor. These systems rely on seismic and acoustic sensors to detect, track, and classify a wide range of threat targets from heavy armored vehicles to dismounted infantry.

Seismic signals are generated by targets via ground vibrations in complex ways that convey target-specific features useful for classification or identification. For example, the size and number of track blocks, the diameter and separation between wheels, or the resonance frequencies of the vehicle's sprung mass are readily observable in sensor data. Seismic signals arriving at a sensor also interact with geologies having large material property contrasts. Tactically significant terrain includes large-scale physiographic features (such as forests, hills, passes, narrow valleys, or rivers), which is almost axiomatic. These complex battlefield environments are extremely difficult sensor settings. For example, a single impulsive force applied to the earth's surface often results in three to five distinct seismic wave fields, all propagating along different ray-paths, with different amplitudes, different decay rates (in both space and time-frequency dimensions), different polarizations, and with propagation speeds that vary with frequency (i.e., strongly dispersive propagation). Moreover, all these seismic wave phases have complex interactions with topography and geologic discontinuities (reflections, refractions, diffractions, and mode conversions). However, it is this interaction of the signal with geology that produces the over-the-hill, nonline-of-sight sensing capabilities that are important to UGS systems.
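As a rough, back-of-the-envelope illustration of why one impulsive source produces several distinct arrivals, the sketch below lists notional arrival times and relative amplitudes at a single sensor for a few wave phases, using assumed propagation speeds and simple power-law geometric spreading. The phase names are standard, but the numerical values and the decay model are placeholders, not parameters or physics taken from the VPG simulations.

```c
/* Illustrative sketch only: notional arrival times and relative amplitudes
 * for several seismic phases produced by one impulsive surface source.
 * Speeds and spreading exponents are assumed placeholder values, not
 * parameters from the ERDC virtual proving ground.
 * Build with, e.g.:  cc phases.c -lm -o phases */
#include <math.h>
#include <stdio.h>

struct phase {
    const char *name;
    double speed_m_s;   /* assumed propagation speed             */
    double spread_exp;  /* geometric spreading: A ~ r^(-exp)     */
};

int main(void)
{
    const struct phase phases[] = {
        { "P (compressional body wave)", 1500.0, 1.0 },
        { "S (shear body wave)",          600.0, 1.0 },
        { "Rayleigh (surface wave)",      300.0, 0.5 },
    };
    const double range_m = 200.0;   /* source-to-sensor distance   */
    const double ref_m   = 10.0;    /* reference distance where A=1 */

    for (size_t i = 0; i < sizeof phases / sizeof phases[0]; i++) {
        double t = range_m / phases[i].speed_m_s;              /* arrival time */
        double a = pow(ref_m / range_m, phases[i].spread_exp); /* relative amplitude */
        printf("%-30s arrives at %6.3f s, relative amplitude %.3f\n",
               phases[i].name, t, a);
    }
    return 0;
}
```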
By combining massively parallel HPC computational resources with state-of-the-art numerical methods, a "virtual proving ground" (VPG) has been developed for simulating the performance of networks of seismic UGS systems in realistically complex geologies. The resulting simulated data have been demonstrated to be indistinguishable from actual field data, even by subject matter experts.

Figure 1. Examples of detail for the M1 main battle tank mechanical model. Other targets available include personnel, T-72, BMP-2, BTR-80, and the HMMWV. Convoys in any number or combination can also be modeled. The detailed mechanical models generate complex distributions of target-specific ground vibrations.

The seismic VPG capability has a wide number of applications that significantly accelerate the pace of UGS technology development, improve system reliability, and reduce overall costs. For example, using large-scale simulations, new methods for adapting networks of intelligent seismic UGS systems to their specific deployment environment have been developed and demonstrated, providing for robust all-weather target tracking performance. As another example, full wave field simulations with this level of fidelity can be used directly for system-specific engineering in the same manner as field data. In the early stages of system development, this saves many millions of dollars by reducing the number of field studies required to develop system algorithms and perform rigorous engineering trade studies to select the optimal sensor suite for a given application.

Figure 2. Iconic representation of the seismic virtual proving ground enabled by full exploitation of HPC facilities. The modeling approach uses realistic heterogeneous 3-D geology, soil attenuation, and topography as input along with signature ground vibrations. These inputs are applied to full wave field simulations resulting in simulated data that are indistinguishable from field data.

Lastly, the simulated data will allow analysis and prediction of sensor network performance. For example, in complex terrains the HPC simulations have predicted that seismic sensors will perform better when placed at the tops of hills or in deep ravines as opposed to flat alluvial soil deposits at the base of hills. This counterintuitive conclusion is explained by the HPC simulation results by noting that strong, interfering signal reflections are trapped in the alluvium, whereas the tops of hills (with stiff rock cores) and deep ravines (near the water table) have geologies with generally higher velocity materials, which do not trap multiple reflected signals. HPC-supported computations of this nature are used to develop sensor performance prediction maps and sensor deployment doctrine under a wide variety of geologic conditions. This is important for maximizing information quality and coverage area for a fixed number of networked elements.
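In its very simplest conceivable form, a sensor performance prediction map of the kind mentioned above could be summarized as a grid of simulated peak amplitudes thresholded against a sensor noise floor. The sketch below does exactly that with made-up numbers; the grid values, the threshold, and the one-line detection rule are hypothetical stand-ins for the far richer full wave field simulation products described in the article.

```c
/* Illustrative sketch only: a toy "sensor performance prediction map"
 * built by thresholding simulated peak amplitudes against a noise floor.
 * The amplitude grid and threshold are hypothetical placeholders for
 * full wave field simulation output. */
#include <stdio.h>

#define NX 8
#define NY 4

int main(void)
{
    /* Hypothetical simulated peak amplitudes (arbitrary units) at candidate
     * sensor sites: higher values toward a "hilltop" on the right, lower
     * values over "alluvium" on the left. */
    const double peak[NY][NX] = {
        { 0.2, 0.3, 0.3, 0.4, 0.6, 0.9, 1.2, 1.4 },
        { 0.2, 0.2, 0.3, 0.5, 0.7, 1.0, 1.3, 1.5 },
        { 0.1, 0.2, 0.3, 0.4, 0.6, 0.9, 1.2, 1.4 },
        { 0.1, 0.1, 0.2, 0.4, 0.5, 0.8, 1.1, 1.3 },
    };
    const double noise_floor = 0.6;   /* assumed detection threshold */

    puts("Detection map ('#' = detectable, '.' = below noise floor):");
    for (int j = 0; j < NY; j++) {
        for (int i = 0; i < NX; i++)
            putchar(peak[j][i] > noise_floor ? '#' : '.');
        putchar('\n');
    }
    return 0;
}
```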
