Computer Systems and Software Engineering: State-of-the-Art

Edited by
Patrick Dewilde, Technische Universiteit Delft
Joos Vandewalle, Katholieke Universiteit Leuven

SPRINGER SCIENCE+BUSINESS MEDIA, B.V.

ISBN 978-1-4613-6555-6
ISBN 978-1-4615-3506-5 (eBook)
DOI 10.1007/978-1-4615-3506-5

Printed on acid-free paper.

All Rights Reserved
© 1992 Springer Science+Business Media Dordrecht
Originally published by Kluwer Academic Publishers in 1992.
Softcover reprint of the hardcover 1st edition 1992.

No part of the material protected by this copyright notice may be reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying, recording or by any information storage and retrieval system, without written permission from the copyright owner.

TABLE OF CONTENTS

Editorial  vii

The Evolution of Data Memory and Storage: An Overview
  Walter E. Proebster  1

Optimizing Logic for Speed, Size, and Testability
  Robert K. Brayton and Alexander Saldanha  25

VLSI Architectures for Digital Video Signal Processing
  P. Pirsch  65

Compiler Techniques for Massive Parallel Architectures
  Lothar Thiele  101

Programmable Cellular Neural Networks: A State-of-the-Art
  Tamas Roska  151

Developments in Parallel Programming Languages
  R. H. Perrott  169

Load Balancing Grid-Oriented Applications on Distributed Memory Parallel Computers
  D. Roose, J. de Keyser and R. van Driessche  191

Strategic Decision Analysis and Group Decision Support
  Simon French  217

Ten Years of Advances in Machine Learning
  Y. Kodratoff  231

Designing Logic Programming Languages
  J. W. Lloyd  263

Efficient Bottom-up Evaluation of Logic Programs
  Raghu Ramakrishnan, Divesh Srivastava and S. Sudarshan  287

Proving Correctness of Executable Programs
  Kit Lester  325

Direct Manipulation as a Basis for Constructing Graphical User Interfaces Coupled to Application Functions
  Jan van den Bos  355

Visualization of Volumetric Medical Image Data
  K.J. Zuiderveld and M.A. Viergever  363

Scientific Visualization
  J.J. van Wijk  387

Picture Archiving and Communication Systems: A Service
  R. Mattheus  397

February 17, 1992

Editorial: The State of the Art in Computer Science and Software Engineering

On the occasion of the CompEuro'92 Conference in The Hague this spring, we asked an editorial board consisting of some twelve reputed researchers in computer system design and software engineering in the Benelux to assemble a program of State of the Art lectures covering their entire domain, in which up-to-date information would be presented to practicing engineers and researchers who are active in the field and have a need to know. We are proud to present the results of our combined efforts in this book. We truly believe that we have succeeded in our goal and that the present book offers the broad overview we aimed at, with contributions that treat all the main topics with elegance and clarity. In addition to the State of the Art lectures, some tutorial and keynote lectures have been added, all with the purpose of providing the global coverage desired.

The order in which the topics are presented goes roughly from bottom to top: from low-level hardware issues to high-level intelligence, from computing to applications. Although many topics are incommensurate, we have opted for the more specific first and the more general later, without any presumption. The first lecture is just as valuable as the last: the topics rise in a kind of spiral.
There is an undeniable randomness in the choice; the reader will find important and interesting topics from the first to the last contribution.

We start out with some survey material. W. Proebster gives a knowledgeable survey of the history of memory management and technology. As a long-standing IBM scientist (who is now with the University of Munich), he has had an insider's view of the many developments in memory technology. His conclusion is that the sky is the limit and much more will be forthcoming. The second contribution is also from an outstanding IBM design scientist who is now a Professor at the University of California at Berkeley, Bob Brayton. His topic is, of course, logic design, but with a new view on optimization. Since this is one of the main themes in sound design technology, a fresh view on it is particularly welcome. The contribution of Peter Pirsch remains in the realm of hardware design, in particular the design of video signal processing algorithms and architectures. Peter's group in Hannover was one of the very first to study and implement sophisticated high-speed dedicated computing systems. He instructs us on how to achieve optimization in a global design space in which software, algorithms and architectures interplay. The last contribution on hardware design is due to Lothar Thiele of the University of Saarland in Saarbruecken. Lothar is also very much concerned with the connection of algorithms and architectures, especially in the area of massive but nearly regular or piecewise regular computations. He presents a systematic methodology for the design trajectory that carries the designer from concept to architectural description, all within one framework.

Moving one level upwards, or if you wish, one spiral turn further, we reach three papers that carry the central term 'parallelism'. The first in the series is devoted to a type of neural network called cellular neural nets, which may be viewed as a locally distributed neural computer. It holds the promise of a much larger array than is possible with the classical multilayer type. The two following papers cover two most important issues in parallel processing: that of programming and that of load balancing. R. H. Perrott reviews in his paper the various possibilities in parallel programming languages and how they have developed. In contrast to sequential computing, where a single underlying architectural model, the von Neumann machine, is often tacitly assumed, there are many possible architectural models available for parallel computing. Is unity in variety possible? That is the question addressed by Perrott. The paper of Dirk Roose and coauthors brings us back to the theme of 'operations research', in their paper dedicated to parallel computing on distributed memory parallel computers. There could hardly be a more central paper in this conference, since it merges two of the three central themes. Therefore, it appears in the center of the book as well...

The following paper also has a central quality, but of a different nature. It brings us another turn further on our high winding spiral. Operations research has very much to do with strategic decision analysis. Decisions are not taken in a vacuum; they originate from group dynamics in a management team. Simon French approaches these issues from an information systems background.
It is refreshing to see how in a macroscopic field problems arise that are similar to those in the much more restricted field of software engineering, and how these problems may be approached in an analogous way.

This brings us to one of the central themes in software engineering: reasoning and logic programming. Three lectures are devoted to that topic. The first, by Y. Kodratoff, gives a survey of ten years of advances in machine learning. Methods for machine learning are built on three kinds of possible inferences: deductive, inductive and analogical. Kodratoff describes different methods that have been created during the last decade to improve the way machines can learn, using these inference techniques. In the second paper, J.W. Lloyd of the University of Bristol dives directly into issues of the design of logic programming languages, their declarative semantics and the software engineering support needed for them. Raghu Ramakrishnan and coauthors treat in their contribution the evaluation problem of logic programs. Here also, optimization is a key issue: efficiency of computations, memory utilization, improvement of query handling.

Engineering nowadays requires more than hard-nosed "rule of thumb" methods. Already we have seen that mathematical principles underlie modern principles of declarative parallel programming. Modern signal processing is based on sophisticated functional analysis. The construction of a high-speed, high-density VLSI computing system is not possible without the use of highly optimized circuit construction software, which again is based on deep insights in combinatorial mathematics. So what about software engineering? Kit Lester states that our methods of software construction are mainly intuitive and generally have an air of "string and chewing gum" construction. If we are truly to engineer programs, we instead need mathematically based methods either of constructing the programs or of verifying intuitively constructed programs. In his paper he treats not only the correctness of program source code or program specification, but goes further and takes the discussion all the way to the executable program.

The next three papers in this volume are devoted to a topic on which a complete State of the Art book could be produced: visualization. At the system's level there is the question of how an application designer can construct a man-machine interface. Jan van den Bos of the Erasmus University in Rotterdam presents a system that allows designers to construct the essential parts of a graphical man-computer interface: the presentation, the dynamic aspects and the coupling to application functions. A beautiful application of visualization principles is offered by volumetric medical image data representation. The state of the art in that field is presented by Zuiderveld and Viergever, who pay special attention to strategies that improve image generation speed. Techniques to improve the quality of the image are also covered in that paper. J.J. van Wijk of ECN closes the sequence of papers on visualization and gives an overview of methods by which the results of large-scale simulations and measurements can be presented. The division of the visualization process into different steps, the interaction with the user, the use of hardware and the different software possibilities are all considered.
The closing paper is in another direction of application and treats an important problem in medical informatics: that of picture archiving and its connection to communication. The sheer mass of information, the need to store it in an accessible way, and the necessity to communicate that information induce a discipline in its own right, which is surveyed by Rudy Mattheus of the Hospital of the Free University of Brussels.

The editorial board for this book consisted of the following persons: E. Aarts and F. Lootsma (Optimization and Operations Research), P. Vitanyi (Theory of Computing), H. Sips (Parallel Processing), L. De Raedt and Y. Willems (Machine Learning and Knowledge Acquisition), J. ter Bekke (Databases), A. Bultheel and B. De Moor (Numerical Mathematics), M. Bruynooghe (Computational Logic), M. De Soete (Data Security), F.W. Jansen (Computer Graphics), J. van Katwijk and K. De Vlaminck (Software Engineering and Compiler Construction), P. Suetens (Medical Computing and Imaging). Our thanks go to all of them, for helping us select lecturers, for advising us on the program and for soliciting the contributions. Our very warm thanks go to all the authors, who have written such beautiful papers and are introducing us so effectively to the State of the Art in Computer Systems and Software Engineering.

Delft, January 1992
Patrick Dewilde and Joos Vandewalle

The Evolution of Data Memory and Storage: An Overview

Walter E. Proebster
Technische Universität München, Institut für Informatik, Orleansstr. 34, München, Germany

Abstract. An overview is given of the development history of the key technologies of data memory and storage. For each of them, the essential characteristics for their application in computer systems, and also their relation to competing (preceding or replacing) technologies, are described. Comparisons of characteristic values of speed and packaging density are added, along with a list of historical milestones. In conclusion, this overview, extending over many decades, shows that the progress of this field, which is of dominating importance for system design and application, has not yet reached saturation and will not for many years to come.

1. The Role of Data Memory and Storage

From the very beginning of data processing, data memory and storage have always played a dominant role in the architecture, the design, the implementation and the operation of data processing systems, whether we regard a system of the historical past, of the early times of electronic computers, or of our own time. The economic importance of data memory and storage is substantial: almost one third of the world-wide yearly production value of total systems hardware is devoted to this sector. The demand of computer users for more data memory and storage at improved cost/performance can still not be satisfied, even with the enormous investments in research, development and industry in the past decades and today.

The term "memory" is used for random access devices, such as semiconductor memories; the term "storage" is used for sequential access devices, mostly electromechanical devices such as disc and tape storage. For simplicity, the term "memory" will mostly be used in the following for both classes.
2. Characterization, Storage and Memory Hierarchy

Memory and storage units can be characterized mainly by the following three criteria:

- capacity, in bits, Bytes, kBytes (10^3 Bytes), MBytes (10^6 Bytes), GigaBytes (10^9 Bytes) or TeraBytes (10^12 Bytes);
- access and cycle time of a memory/storage cell for writing or reading its content;
- cost or price per bit.

The large variety of memory technologies naturally leads to enormous differences in capacity, speed and cost. These differences, to a large extent, influence and determine the application of the various memory classes within a data processing system and, most important, its proper structure: for fast memories, the costs per bit are high; as a consequence, only small capacities can be realized economically. In contrast, slow memories generally offer low costs per bit and therefore allow the realization of large capacities. These considerations lead to different memory levels embedded in a hierarchical structure: fast memories for registers and smaller buffers (caches), slower random access devices for main memories, fast sequential access devices for back-up storage, and slow ones for archival storage (a short illustrative sketch of such a hierarchy is given below).

Based on the voluminous literature, one can state without exaggeration that almost all fields of electrotechnology, physics, chemistry and materials science have been examined for suitable memory phenomena. Due to the great importance of memory for data processing, physicists and engineers in research, development and industry have been engaged for many decades now in exploring, realizing and improving new ways and methods of storing data.

3. Memory Evolution and Impacts on and from Other Technical Disciplines

Figure 1 shows an overview of the most important and interesting memory technologies, the time of their conception, the peak of their application and, in part, also the time of their replacement by other, superior technologies. There are five time periods to be distinguished:

1. From ancient times to the end of medieval times, with mechanical devices.
2. The time of electromechanical devices, up to about 1950.
3. The period of pioneer computers (Zuse, ENIAC, Princeton: IAS), where communication technology provided valuable components for the construction of computer memories, such as relays, cross-bar switches, etc.
4. The time of maturity of data processing systems, where reciprocally strong positive impulses were exercised from computer memory technology to communication technology, leading e.g. to electronic telephone exchanges.
5. Our present time, where computer memories profit considerably from developments in the entertainment industry, such as compact disc (CD) and digital audio tape (DAT) technology.
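To make the hierarchy of Section 2 concrete, the following minimal Python sketch models each memory level by the three criteria named there: capacity, access time, and cost per bit. All level names and numbers are hypothetical placeholders chosen only to exhibit the qualitative trade-off the chapter describes (fast implies costly per bit and hence small; slow implies cheap per bit and hence large); they are not figures from the text.

    # Illustrative model of the memory/storage hierarchy of Section 2.
    # All values are hypothetical placeholders, not data from the chapter.
    from dataclasses import dataclass

    @dataclass
    class MemoryLevel:
        name: str
        capacity_bytes: float  # capacity (10^3 = kByte, 10^6 = MByte, 10^9 = GByte, ...)
        access_time_s: float   # time to read or write one cell, in seconds
        cost_per_bit: float    # relative cost per bit (arbitrary units)

    # Ordered fast/small/expensive at the top, slow/large/cheap at the bottom.
    hierarchy = [
        MemoryLevel("registers and cache (fast random access)", 64e3, 1e-8, 1.0),
        MemoryLevel("main memory (slower random access)", 64e6, 1e-7, 1e-2),
        MemoryLevel("back-up storage (fast sequential access)", 1e9, 1e-2, 1e-4),
        MemoryLevel("archival storage (slow sequential access)", 1e12, 1e1, 1e-6),
    ]

    for level in hierarchy:
        # Total cost per level grows only slowly down the hierarchy even as
        # capacity explodes, which is why large capacities are economical
        # only on the slow levels.
        total_cost = level.cost_per_bit * level.capacity_bytes * 8
        print(f"{level.name}: {level.capacity_bytes:.0e} Bytes, "
              f"{level.access_time_s:.0e} s access, total relative cost {total_cost:.1e}")

Running the sketch prints one line per level and shows the pattern underlying the hierarchical structure: each step down trades several orders of magnitude in access time for a comparable gain in affordable capacity.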
