
Information and Randomness: An Algorithmic Perspective PDF

251 Pages·1994·6.13 MB·English

Preview Information and Randomness: An Algorithmic Perspective

Monographs in Theoretical Computer Science. An EATCS Series
Editors: W. Brauer, G. Rozenberg, A. Salomaa
Advisory Board: G. Ausiello, M. Broy, S. Even, J. Hartmanis, N. Jones, T. Leighton, M. Nivat, C. Papadimitriou, D. Scott

Cristian Calude
Information and Randomness: An Algorithmic Perspective
Forewords by Gregory J. Chaitin and Arto Salomaa
Springer-Verlag Berlin Heidelberg GmbH

Author
Prof. Dr. Cristian Calude, Department of Computer Science, Auckland University, Private Bag 92019, Auckland, New Zealand, and Faculty of Mathematics, Bucharest University, Str. Academiei 14, RO-70109 Bucharest, Romania. E-mail: [email protected]

Editors
Prof. Dr. Wilfried Brauer, Institut für Informatik, Technische Universität München, Arcisstrasse 21, D-80333 München, FRG
Prof. Dr. Grzegorz Rozenberg, Institute of Applied Mathematics and Computer Science, University of Leiden, Niels-Bohr-Weg 1, P.O. Box 9512, NL-2300 RA Leiden, The Netherlands
Prof. Dr. Arto Salomaa, The Academy of Finland, Department of Mathematics, University of Turku, FIN-20500 Turku, Finland

Library of Congress Cataloging-in-Publication Data
Calude, Cristian. Information and randomness : an algorithmic perspective / Cristian Calude ; forewords by A. Salomaa and Gregory J. Chaitin. (Monographs in theoretical computer science). Includes bibliographical references and index. ISBN 978-3-662-03051-6. ISBN 978-3-662-03049-3 (eBook). DOI 10.1007/978-3-662-03049-3. 1. Machine theory. 2. Computational complexity. 3. Stochastic processes. I. Title. II. Series: EATCS monographs in theoretical computer science. QA267.C33 1995 003'.54'015113-dc20 94-33125

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in other ways, and storage in data banks. Duplication of this publication or parts thereof is only permitted under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer-Verlag Berlin Heidelberg GmbH. Violations fall under the prosecution act of the German Copyright Law.

© Springer-Verlag Berlin Heidelberg 1994. Originally published by Springer-Verlag Berlin Heidelberg New York in 1994. Softcover reprint of the hardcover 1st edition 1994.

The use of registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Production: PRODUserv Springer Produktions-Gesellschaft, Berlin. Data conversion by Lewis & Leins, Berlin. Cover layout: MetaDesign plus GmbH, Berlin. SPIN 10129482  45/3020-543210  Printed on acid-free paper.

A Note from the Series Editors

The EATCS Monographs series already has a fairly long tradition of more than thirty volumes over ten years. Many of the volumes have turned out to be useful also as textbooks. To give more freedom to prospective authors and more choice to the audience, a Texts series has been branched off: Texts in Theoretical Computer Science. An EATCS Series.

Texts published in this series are intended mostly for the graduate level. Typically, an undergraduate background in computer science will be assumed. However, the background required will vary from topic to topic, and some books will be self-contained.
The texts will cover both modern and classical areas with an innovative approach that may give them additional value as monographs. Most books in this series will have examples and exercises.

The original series continues as Monographs in Theoretical Computer Science. An EATCS Series. Books published in this series present original research or material of interest to the research community and graduate students. Each volume is normally a uniform monograph rather than a compendium of articles. The series also contains high-level presentations of special topics. Nevertheless, as research and teaching usually go hand in hand, these volumes may still be useful as textbooks, too. The present volume is an excellent example of a monograph that also has textbook potential. Enjoy!

June 1994
W. Brauer, G. Rozenberg, A. Salomaa

Editor's Foreword

The present book by Calude fits very well in the EATCS Monographs series. Much original research is presented, especially on topological aspects of algorithmic information theory. The theory of complexity and randomness is developed with respect to an arbitrary alphabet, not necessarily binary. This approach is richer in consequences than the classical one. Remarkably, however, the text is so self-contained and coherent that the book may also serve as a textbook. All proofs are given in the book and thus it is not necessary to consult other sources for classroom instruction.

The research in algorithmic information theory is already some 30 years old. However, only recent years have witnessed really vigorous growth in this area. As a result, even the early history of the field from the mid-1960s has become an object of debate, sometimes rather hectic. This is very natural because in the early days many authors were not always careful in their definitions and proofs.

In my estimation, the present book has a very comprehensive list of references. Often results not at all used in the book are referenced for the sake of historical completeness. The system of crediting original results is stated clearly in the preface and followed consistently throughout the book.

May 1994
Arto Salomaa
Academy of Finland

Foreword

Algorithmic information theory (AIT) is the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously. The basic idea is to measure the complexity of an object by the size in bits of the smallest program for computing it.

AIT appeared in two installments. In the original formulation of AIT, AIT1, which lasted about a decade, there were 2^N programs of size N. For the past twenty years, AIT1 has been superseded by a theory, AIT2, in which no extension of a valid program is a valid program. Therefore there are far fewer than 2^N possible programs of size N. I have been the main intellectual driving force behind both AIT1 and AIT2, and in my opinion AIT1 is only of historical or pedagogic interest. Unfortunately, AIT1 is better known at this time by the general scientific public than the new and vastly superior AIT2. Most people who talk about program-size complexity are unaware of the fact that they are using a completely obsolete version of this concept! This book should help to remedy this situation.

In my opinion, program-size complexity is a much deeper concept than run-time complexity, which however is of greater practical importance in designing useful algorithms.
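A minimal illustration, not taken from the book, of the self-delimiting requirement described above: the condition that no valid program extends another valid program says that the set of programs is prefix-free, and for any prefix-free set of bit strings Kraft's inequality holds, i.e. the sum of 2^(-|p|) over all programs p is at most 1, which is exactly what rules out admitting all 2^N strings of every size N as programs. The following Python sketch, with purely illustrative helper names, checks both properties on two toy program sets.

    from itertools import product

    def is_prefix_free(programs):
        # True iff no program in the set is a proper prefix of another one.
        return not any(p != q and q.startswith(p) for p in programs for q in programs)

    def kraft_sum(programs):
        # Sum of 2**(-|p|) over the set; it is at most 1 for any prefix-free set.
        return sum(2.0 ** -len(p) for p in programs)

    # AIT1 style: every bit string (here, of length 1 to 3) counts as a program.
    all_strings = {"".join(bits) for n in range(1, 4) for bits in product("01", repeat=n)}
    # AIT2 style: a prefix-free (self-delimiting) set of programs.
    self_delimiting = {"0", "10", "110", "111"}

    print(is_prefix_free(all_strings), kraft_sum(all_strings))          # False 3.0
    print(is_prefix_free(self_delimiting), kraft_sum(self_delimiting))  # True 1.0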
The main applications of AIT are twofold. First, to give a mathematical definition of what it means for a string of bits to be patternless, random, unstructured, typical. Indeed, most bit strings are algorithmically irreducible and therefore random. And, even more important, AIT casts an entirely new light on the incompleteness phenomenon discovered by Gödel. AIT does this by placing information-theoretic limits on the power of any formal axiomatic theory.

The new information-theoretic viewpoint provided by AIT suggests that incompleteness is natural and pervasive and cannot be brushed away in our everyday mathematical work. Indeed, AIT provides theoretical support for a quasi-empirical attitude to the foundations of mathematics and for adopting new arithmetical axioms that are not self-evident but are only justified pragmatically.

There are also connections between AIT and physics. The program-size complexity measure of AIT is analogous to the Boltzmann entropy concept that plays a key role in statistical mechanics. And my work on Hilbert's 10th problem using AIT shows that God not only plays dice in quantum mechanics and nonlinear dynamics, but even in elementary number theory. AIT thus plays a role in recent efforts to build a bridge between theoretical computer science and theoretical physics. In this spirit, I should point out that a universal Turing machine is, from a physicist's point of view, just a physical system with such a rich repertoire of possible behavior that it can simulate any other physical system. This bridge-building is also connected with recent efforts by theoretical physicists to understand complex physical systems such as those encountered in biology.

This book, benefiting as it does from Cristian Calude's own research in AIT and from his experience teaching AIT in university courses around the world, should help to make the detailed mathematical techniques of AIT accessible to a much wider audience.

April 1993
G. J. Chaitin
IBM Watson Research Center

Preface

We sail within a vast sphere, ever drifting in uncertainty, driven from end to end. When we think to attach ourselves to any point and to fasten to it, it wavers and leaves us; and if we follow it, it eludes our grasp, slips past us, and vanishes forever.
Blaise Pascal

This book represents an elementary and, to a large extent, subjective introduction to algorithmic information theory (AIT). As is clear from its name, this theory deals with algorithmic methods in the study of the quantity of information.

While the classical theory of information is based on Shannon's concept of entropy, AIT adopts as a primary concept the information-theoretic complexity or descriptional complexity of an individual object. The entropy is a measure of ignorance concerning which possibility holds in a set endowed with an a priori probability distribution; its point of view is largely global. The classical definition of randomness as considered in probability theory and used, for instance, in quantum mechanics allows one to speak of a process (such as tossing a coin, or measuring the diagonal polarization of a horizontally-polarized photon) as being random. It does not allow one to call a particular outcome (or string of outcomes, or sequence of outcomes) random, except in an intuitive, heuristic sense.

The information-theoretic complexity of an object (independently introduced in the mid-1960s by R. J. Solomonoff, A. N. Kolmogorov and G. J. Chaitin) is a measure of the difficulty of specifying that object; it focuses the attention on the individual, allowing one to formalize the randomness intuition. An algorithmically random string is one not producible from a description significantly shorter than itself, when a universal computer is used as the decoding apparatus.
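In standard notation, and leaving to the book the precise choices (which universal computer, which alphabet), this definition can be sketched as follows. For a universal computer U, the complexity of a string x is the length of a shortest description of x,

    H_U(x) = \min\{\, |p| : U(p) = x \,\},

and a string x of length n is algorithmically random, up to a fixed constant c, when H_U(x) \ge n - c, that is, when no description significantly shorter than x itself yields x.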
Our interest is mainly directed to the basics of AIT. The first three chapters present the necessary background, i.e. relevant notions and results from recursion theory, topology, probability, noiseless coding and descriptional complexity. In Chapter 4 we introduce two important tools: the Kraft-Chaitin Theorem (an extension of Kraft's classical condition for the construction of prefix codes, corresponding to arbitrary recursively enumerable codes) and relativized complexities and probabilities. As a major result, one computes the halting probability of a universal, self-delimiting computer and one proves that Chaitin's complexity equals, within O(1), the halting entropy (Coding Theorem).

Chapter 5 is devoted to the definition of random strings and to the proof that these strings satisfy almost all stochasticity requirements, e.g. almost all random strings are Borel normal. Random sequences are introduced and studied in Chapter 6. In contrast with the case of strings, for which randomness is a matter of degree, the definition of random sequences is "robust". With probability one every sequence is random (Martin-Löf Theorem) and every sequence is reducible to a random one (Gács Theorem); however, the set of random sequences is topologically "small". Chaitin's Omega Number, defined as the halting probability of a universal self-delimiting computer, has a random sequence of binary digits; the randomness property is preserved even when we re-write this number in an arbitrary base. In fact, a more general result is true: random sequences are invariant under change of base.

We develop the theory of complexity and randomness with respect to an arbitrary alphabet, not necessarily binary. This approach is more general and richer in consequences than the classical one; see especially Sections 4.5 and 6.7. The concepts and results of AIT are relevant for other subjects, for instance for logic, physics and biology. A brief exploration of some applications may be found in Chapter 7. Finally, Chapter 8 is dedicated to some open problems.
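In the binary case these notions take the following shape, given here only as a sketch; the book works over an arbitrary alphabet with Q letters, replacing 2 by Q. For a universal self-delimiting computer U, the halting probability is

    \Omega_U = \sum_{p \,:\, U(p)\ \mathrm{halts}} 2^{-|p|},

the algorithmic probability of a string x is P_U(x) = \sum_{p \,:\, U(p) = x} 2^{-|p|}, and the Coding Theorem asserts that

    H_U(x) = -\log_2 P_U(x) + O(1),

so Chaitin's complexity agrees with the halting entropy up to an additive constant.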
The literature on AIT has grown significantly in recent years. Chaitin's books Algorithmic Information Theory, Information, Randomness & Incompleteness and Information-Theoretic Incompleteness are fundamental for the subject. Osamu Watanabe has edited a beautiful volume entitled Kolmogorov Complexity and Computational Complexity, published in 1992 by Springer-Verlag. Ming Li and Paul Vitányi have written a comprehensive book, An Introduction to Kolmogorov Complexity and Its Applications, published by Springer-Verlag. Karl Svozil is the author of an important book entitled Randomness & Undecidability in Physics, published by World Scientific in 1993. The bibliography tries to be as complete as possible. In crediting a result I have cited the first paper in which the result is stated and completely proven.

I am most grateful to Arto Salomaa for being the springboard of the project leading to this book, for his inspiring comments, suggestions and permanent encouragement. I reserve my deepest gratitude to Greg Chaitin for many illuminating conversations about AIT that have improved an earlier version of the book, for permitting me to incorporate some of his beautiful unpublished results and for writing the Foreword.

My warm thanks go to Charles Bennett, Ronald Book, Egon Börger, Wilfried Brauer, Douglas Bridges, Cezar Câmpeanu, Ion Chiţescu, Rusins Freivalds, Péter Gács, Jozef Gruska, Juris Hartmanis, Lane Hemaspaandra (Hemachandra), Gabriel Istrate, Helmut Jürgensen, Mike Lennon, Ming Li, Jack Lutz, Solomon Marcus, George Markowsky, Per Martin-Löf, Hermann Maurer, Ion Măndoiu, Michel Mendès France, George Odifreddi, Roger Penrose, Marian Pour-El, Grzegorz Rozenberg, Charles Rackoff, Sergiu Rudeanu, Bob Solovay, Ludwig Staiger, Karl Svozil, Andy Szilard, Doru Ştefănescu, Garry Tee, Monica Tătărâm, Mark Titchener, Vladimir Uspensky, Dragoş Vaida, and Marius Zimand for stimulating discussions and comments; their beautiful ideas and/or results are now part of this book.

This book was typeset using the LaTeX package CLMono01 produced by Springer-Verlag. I offer special thanks to Helmut Jürgensen, Kai Salomaa, and Jeremy Gibbons, my TeX and LaTeX teachers.

I have taught parts of this book at Bucharest University (Romania), the University of Western Ontario (London, Canada) and Auckland University (New Zealand). I am grateful to all these universities, specifically to the respective chairs Ioan Tomescu, Helmut Jürgensen, and Bob Doran, for the assistance generously offered. My eager students have influenced this book more than they may imagine.

I am indebted to Bruce Benson, Rob Burrowes, Peter Dance, and Peter Shields for their competent technical support. The co-operation with Frank Holzwarth, J. Andrew Ross, and Hans Wössner from Springer-Verlag was particularly efficient and pleasant.

Finally, a word of gratitude to my wife Elena and daughter Andreea; I hope that they do not hate this book, as writing it took my energy and attention for a fairly long period.

March 1994
Cristian Calude
Auckland, New Zealand

Description:
"Algorithmic information theory (AIT) is the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously", says G.J. Chaitin, one of the fathers of this theory of complexity and randomness, which is also known as Kolmogorov complexit
