Information Measures: Information and Its Description in Science and Engineering

C. Arndt
Information Measures
Information and its Description in Science and Engineering
With 64 Figures
Springer-Verlag Berlin Heidelberg GmbH

Dr.-Ing. Christoph Arndt
Universität Siegen
Zentrum für Sensorsysteme
Projektbereich II
Paul-Bonatz-Str. 9-11
D-57068 Siegen
Germany
e-mail: [email protected]

ISBN 978-3-540-40855-0
Library of Congress Cataloging-in-Publication Data applied for

Die Deutsche Bibliothek - CIP-Einheitsaufnahme
Arndt, Christoph: Information measures: information and its description in science and engineering / C. Arndt. - Berlin; Heidelberg; New York; Barcelona; Hong Kong; London; Milan; Paris; Singapore; Tokyo: Springer, 2001
ISBN 978-3-540-40855-0
ISBN 978-3-642-56669-1 (eBook)
DOI 10.1007/978-3-642-56669-1

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitations, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer-Verlag. Violations are liable for prosecution under the German Copyright Law.

http://www.springer.de

© Springer-Verlag Berlin Heidelberg 2001
Originally published by Springer-Verlag Berlin Heidelberg New York in 2001
Softcover reprint of the hardcover 1st edition 2001

The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Typesetting: data delivered by author
Cover design: medio Technologies AG, Berlin
Printed on acid-free paper
SPIN: 10784575   62/3020 - 5 4 3 2 1 0

TO MY PARENTS

"Our soul is cast into the body, where it finds number, time, dimensions; it reasons upon these and calls them nature, necessity, and can believe nothing else."
(Pensées, Blaise Pascal)

Preface

This book is intended to be an introduction to the mathematical description of information in science. The mathematical theory needed for this introduction is treated in a more vivid way than in the usual theorem-proof structure. This approach enables us to develop an idea of the connections between the different information measures and to follow the trains of thought behind their derivations, which is crucial for correct applications. It is therefore our intention, in the mathematical descriptions, to develop the important ideas of the derivations, so that we obtain the resulting functions together with the main lines of thought and the conditions under which the results are valid. This simplifies the handling of the information measures, which are sometimes hard to classify without additional background information. Although the mathematical descriptions are the exact formulations of the measures examined, we do not restrict ourselves to rigorous mathematical considerations, but also place the different measures in the structure and context of possible information measures. Nevertheless, the mathematical approach is unavoidable when we look for an objective description and for possible applications in optimization. Communication theory, the original science of telecommunication (i.e.
the processing and transmission of information), and physics, the science that established the concept of entropy, are the foundations that serve to explain the idea of information. As there exists a great number of different possible ways to describe information, we present these measures within a coherent description. Some examples of the information measures examined are: Shannon's information, applied in coding theory; Akaike's information criterion, used in system identification to determine auto-regressive models and in neural networks to identify the number of neurons; and the Cramér-Rao bound, or Fisher's information, describing the minimal variances achieved by unbiased estimators (their standard forms are sketched after the table of contents below). This coherent description enables us to obtain a more complete overview of the different approaches and a better understanding of the idea of information and the related mathematical descriptions.

Given the great number of information functions that currently exist, one may ask for reasonable applications of all these information measures. Does it make sense to generate additional degrees of freedom by adding more parameters to the existing information functions? Would it not be more reasonable to use the existing information measures, which we are able to understand and to apply much better than the extensions created by the additional parameters? Of course, additional parameters can be used to match certain information measures to certain kinds of problems, but do we need a separate information measure for each kind of problem? Here the user has to decide between the different possible information measures, which we are going to introduce. From the user's point of view, we should argue with Occam's razor not to multiply the entities (degrees of freedom) unnecessarily. The more complex the information measure becomes, the more we lose the ability to understand and to interpret the results of our computations. The interpretability of the computed results therefore encourages the use of the smallest possible number of degrees of freedom in an applicable information measure. Based on this reflection, we are going to introduce several information measures and examine their connections. The examination of information transmission and the possible optimizations, however, is demonstrated using the most important, i.e. the most common and least complex, information measures, which already allow us to draw many conclusions for signal-processing applications.

Table of Contents

Symbols, expressions and abbreviations
Abstract
Structure and Structuring
1 Introduction
   Science and information
   Man as control loop
   Information, complexity and typical sequences
   Concepts of information
   Information, its technical dimension and the meaning of a message
   Information as a central concept?
2 Basic considerations
   2.1 Formal derivation of information
      2.1.1 Unit and reference scale
      2.1.2 Information and the unit element
   2.2 Application of the information measure (Shannon's information)
      2.2.1 Summary
   2.3 The law of Weber and Fechner
   2.4 Information of discrete random variables
3 Historic development of information theory
   3.1 Development of information transmission
      3.1.1 Samuel F. B. Morse 1837
      3.1.2 Thomas Edison 1874
      3.1.3 Nyquist 1924
      3.1.4 Optimal number of characters of the alphabet used for the coding
   3.2 Development of information functions
      3.2.1 Hartley 1928
      3.2.2 Dennis Gabor 1946
      3.2.3 Shannon 1948
         3.2.3.1 Validity of the postulates for Shannon's information
         3.2.3.2 Shannon's information (another possibility of a derivation)
         3.2.3.3 Properties of Shannon's information, entropy
         3.2.3.4 Shannon's entropy or Shannon's information?
         3.2.3.5 The Kraft inequality
            Kraft's inequality
            Proof of Kraft's inequality
         3.2.3.6 Limits of the optimal length of codewords
            3.2.3.6.1 Shannon's coding theorem
            3.2.3.6.2 A sequence of n symbols (elements)
            3.2.3.6.3 Application of the previous results
         3.2.3.7 Information and utility (coding, portfolio analysis)
4 The concept of entropy in physics
      The laws of thermodynamics
   4.1 Macroscopic entropy
      4.1.1 Sadi Carnot 1824
      4.1.2 Clausius's entropy 1850
      4.1.3 Increase of entropy in a closed system
      4.1.4 Prigogine's entropy
      4.1.5 Entropy balance equation
      4.1.6 Gibbs's free energy and the quality of the energy
      4.1.7 Considerations on the macroscopic entropy
         4.1.7.1 Irreversible transformations
         4.1.7.2 Perpetuum mobile and transfer of heat
   4.2 Statistical entropy
      4.2.1 Boltzmann's entropy
      4.2.2 Derivation of Boltzmann's entropy
         4.2.2.1 Variation, permutation and the formula of Stirling
         4.2.2.2 Special case: Two states
         4.2.2.3 Example: Lottery
      4.2.3 The Boltzmann factor
      4.2.4 Maximum entropy in equilibrium
      4.2.5 Statistical interpretation of entropy
      4.2.6 Examples regarding statistical entropy
         4.2.6.1 Energy and fluctuation
         4.2.6.2 Quantized oscillator
      4.2.7 Brillouin-Schrödinger negentropy
         4.2.7.1 Brillouin: Precise definition of information
         4.2.7.2 Negentropy as a generalization of Carnot's principle
            Maxwell's demon
      4.2.8 Information measures of Hartley and Boltzmann
         4.2.8.1 Examples
      4.2.9 Shannon's entropy
   4.3 Dynamic entropy
      4.3.1 Eddington and the arrow of time
      4.3.2 Kolmogorov's entropy
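For orientation, the three measures named in the preface can be written in their standard textbook forms. The notation below (H, p, k, \hat{L}, I(\theta), f(X;\theta)) is the conventional one and is an assumption of this sketch, not an excerpt from the book's own derivations:

% Standard definitions in conventional notation (not excerpted from the book)
\begin{align*}
  % Shannon's information (entropy) of a discrete random variable X
  H(X) &= -\sum_{x} p(x)\,\log p(x) \\[4pt]
  % Akaike's information criterion: k model parameters, maximized likelihood \hat{L}
  \mathrm{AIC} &= 2k - 2\ln\hat{L} \\[4pt]
  % Fisher information of a parameter \theta and the Cramér-Rao bound
  % for any unbiased estimator \hat{\theta}
  I(\theta) &= \mathrm{E}\!\left[\left(\frac{\partial}{\partial\theta}\,\ln f(X;\theta)\right)^{2}\right],
  \qquad \mathrm{Var}\bigl(\hat{\theta}\bigr) \;\ge\; \frac{1}{I(\theta)}
\end{align*}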
