
Information Theory for Continuous Systems (PDF)

321 pages · 30.314 MB · English

Preview: Information Theory for Continuous Systems

INFORMATION THEORY FOR CONTINUOUS SYSTEMS

Shunsuke Ihara
Department of Mathematics, College of General Education, Nagoya University

World Scientific
Singapore · New Jersey · London · Hong Kong

Published by World Scientific Publishing Co. Pte. Ltd.
P O Box 128, Farrer Road, Singapore 9128
USA office: Suite 1B, 1060 Main Street, River Edge, NJ 07661
UK office: 73 Lynton Mead, Totteridge, London N20 8DH

Library of Congress Cataloging-in-Publication Data
Information theory for continuous systems / author, Shunsuke Ihara.
p. cm.
Includes bibliographical references and index.
ISBN 9810209851
1. Entropy (Information theory) - Mathematical models. 2. Neural transmission - Mathematical models. I. Ihara, Shunsuke.
Q370.I56 1993 003'.54 - dc20 93-23179 CIP

Copyright © 1993 by World Scientific Publishing Co. Pte. Ltd. All rights reserved. This book, or parts thereof, may not be reproduced in any form or by any means, electronic or mechanical, including photocopying, recording or any information storage and retrieval system now known or to be invented, without written permission from the Publisher.

For photocopying of material in this volume, please pay a copying fee through the Copyright Clearance Center, Inc., 27 Congress Street, Salem, MA 01970, USA.

Printed in Singapore by Utopia Press.

Acknowledgments

I would like to express my gratitude to Professor Takeyuki Hida for introducing me to this important research area, and for his valuable suggestions and encouragement, without which I could not have completed the present work. Among the many others who have made contributions, I would particularly like to thank Professors Tunekiti Sirao, Hisao Nomoto, Nobuyuki Ikeda, Hiroshi Kunita, Izumi Kubo, Masuyuki Hitsuda, Charles R. Baker and Kenjiro Yanagi for their valuable comments.

Preface

This book is intended to provide a comprehensive and up-to-date treatment of information theory, with special emphasis on continuous information transmission systems rather than discrete ones.

Information theory in the strictest sense was largely originated by C.E. Shannon. In the paper titled "A mathematical theory of communication" [103], published in 1948, Shannon defined such fundamental quantities as entropy and mutual information, introduced general models of communication systems, and proved coding theorems. Since then, Shannon's basic ideas and results have been extended and generalized by information theorists in both mathematics and engineering.

There are two large trends in the development of information theory. The first is the analysis of information-theoretic quantities such as entropy and mutual information. The concept of entropy was proposed, in information theory, as a measure of the uncertainty or randomness of a random variable, in other words, as a measure of the information carried by the random variable. A generalization of the idea of entropy, called relative entropy, was introduced by S. Kullback (see [83]). This information measure, also referred to as information divergence or the Kullback-Leibler information number, can be interpreted as a measure of the similarity between two probability distributions. Many results for entropy and mutual information can be viewed as particular cases of results for relative entropy. The second trend is the generalization of coding theorems to more general and more complicated information transmission systems. The field of information theory has grown considerably.
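For orientation, here are the standard forms of these three quantities in conventional notation. This is a sketch of the usual textbook definitions, not a quotation from Ihara's text, whose conventions (base of logarithm, level of generality) may differ.

```latex
% Entropy of a discrete random variable X with distribution p:
H(X) = -\sum_{x} p(x) \log p(x)

% Relative entropy (Kullback-Leibler divergence) of P with respect to Q,
% defined when P is absolutely continuous with respect to Q:
D(P \,\|\, Q) = \int \log \frac{dP}{dQ} \, dP

% Mutual information of X and Y, itself a relative entropy: the divergence
% of the joint distribution from the product of the marginals:
I(X; Y) = D\bigl( P_{XY} \,\|\, P_X \otimes P_Y \bigr)
```

The last identity is what makes relative entropy the unifying object of the first trend: mutual information is literally a relative entropy, and entropy can be recovered from divergences against suitable reference measures.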
Modern information theory now deals with various problems of information transmission, based on probability theory and other branches of mathematical analysis, and is more than a mathematical theory of communication. It is tied to estimation theory, filtering theory, and decision theory more closely than ever.

Information theory is based on mathematics, especially on probability theory and mathematical statistics. Information transmission is, more or less, disturbed by noises which arise from a number of causes. Mathematically, the noises can be represented by suitable random variables or stochastic processes. Moreover, a significant aspect of communication theory is that the actual message or signal to be transmitted is one selected from a set of possible messages or signals. The system must be designed to operate for every possible selection, not just the one which will actually be chosen, since this is unknown at the time of design. The sets of possible messages and signals are given together with probability distributions, so messages and signals are also represented by random variables or stochastic processes. This indicates why the stochastic approach is so useful for the development of information theory. At the same time, information theory has made fundamental contributions not only to communication theory but also to statistical mechanics, probability theory, and statistics.

The contents of this book fall roughly into two parts. In the first part, entropy, mutual information and relative entropy are systematically analyzed, and a unified treatment of these quantities in information theory, probability theory and mathematical statistics is presented. Links between information theory and topics from probability theory or mathematical statistics are frequently expressed in terms of relative entropy. It is shown that relative entropy is of central importance in large deviation theorems, hypothesis testing, and maximum entropy methods in statistical inference.

The second part deals mostly with information theory for continuous information transmission systems, based on the most recent methods of probability theory and on the analysis developed in the first part. The Gaussian channel is a communication channel disturbed by an additive noise with a Gaussian distribution. It is known, from the central limit theorem, that a noise caused by a large number of small random effects has a probability distribution that is approximately Gaussian. This indicates that the study of Gaussian channels is especially important, not only from the theoretical point of view but also from the viewpoint of applications. One of the most interesting and complicated information transmission systems is a system with feedback. Feedback systems can be handled on the basis of the causal calculus and the innovation approach developed in probability theory.

There are many excellent books on information theory. While most of them deal with discrete information transmission systems, few treat systems that are continuous in both the state space and the time parameter space. One of the major aims of this book is to cover the most recent developments in information theory for continuous communication systems, including Gaussian channels.
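As a concrete reference point for the Gaussian channels emphasized above, the discrete time additive Gaussian noise channel and its capacity under an average power constraint can be written as follows. This is the standard textbook formulation, not an excerpt from the book; the power bound P and noise variance sigma^2 are generic symbols.

```latex
% Discrete time Gaussian channel: input X_n, output Y_n, and independent
% Gaussian noise Z_n, with an average power constraint P on the input:
Y_n = X_n + Z_n, \qquad Z_n \sim N(0, \sigma^2), \qquad
\frac{1}{N} \sum_{n=1}^{N} E[X_n^2] \le P

% Shannon's capacity formula (in nats per channel use, natural logarithms):
C = \frac{1}{2} \log\!\left( 1 + \frac{P}{\sigma^2} \right)
```

The chapter descriptions below indicate how the book generalizes this picture, to continuous time and to channels with feedback.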
Topics on information theory and stochastic processes are treated in a moderately rigorous manner from the mathematical point of view. At the same time, much effort is put into reaching the essentials of each matter quickly and into maintaining a spirit favorable to applications, avoiding extremes of generality or abstraction. Most of the results in this book are given as theorems with proofs. It is not necessary to understand the proofs completely to gain a general grasp of the subject; they can be skipped on a first reading and studied more carefully on a second. It is my hope that this book is accessible both to engineers who are interested in the mathematical aspects and general models of the theory of information transmission and to mathematicians who are interested in probability theory, mathematical statistics and their applications to information theory.

In Chapter 1, we introduce the fundamental quantities of information theory - entropy, relative entropy and mutual information - and study the relationships among them and some of their interpretations. They play important roles throughout the text. These quantities are defined for stochastic processes in Chapter 2, where stationary processes in particular are studied from the information-theoretic point of view. Chapter 3 is devoted to the study of large deviation theorems, maximum entropy and minimum relative entropy spectral analysis, and hypothesis testing. Relative entropy plays a fundamental role in the investigation of these topics. A large deviation theorem, which is a limit theorem in probability theory, shows that the speed of convergence is described in terms of relative entropy (see the sketch after this preface). A model selection that maximizes the uncertainty or entropy yields the maximum entropy approach to statistical inference, and the minimum relative entropy method can be viewed as a generalization of the maximum entropy method.

Interpretations of basic concepts in communication theory, such as channel capacity and the achievable rate of information transmission, are given in Chapter 4, where fundamental problems of information transmission over a communication channel are also investigated. The last two chapters are devoted to a study of information transmission over communication channels with additive noises, especially Gaussian channels with or without feedback. Such subjects as mutual information, channel capacity, optimal coding schemes, and coding theorems are investigated in detail. Chapter 5 covers the discrete time case; Chapter 6, on continuous time Gaussian channels, is unique to this book. Some terminology and results from probability theory which are used in the text are collected in the Appendix.
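The connection between convergence rates and relative entropy mentioned for Chapter 3 is exemplified by Sanov-type results. Stated informally, and not as it appears in the book:

```latex
% Sanov-type large deviation statement: the empirical distribution \hat{P}_n
% of n i.i.d. samples from P falls in a (suitably regular) set A of
% distributions with probability decaying exponentially in n, the exponent
% being the smallest relative entropy over A:
\Pr\{ \hat{P}_n \in A \} \approx \exp\Bigl( -n \inf_{Q \in A} D(Q \,\|\, P) \Bigr)
```

Minimizing D(Q || P) over a constraint set is also exactly the minimum relative entropy method mentioned above, which is one reason the same quantity governs both the limit theorems and the inference methods.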
