Abstract Methods in Information Theory

261 pages · 1999 · English
ABSTRACT METHODS IN INFORMATION THEORY

SERIES ON MULTIVARIATE ANALYSIS
Editor: M. M. Rao

Published
Vol. 1: Martingales and Stochastic Analysis, by J. Yeh
Vol. 2: Multidimensional Second Order Stochastic Processes, by Y. Kakihara
Vol. 3: Mathematical Methods in Sample Surveys, by H. G. Tucker
Vol. 4: Abstract Methods in Information Theory, by Y. Kakihara

Forthcoming
Convolution Structures and Stochastic Processes, by R. Lasser
Topics in Circular Statistics, by S. R. Jammalamadaka and A. SenGupta

ABSTRACT METHODS IN INFORMATION THEORY
Yuichiro Kakihara
Department of Mathematics, University of California, Riverside, USA

World Scientific
Singapore · New Jersey · London · Hong Kong

Published by World Scientific Publishing Co. Pte. Ltd.
P O Box 128, Farrer Road, Singapore 912805
USA office: Suite 1B, 1060 Main Street, River Edge, NJ 07661
UK office: 57 Shelton Street, Covent Garden, London WC2H 9HE

Library of Congress Cataloging-in-Publication Data
Kakihara, Yuichiro.
Abstract methods in information theory / Yuichiro Kakihara.
p. cm. (Series on multivariate analysis ; v. 4)
Includes bibliographical references.
ISBN 9810237111 (alk. paper)
1. Information theory. 2. Functional analysis. I. Title. II. Series.
Q360.K35 1999
003'.54 dc21 99-31711 CIP

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library.

Copyright © 1999 by World Scientific Publishing Co. Pte. Ltd.
All rights reserved. This book, or parts thereof, may not be reproduced in any form or by any means, electronic or mechanical, including photocopying, recording or any information storage and retrieval system now known or to be invented, without written permission from the Publisher.

For photocopying of material in this volume, please pay a copying fee through the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, USA.
In this case permission to photocopy is not required from the publisher.

Printed in Singapore by Uto-Print.

Dedicated to Professor Hisaharu Umegaki

PREFACE

Half a century has passed since C. E. Shannon published his epoch-making paper entitled "A mathematical theory of communication" in 1948. Thereafter the so-called "information theory" began to grow and has now established a firm and broad field of study. Viewed from a mathematical angle, information theory might be thought of as having the following four parts: (1) the mathematical structure of information sources, (2) the theory of entropy as the amount of information, (3) the theory of information channels, and (4) the theory of coding.

Probabilistic and algebraic methods have mainly been used to develop information theory. Since the early stage of the expansion of information theory, however, measure theoretic and functional analysis methods have also been applied, and they provide a powerful tool for obtaining rigorous results in this theory. The purpose of this book is to present the first three parts of information theory, mentioned above, in the environment of functional analysis, in addition to probability theory.

Here are a couple of examples in each of which functional analysis played a crucial role in obtaining important results in information theory. The coincidence of the ergodic capacity C_e and the stationary capacity C_s for a certain channel was one of the most important problems in the late 1950s. L. Breiman (1960) showed that for a finite memory channel the equality C_e = C_s holds and, moreover, C_e is attained by some ergodic input source (= measure), invoking the Krein-Milman theorem on the weak* compact convex set P_s(X) of all stationary input sources. Another such example appeared in a characterization of ergodic channels. In the late 1960s H. Umegaki and Y.
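Breiman's result can be summarized in a display; the notation below follows the standard convention (which this preface does not itself fix, so take it as an assumption), with R(μ; ν) denoting the transmission rate of input μ over channel ν:

```latex
% Breiman (1960), for a finite memory channel \nu:
C_s = \sup_{\mu \in P_s(X)} R(\mu;\nu), \qquad
C_e = \sup_{\substack{\mu \in P_s(X) \\ \mu \text{ ergodic}}} R(\mu;\nu),
\qquad\text{and}\qquad C_e = C_s,
% with the supremum attained at an ergodic \mu, since the ergodic
% sources are exactly the extreme points of the weak* compact convex
% set P_s(X) (Krein-Milman theorem).
```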
Nakamura independently proved that a stationary channel is ergodic if and only if it is an extreme point of the convex set of all stationary channels. Umegaki observed a one-to-one correspondence between the set of channels and a set of certain averaging operators from the set of bounded measurable functions on the compound space to the set of those functions on the input. A channel is then identified with an operator, called a channel operator, and hence we can make full use of functional analysis in studying channels. In this book, readers will find how functional analysis helps to describe information theory, especially the mathematical structure of information sources and channels, in an effective way.

Here is a brief summary of this book. In Chapter I, entropy is considered as the amount of information. Shannon's entropy for finite schemes is defined and its basic properties are examined together with its axioms. After collecting fundamental properties of conditional expectation and probability, Kolmogorov-Sinai's entropy is then obtained for a measure preserving transformation. Some fundamental properties of the Kolmogorov-Sinai entropy are presented along with the Kolmogorov-Sinai theorem. Algebraic models are introduced to describe probability measures and measure preserving transformations. Some conjugacy problems are studied using algebraic models. When we fix a measurable transformation and a finite partition, we can consider Kolmogorov-Sinai's entropy as a functional on the set of invariant (with respect to the transformation) probability measures, called an entropy functional. This functional is extended to one defined on the set of all complex valued invariant measures, and its integral representation is obtained. Relative entropy and Kullback-Leibler information are also studied in connection with sufficiency, which is one of the most important notions in statistics, and with hypothesis testing.
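The two basic quantities of Chapter I can be illustrated concretely for finite schemes. A minimal sketch in Python (the function names are mine, not the book's; both quantities are computed in bits):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i of a finite scheme.
    Terms with p_i = 0 contribute 0, by the convention 0 log 0 = 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

def kl_information(p, q):
    """Kullback-Leibler information I(p || q) = sum_i p_i log2(p_i / q_i).
    Assumes q_i > 0 wherever p_i > 0 (absolute continuity)."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin carries one bit of information per symbol:
print(shannon_entropy([0.5, 0.5]))               # 1.0
# KL information vanishes exactly when the two schemes coincide:
print(kl_information([0.5, 0.5], [0.5, 0.5]))    # 0.0
```

The `if p > 0` guards implement the 0 log 0 = 0 convention directly rather than relying on limits.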
In Chapter II, information sources are considered. Using an alphabet message space as a model, we describe information sources on a compact Hausdorff space. Mean and Pointwise Ergodic Theorems are stated and proved. Ergodicity is one of the important concepts and its characterization is presented in detail. Strong and weak mixing properties are also examined in some detail. Among the nonstationary sources, AMS (= asymptotically mean stationary) sources are of interest and the structure of this class is studied. The Shannon-McMillan-Breiman Theorem is then formulated for a stationary and an AMS source; it is regarded as the ergodic theorem in information theory. Ergodic decomposition of a stationary source is established and is applied to obtain another type of integral representation of an entropy functional.

Chapter III, the main part of this book, is devoted to information channels. After defining channels, a one-to-one correspondence between a set of channels and a set of certain averaging operators is established, as mentioned before. Strongly and weakly mixing channels are defined as a generalization of finite dependent channels and their basic properties are obtained. Ergodicity of stationary channels is discussed and various necessary and sufficient conditions for it are given. For AMS channels, absolute continuity plays a special role in characterizing ergodicity. Capacity and transmission rate are defined for stationary channels. Coincidence of the ergodic and stationary capacities is proved under certain conditions. Finally, Shannon's coding theorems are stated and proved.

Special topics on channels are considered in Chapter IV. When a channel has a noise source, some properties of such a channel are studied. If we regard a channel as a vector (or measure) valued function on the input space, then its measurabilities are clarified. Some approximation problems of channels are treated.
When the output space is a (locally) compact abelian group, a harmonic analysis method can be applied to channel theory. Some aspects of this viewpoint are presented in detail. Finally, a noncommutative channel theory is introduced. We use a C*-algebra approach to formulate channel operators as well as other aspects of the noncommutative extension.

Another purpose of this book is to present contributions of Professor Hisaharu Umegaki and his school to information theory. His selected papers were published under the title "Operator Algebras and Mathematical Information Theory" (Kaigai, Tokyo, 1985). As one of his students, the author is pleased to have a chance to write this monograph.

In the text, III.4.5 denotes the fifth item in Section 4 of Chapter III. In a given chapter, only the section and item number are used, and in a given section, only the item number is used.

The author is grateful to Professor M. M. Rao at the University of California, Riverside (UCR) for reading the manuscript and for his valuable suggestions. UCR has provided the author with a very fine environment, where he could prepare this monograph. He is also grateful for the hospitality of UCR.

Yuichiro Kakihara
Riverside, California
April, 1999
