
Transmitting and Gaining Data: Rudolf Ahlswede’s Lectures on Information Theory 2 PDF

471 pages · 2015 · 5.673 MB · English

Preview: Transmitting and Gaining Data: Rudolf Ahlswede’s Lectures on Information Theory 2

Foundations in Signal Processing, Communications and Networking 11
Series Editors: Wolfgang Utschick · Holger Boche · Rudolf Mathar

Rudolf Ahlswede’s Lectures on Information Theory 2
Transmitting and Gaining Data
Alexander Ahlswede · Ingo Althöfer · Christian Deppe · Ulrich Tamm, Editors

Foundations in Signal Processing, Communications and Networking, Volume 11
Series editors: Wolfgang Utschick, Ingolstadt, Germany; Holger Boche, München, Germany; Rudolf Mathar, Aachen, Germany
More information about this series at http://www.springer.com/series/7603

Rudolf Ahlswede
Transmitting and Gaining Data: Rudolf Ahlswede’s Lectures on Information Theory 2
Edited by Alexander Ahlswede, Ingo Althöfer, Christian Deppe, and Ulrich Tamm

Author: Rudolf Ahlswede (1938–2010), Faculty of Mathematics, University of Bielefeld, Bielefeld, Germany
Editors: Alexander Ahlswede, Bielefeld, Germany; Ingo Althöfer, Friedrich-Schiller University, Jena, Germany; Christian Deppe, University of Bielefeld, Bielefeld, Germany; Ulrich Tamm, Bielefeld University of Applied Sciences, Bielefeld, Germany

ISSN 1863-8538    ISSN 1863-8546 (electronic)
ISBN 978-3-319-12522-0    ISBN 978-3-319-12523-7 (eBook)
DOI 10.1007/978-3-319-12523-7
Library of Congress Control Number: 2014953292
Mathematics Subject Classification (2010): 94, 94A, 68P, 68Q
Springer Cham Heidelberg New York Dordrecht London
© Springer International Publishing Switzerland 2015

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. Exempted from this legal reservation are brief excerpts in connection with reviews or scholarly analysis or material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work. Duplication of this publication or parts thereof is permitted only under the provisions of the Copyright Law of the Publisher’s location, in its current version, and permission for use must always be obtained from Springer. Permissions for use may be obtained through RightsLink at the Copyright Clearance Center. Violations are liable to prosecution under the respective Copyright Law.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

While the advice and information in this book are believed to be true and accurate at the date of publication, neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or omissions that may be made. The publisher makes no warranty, express or implied, with respect to the material contained herein.

Printed on acid-free paper
Springer is part of Springer Science+Business Media (www.springer.com)

Preface¹

Classical information processing consists of the main tasks of gaining knowledge, storage, transmission, and hiding data.

The first named task is the prime goal of Statistics, and for the next two Shannon presented an impressive mathematical theory, called Information Theory, which he based on probabilistic models.

The basics in this theory are concepts of codes—lossless and lossy—with small error probabilities in spite of noise in the transmission, which is modeled by channels.
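The channel models the preface refers to are developed formally in the book itself; purely as orientation, and not taken from the book, the simplest textbook instance is the binary symmetric channel (BSC), which flips each transmitted bit independently with crossover probability p and has Shannon capacity 1 − h(p) bits per use, where h is the binary entropy function. A minimal Python sketch (function names are our own):

```python
import random
from math import log2

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel: 1 - h(p) bits per use."""
    if p in (0.0, 1.0):
        return 1.0  # a noiseless or deterministically flipping channel carries 1 bit/use
    h = -p * log2(p) - (1 - p) * log2(1 - p)  # binary entropy h(p)
    return 1.0 - h

def bsc_transmit(bits, p, seed=0):
    """Model the channel noise: flip each bit independently with probability p."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]

print(bsc_capacity(0.1))                   # ~0.531 bits per channel use
print(bsc_transmit([0, 1, 1, 0, 1], 0.1))  # the same word, with random bit flips
```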
Another way to deal with noise is based on a combinatorial concept of error-correcting codes, pioneered by Hamming. This leads to another way to look at Information Theory: instead of classifying it by its tasks, one can also classify it by its mathematical structures and methods, primarily probabilistic versus combinatorial.

Finally, Shannon also laid the foundations of a theory concerning hiding data, called Cryptology. Its task is in a sense dual to transmission, and we therefore prefer to view it as a subfield of Information Theory. Viewed by mathematical structures, there is again already in Shannon’s work a probabilistic and a combinatorial or complexity-theoretical model.

The lectures are suitable for graduate students in Mathematics, and also in Theoretical Computer Science, Physics, and Electrical Engineering after some preparations in basic Mathematics. The lectures can be selected for courses or supplements of courses in many ways.

Rudolf Ahlswede

¹ This is the original preface written by Rudolf Ahlswede for the first 1,000 pages of his lectures. This volume consists of the second third of these pages.

Contents

Part I  Transmitting Data

1  Special Channels
   1.1  Lecture on the Weak Capacity of Averaged Channels
        1.1.1  Introduction
        1.1.2  Definitions
        1.1.3  A Channel Without Strong Capacity
        1.1.4  The Weak Capacity of an Averaged Discrete Channel
        1.1.5  The Weak Capacity of an Averaged Semi-continuous Channel
        1.1.6  Nonstationary Averaged Channels
        1.1.7  Averages of Channels with Respect to General Probability Distributions
   1.2  Lecture on Further Results for Averaged Channels Including C(λ)-Capacity, Side Information, Effect of Memory, and a Related, Not Familiar Optimistic Channel
        1.2.1  Averaged Channels Where Either the Sender or the Receiver Knows the Individual Channel Which Governs the Transmission
        1.2.2  Another Channel: The Optimistic Channel
   1.3  Lecture on the Structure of Capacity Functions for Compound Channels
        1.3.1  Definitions and Introduction of the Capacity Functions C(λ), C(λR), C(λR)
        1.3.2  Auxiliary Results
        1.3.3  The Structure of C(λ)
        1.3.4  The Relationships of C(λR), C(λR), and C(λ)
        1.3.5  Evaluation of C(λ) in Several Examples
   1.4  Lecture on Algebraic Compositions (Rings and Lattices) of Channels
        1.4.1  A Ring of Channels
        1.4.2  Min Channels Known as Compound Channels CC
        1.4.3  Compound Channels Treated with Maximal Coding
        1.4.4  For Comparison: The Method of Random Codes
        1.4.5  The New Channel ΛC
        1.4.6  The ΛC with Partial Knowledge of the Sender
   1.5  Lecture on Discrete Finite State Channels
        1.5.1  Definitions and Examples
        1.5.2  Two Performance Criteria: Lower and Upper Maximal Information Rates for the FSC
        1.5.3  Two Further Instructive Examples of Channels
        1.5.4  Independently Selected States
        1.5.5  The Finite State Channel with State Calculable by Both Sender and Receiver
        1.5.6  Stochastically Varying Channels
        1.5.7  Random Codes and Weakly Varying Channels
        1.5.8  Side Information
        1.5.9  Sources with Arbitrarily Varying Letter Probabilities
        1.5.10  Channels with Varying Transmission Probabilities
   1.6  Lecture on Gallager’s Converse for General Channels Including DFMC and FSC
        1.6.1  Lower Bounding Error Probabilities of Digits
        1.6.2  Application to FSC
        1.6.3  Indecomposable Channels
        1.6.4  A Lower Bound on the Probability of Decoding Error for the FSC
   1.7  Lecture on Information and Control: Matching Channels
        1.7.1  New Concepts and Results
        1.7.2  Definitions, Known Facts, and Abbreviations
        1.7.3  The Deterministic Matching Channel and Matching in Products of Bipartite Graphs
        1.7.4  Main Results on Matching in Products of Bipartite Graphs
        1.7.5  Matching in Products of Non-identical Bipartite Graphs
        1.7.6  An Exact Formula for the Matching Number of Powers of “Stared” Bipartite Graphs
        1.7.7  Two Examples Illustrating the Significance of Theorems 45 and 46
        1.7.8  Multi-way Deterministic Matching Channels
        1.7.9  The Controller Falls Asleep—on Matching Zero-Error Detection Codes
        1.7.10  The Matching Zero-Error Detection Capacity C_mde in a Genuine Example
        1.7.11  Feedback and also Randomization Increase the Capacity of the Matching Channel
        1.7.12  The Capacity for Matching Zero-Error Detection Codes with Feedback (MDCF) for W₀
        1.7.13  Identification for Matching Channels
        1.7.14  Zero-Error Detection
        1.7.15  A Digression: Three Further Code Concepts
   References

2  Algorithms for Computing Channel Capacities and Rate-Distortion Functions
   2.1  Lecture on Arimoto’s Algorithm for Computing the Capacity of a DMC (see the illustrative sketch after this contents excerpt)
        2.1.1  Mutual Information and Equivocation
        2.1.2  The Algorithm and Its Convergence
        2.1.3  The Convergence of the Algorithm
        2.1.4  Speed of Convergence
        2.1.5  Upper and Lower Bounds on the Capacity
   2.2  Lecture on Blahut’s Algorithm for Computing Rate-Distortion Functions
        2.2.1  Basic Definitions and the Algorithm
        2.2.2  Convergence of the Algorithm
   2.3  Lecture on a Unified Treatment of the Computation of the Δ-Distortion of a DMS and the Capacity of a DMC under Input Cost Constraint
        2.3.1  Preliminaries
        2.3.2  The Computation of G(s)
        2.3.3  Capacity Computing Algorithm
   References

3  Shannon’s Model for Continuous Transmission
   3.1  Lecture on Historical Remarks
   3.2  Lecture on Fundamental Theorems of Information Theory
        3.2.1  Stationary Sources
        3.2.2  Methods to Construct Sources
        3.2.3  The Ergodic Theorem in Hilbert Space
        3.2.4  The Theorem of McMillan
        3.2.5  Ergodic Sources
   3.3  Lecture on Stationary Channels
        3.3.1  A Concept of Channels
        3.3.2  Methods for the Construction of Channels
        3.3.3  Ergodic Channels
   3.4  Lecture on “Informational Capacity” and “Error Capacity” for Ergodic Channels
        3.4.1  Definition of the “Informational Capacity”
        3.4.2  The Coding Theorem
   3.5  Lecture on Another Definition of Capacity
   3.6  Lecture on Still Another Type of Operational Capacities—Risk as Performance Criterium
        3.6.1  Introduction
        3.6.2  Nedoma’s Fundamental Paper Has Three Basic Ideas
        3.6.3  Report on the Results
        3.6.4  Discussion
   3.7  Lecture on the Discrete Finite-Memory Channel According to Feinstein and Wolfowitz
        3.7.1  Discussion
   3.8  Lecture on the Extension of the Shannon/McMillan Theorem by Jacobs
   3.9  Lecture on Achieving Channel Capacity in DFMC
        3.9.1  Introduction
        3.9.2  A Topology on the Linear Space L = L_P(Ω, B) of σ-Additive Finite Real-Valued Functions h: B → R
        3.9.3  R(P) Is an Upper Semi-continuous Functional (u.s.c.) for DFMC
   3.10  Lecture on the Structure of the Entropy Rate of a Discrete, Stationary Source with Finite Alphabet (DSS)
   3.11  Lecture on the Transmission of Bernoulli Sources Over Stationary Channels
        3.11.1  Sources
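Chapter 2 of the contents above is devoted to iterative algorithms for computing channel capacities and rate-distortion functions. The book’s own derivation is not part of this preview; purely as orientation, the following minimal Python sketch implements the standard Blahut-Arimoto alternating maximization for a DMC (variable names, the tolerance, and the stopping rule are our own assumptions, not the book’s):

```python
import numpy as np

def blahut_arimoto(W: np.ndarray, tol: float = 1e-9, max_iter: int = 10_000):
    """Capacity (in bits) of a DMC with transition matrix W[x, y] = W(y|x).

    Alternating maximization: fix the input distribution p, form the output
    distribution q = pW, then reweight p by exp(D(W(.|x) || q)) and renormalize.
    """
    n_in = W.shape[0]
    p = np.full(n_in, 1.0 / n_in)  # start from the uniform input distribution
    for _ in range(max_iter):
        q = p @ W                  # current output distribution
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(W > 0, np.log(W / q), 0.0)
        d = (W * log_ratio).sum(axis=1)   # D(W(.|x) || q) per input letter, in nats
        p_new = p * np.exp(d)
        p_new /= p_new.sum()
        converged = np.max(np.abs(p_new - p)) < tol
        p = p_new
        if converged:
            break
    q = p @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(W > 0, np.log(W / q), 0.0)
    capacity_nats = float((p[:, None] * W * log_ratio).sum())  # I(p; W)
    return capacity_nats / np.log(2), p   # capacity in bits, optimal input law

if __name__ == "__main__":
    eps = 0.1                             # BSC(0.1): capacity = 1 - h(0.1)
    W = np.array([[1 - eps, eps], [eps, 1 - eps]])
    C, p_opt = blahut_arimoto(W)
    print(round(C, 3), p_opt)             # 0.531, [0.5 0.5]
```

Run on the binary symmetric channel, the iteration reproduces the closed-form capacity 1 − h(p) with the uniform input distribution, which is a convenient sanity check before applying it to channels without a closed form.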
