
Information Theory for Electrical Engineers (PDF)

282 pages · 2018 · 2.9 MB · English
by Orhan Gazi

Preview: Information Theory for Electrical Engineers

Signals and Communication Technology
More information about this series at http://www.springer.com/series/4748

Orhan Gazi
Information Theory for Electrical Engineers

Orhan Gazi
Department of Electronics and Communication Engineering
Çankaya University, Ankara, Turkey

ISSN 1860-4862 / ISSN 1860-4870 (electronic), Signals and Communication Technology
ISBN 978-981-10-8431-7 / ISBN 978-981-10-8432-4 (eBook)
https://doi.org/10.1007/978-981-10-8432-4
Library of Congress Control Number: 2018932996

© Springer Nature Singapore Pte Ltd. 2018

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Printed on acid-free paper.

This Springer imprint is published by Springer Nature. The registered company is Springer Nature Singapore Pte Ltd. The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721, Singapore.

Preface

Information is a phenomenon that has meaning in the human brain. Almost 70 years ago, Shannon published his paper in which he defined fundamental mathematical concepts for identifying and measuring information. Since then, communication technology has improved enormously. A solid grasp of the fundamental concepts of information theory is essential for understanding modern communication technologies. This book has been written especially for electrical and communication engineers working on communication subjects. To follow the topics covered here, the reader must have a working knowledge of probability and random variables; without it, the material will be very difficult to understand.

Although this book has been written for graduate courses, any interested person can read and benefit from it. We paid particular attention to the understandability of the topics, and for this reason we presented the material in detail using simple, explicit mathematics, and tried to provide as many fully worked examples as we could.

The book consists of four chapters. In Chap. 1, we explain the concepts of entropy and mutual information for discrete random variables. We advise the reader to study the concepts in Chap. 1 thoroughly before proceeding to the other chapters. In Chap. 2, entropy and mutual information for continuous random variables are explained, along with channel capacity. Chapter 3 is devoted to typical sequences and data compression.
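The preface names the central quantities of Chap. 1 without defining them. As a rough orientation, here is a minimal Python sketch (ours, not taken from the book) of the standard definitions: discrete entropy H(X) = -Σ p(x) log2 p(x), and mutual information computed via the identity I(X;Y) = H(X) + H(Y) - H(X,Y). The joint pmf used is a small hypothetical example.

```python
import numpy as np

def entropy(p):
    """Discrete entropy H = -sum(p * log2 p), skipping zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint pmf p(x, y) for two binary random variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

H_x  = entropy(p_x)
H_y  = entropy(p_y)
H_xy = entropy(p_xy.ravel())

# Mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y).
I_xy = H_x + H_y - H_xy
print(f"H(X) = {H_x:.4f} bits, H(Y) = {H_y:.4f} bits, I(X;Y) = {I_xy:.4f} bits")
```

For this pmf the marginals are uniform, so H(X) = H(Y) = 1 bit, and the dependence between X and Y yields I(X;Y) ≈ 0.278 bits.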
In many information theory books, the channel coding theorem is covered in a few pages as a section of a chapter. However, the channel coding theorem is one of Shannon's most important discoveries, and it is critical for electrical and communication engineers to understand it well. For this reason, the channel coding theorem is treated in detail in a separate chapter, Chap. 4, where we also tried to provide original examples illustrating the concepts of rate and capacity achievability. Since this is the first edition of the book, we included only the most fundamental concepts. In future editions, we plan to extend the content to cover recent modern communication technologies.

As a last word, I dedicate this book to my lovely daughter "Vera GAZİ", who was four years old when this book was being written. Her love was always a motivating factor for my studies.

Maltepe-Ankara, Turkey
September 2017
Orhan Gazi

Contents

1 Concept of Information, Discrete Entropy and Mutual Information
  1.1 The Meaning of Information
  1.2 Review of Discrete Random Variables
  1.3 Discrete Entropy
    1.3.1 Interpretation of Entropy
    1.3.2 Joint Entropy
    1.3.3 Conditional Entropy
    1.3.4 Properties of the Discrete Entropy
    1.3.5 Log-Sum Inequality
  1.4 Information Channels
  1.5 Mutual Information
    1.5.1 Properties of the Mutual Information
    1.5.2 Mutual Information Involving More Than Two Random Variables
  1.6 Probabilistic Distance
  1.7 Jensen's Inequality
  1.8 Fano's Inequality
  1.9 Conditional Mutual Information
    1.9.1 Properties of Conditional Mutual Information
    1.9.2 Markov Chain
    1.9.3 Data Processing Inequality for Mutual Information
  1.10 Some Properties for Mutual Information

2 Entropy for Continuous Random Variables, Discrete Channel Capacity, Continuous Channel Capacity
  2.1 Entropy for Continuous Random Variable
    2.1.1 Differential Entropy
    2.1.2 Joint and Conditional Entropies for Continuous Random Variables
    2.1.3 The Relative Entropy of Two Continuous Distributions
  2.2 Mutual Information for Continuous Random Variables
    2.2.1 Properties for Differential Entropy
    2.2.2 Conditional Mutual Information for Continuous Random Variables
    2.2.3 Data Processing Inequality for Continuous Random Variables
  2.3 Channel Capacity
    2.3.1 Discrete Channel Capacity
  2.4 Capacity for Continuous Channels, i.e., Continuous Random Variables
    2.4.1 Capacity of the Gaussian Channel with Power Constraint
  2.5 Bounds and Limiting Cases on AWGN Channel Capacity
    2.5.1 Effect of Information Signal Bandwidth on AWGN Channel Capacity
    2.5.2 Effect of Signal to Noise Ratio on the Capacity of AWGN Channel

3 Typical Sequences and Data Compression
  3.1 Independent Identically Distributed Random Variables (IID Random Variables)
    3.1.1 The Weak Law of Large Numbers
  3.2 Convergence of Random Variable Sequences
    3.2.1 Different Types of Convergence for the Sequence of Random Variables
  3.3 Asymptotic Equipartition Property Theorem
    3.3.1 Typical Sequences and Typical Set
    3.3.2 Strongly and Weakly Typical Sequences
  3.4 Data Compression or Source Coding
    3.4.1 Kraft Inequality
    3.4.2 Optimal Codes
    3.4.3 Source Coding for Real Number Sequences
    3.4.4 Huffman Codes

4 Channel Coding Theorem
  4.1 Discrete Memoryless Channel
  4.2 Communication System
    4.2.1 Probability of Error
    4.2.2 Rate Achievability
  4.3 Jointly Typical Sequences
    4.3.1 Jointly Typical Set
    4.3.2 Strongly and Weakly Jointly Typical Sequences
    4.3.3 Number of Jointly Typical Sequences and Probability for Typical Sequences
  4.4 Channel Coding Theorem

References

Index
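As a companion to the Chap. 2 entries above (Sects. 2.4 and 2.5), here is a short sketch, again ours rather than the author's, that evaluates the power-constrained AWGN capacity C = W log2(1 + S/(N0 W)) and its wideband limit S/(N0 ln 2). The bandwidth, signal power, and noise density figures are hypothetical.

```python
import numpy as np

def awgn_capacity(bandwidth_hz, signal_power_w, noise_psd_w_per_hz):
    """Shannon AWGN capacity C = W * log2(1 + S / (N0 * W)), in bits per second."""
    snr = signal_power_w / (noise_psd_w_per_hz * bandwidth_hz)
    return bandwidth_hz * np.log2(1.0 + snr)

# Hypothetical link: 1 MHz bandwidth, 1 mW received power, N0 = 1e-12 W/Hz.
W, S, N0 = 1e6, 1e-3, 1e-12
print(f"C = {awgn_capacity(W, S, N0) / 1e6:.2f} Mbit/s")

# Wideband limit (cf. Sect. 2.5.1): as W grows, C approaches S / (N0 * ln 2).
print(f"Limit as W -> inf: {S / (N0 * np.log(2)) / 1e6:.2f} Mbit/s")
```

With these numbers the SNR is 1000, giving roughly 9.97 Mbit/s, while the wideband limit is about 1443 Mbit/s, illustrating that capacity grows only logarithmically in power but saturates in bandwidth.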
