Predrag Ivaniš · Dušan Drajić

Information Theory and Coding - Solved Problems

Predrag Ivaniš, Department of Telecommunications, School of Electrical Engineering, University of Belgrade, Belgrade, Serbia
Dušan Drajić, Department of Telecommunications, School of Electrical Engineering, University of Belgrade, Belgrade, Serbia

ISBN 978-3-319-49369-5
ISBN 978-3-319-49370-1 (eBook)
DOI 10.1007/978-3-319-49370-1
Library of Congress Control Number: 2016957145

© Springer International Publishing AG 2017

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made.

Printed on acid-free paper. This Springer imprint is published by Springer Nature. The registered company is Springer International Publishing AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland.

Preface

The aim of the book is to offer a comprehensive treatment of information theory and error control coding, using a slightly different approach than that of the existing literature. There are many excellent books that treat error control coding, especially modern coding theory techniques (turbo and LDPC codes). It is clear that understanding the iterative decoding algorithms requires background knowledge of classical coding and information theory. However, the authors of this book did not find another book that provides simple and illustrative explanations of the decoding algorithms, together with clearly described relations to the theoretical limits defined by information theory. The available books on the market can be divided into two categories. The first consists of books specialized either in algebraic coding or in modern coding theory techniques, offering a mathematically rigorous treatment of the subject without many examples. The second provides a wider treatment of the field, in which every particular subject is treated separately, usually with just a few basic numerical examples.

In our approach we assume that complex coding and decoding techniques cannot be explained without understanding the basic principles. As an example, for the design of LDPC encoders the basic facts about linear block codes have to be understood. Furthermore, functional knowledge of information theory is necessary for code design: the efficiency of statistical coding is determined by the First Shannon theorem, whereas the performance limits of error control codes are determined by the Second Shannon theorem. Therefore, we organized the book chapters according to the Shannon system model, from the standpoint of information theory, where one block affects the others, so they cannot be treated separately.
On the other hand, we decided to explain the basic principles of information theory and coding through complex numerical examples. Therefore, a relatively brief theoretical introduction is given at the beginning of every chapter, including a few additional examples and explanations, but without any proofs. Also, a short overview of some parts of abstract algebra is given at the end of the corresponding chapters. Some definitions are given inside the examples, when they appear for the first time. The characteristic examples, with many illustrations and tables, are chosen to provide a detailed insight into the nature of the problem. In particular, some limiting cases are given to illustrate the connections with the theoretical bounds. The numerical values are carefully chosen to provide in-depth knowledge of the described algorithms. Although the examples in the different chapters can be considered separately, they are mutually connected, and the conclusions of one problem motivate the formulation of another. Therefore, the sequence of problems can be considered as an "illustrated story about an information processing system, step by step". The book contains a number of schematic diagrams to illustrate the main concepts and many figures with numerical results. It should be noted that this book presents mainly problems, not simple exercises; some simple examples are included in the theoretical introductions at the beginning of the chapters.

The book is primarily intended for graduate students, although parts of it can also be used in undergraduate studies. We also hope that the book will be of use to practitioners in the field.

Belgrade, Serbia
Predrag Ivaniš
Dušan Drajić

Contents

1 Introduction
2 Information Sources
  Brief Theoretical Overview
  Problems
3 Data Compression (Source Encoding)
  Brief Theoretical Overview
  Problems
4 Information Channels
  Brief Theoretical Overview
  Problems
5 Block Codes
  Brief Theoretical Overview
  Problems
  Brief Introduction to Algebra I
6 Cyclic Codes
  Brief Theoretical Overview
  Problems
  Brief Introduction to Algebra II
7 Convolutional Codes and Viterbi Algorithm
  Brief Theoretical Overview
  Problems
8 Trellis Decoding of Linear Block Codes, Turbo Codes
  Brief Theoretical Overview
  Problems
9 Low Density Parity Check Codes
  Brief Theoretical Overview
  Problems
References
Index

Chapter 1 Introduction

We are living in the era of digital communications (transmission and storage). As an important feature of such systems, transmission errors can be detected and corrected (error control coding), while in analogue transmission practically only the signal-to-noise ratio was considered. Classical communication theory started using Fourier analysis, to be later enriched by the probabilistic approach. However, it considered only signals, the carriers of information. But Shannon, publishing (1948) his paper "A Mathematical Theory of Communication" (some scientists think that the article should be "The" and not "A"), founded information theory. The Shannon approach was totally different from the classical one. One could say it was at a higher level. He did not consider the signals, but the information. The information is represented (encoded) by signals, which are the carriers of information. Therefore, it is possible that the transmitted signals do not carry any information at all (of course, these signals may be needed for the proper functioning of the communication system itself). A simplified block scheme of a communication system from the information theory point of view is shown in Fig. 1.1.

The encoder can be divided into a source encoder (for the corresponding data compression) and a channel encoder (for the corresponding error control coding). An encryptor can be added as well. A communication channel is, by its nature, a continuous one. Using such an approach, the noise and other interferences (other signals, fading, etc.) can be directly taken into account. However, it can be simplified by introducing the notion of a discrete channel, incorporating the signal generation (modulation), the continuous channel and the signal detection (demodulation). In contemporary communication systems the encoding is very often connected with the choice of the corresponding (modulation) signals to obtain better system performance.

Generally speaking, the fields of interest in information theory are sources, source encoding, channels and channel encoding (and the corresponding decoding). The next three chapters deal with the sources, source encoding and the channel. A few elementary examples are given at their beginnings, followed later by more difficult problems. However, the main body of this book deals with error control coding (the last five chapters).

Fig. 1.1 A simplified block scheme of a communication system from the information theory point of view (source, encoder, signal generator, channel, signal detector, decoder, destination; the signal generator, the continuous channel and the signal detector together form the discrete channel)

The second chapter deals with the information sources. Mainly discrete sources are considered. The further division into memoryless sources and sources with memory is illustrated. The corresponding state diagram and trellis construction are explained, as well as the notions of the adjoint source and source extension. Further, the notions of quantity of information and entropy are introduced. At the end, an example of a continuous source (Gaussian distribution) is considered.
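The notions of quantity of information and entropy mentioned above can be made concrete with a short sketch. The following Python fragment is not taken from the book's problems; the alphabet and probabilities are invented for illustration, and the formulas are the standard definitions I(s_i) = -log2 P(s_i) and H(S) = sum over i of P(s_i) I(s_i).

```python
import math

# Hypothetical zero-memory (memoryless) source: alphabet and symbol probabilities.
# These values are chosen only for illustration.
symbols = ["s1", "s2", "s3", "s4"]
probs   = [0.5, 0.25, 0.125, 0.125]

assert abs(sum(probs) - 1.0) < 1e-12, "probabilities must sum to 1"

# Quantity of information of a single symbol: I(s_i) = -log2 P(s_i)  [bits]
info = {s: -math.log2(p) for s, p in zip(symbols, probs)}

# Source entropy: H(S) = sum_i P(s_i) * I(s_i)  [bits/symbol]
entropy = sum(p * i for p, i in zip(probs, info.values()))

print(info)      # {'s1': 1.0, 's2': 2.0, 's3': 3.0, 's4': 3.0}
print(entropy)   # 1.75 bits/symbol
```

For this particular distribution the entropy is 1.75 bits/symbol, below the maximum of log2 4 = 2 bits/symbol that would be reached only for equiprobable symbols.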
In the next chapter, source encoding (data compression) is considered for discrete sources. The important notions concerning source codes are introduced: nonsingularity, unique decodability and the need for an instantaneous code. The notions of a code tree and of the average code word length are introduced as well. A short discussion from the First Shannon theorem point of view is included. Further, the Shannon-Fano and Huffman encoding algorithms are illustrated with corresponding problems. Adaptive Huffman algorithms (FGK, Vitter) are discussed and illustrated. At the end, the LZ algorithm is considered.

Information channels are considered in the fourth chapter. As with the sources, the channels can be discrete or continuous, but there exists also a mixed type (e.g. discrete input, continuous output). Discrete channels can be with or without memory. They are described using the corresponding channel matrix. A few discrete memoryless channels are analyzed in detail (BSC, BEC, etc.). Transmitted information and channel capacity (for discrete and continuous channels) are defined, and the Second Shannon theorem is commented on. The decision rules are further analyzed (hard decoding and soft decoding), and some criteria (MAP, ML) are considered. At the end, the Gilbert-Elliott model for channels with memory is illustrated.

In the fifth chapter, block codes (mainly linear block codes) are illustrated using some interesting problems. At the beginning, simple repetition codes are used to explain FEC, ARQ and hybrid error control procedures. The Hamming distance is then introduced, as well as the Hamming weight and the distance spectrum. The corresponding bounds are discussed. The notion of a systematic code is introduced. Hamming codes are analyzed in many problems. The notions of generator matrix, parity-check matrix and syndrome are explained. Dual codes and the MacWilliams identities are discussed. The notion of interleaving is illustrated as well. At the end of the chapter, arithmetic and integer codes are illustrated, and a brief overview of the corresponding notions from abstract algebra is added (group, field, vector space).
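To make the notions of generator matrix, parity-check matrix and syndrome concrete, here is a minimal sketch of encoding and single-error syndrome decoding for a systematic (7,4) Hamming code. It is not one of the book's problems; the particular submatrix P (and hence G and H) is just one common textbook choice, assumed here for illustration.

```python
import numpy as np

# Systematic (7,4) Hamming code, G = [I4 | P], H = [P^T | I3].
# This particular P is an assumed, standard textbook choice.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])       # generator matrix (4 x 7)
H = np.hstack([P.T, np.eye(3, dtype=int)])     # parity-check matrix (3 x 7)

m = np.array([1, 0, 1, 1])                     # 4-bit message
c = m @ G % 2                                  # codeword: c = m G (mod 2)

r = c.copy()
r[5] ^= 1                                      # channel flips one bit (position 5)

s = r @ H.T % 2                                # syndrome: s = r H^T (mod 2)
# For a single error, the nonzero syndrome equals the column of H at the error
# position, so comparing s with the columns of H locates and corrects the error.
for j in range(7):
    if np.array_equal(H[:, j], s):
        r[j] ^= 1
        break

print("codeword:", c)
print("decoded :", r, "matches transmitted:", bool(np.array_equal(r, c)))
```

Because the code corrects any single error, the nonzero syndrome matches exactly one column of H, which directly gives the error position.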
The sixth chapter deals with cyclic codes, a subclass of linear block codes obtained by imposing an additional strong structure requirement. In fact, a cyclic code is an ideal in the ring of polynomials. The notions of generator polynomial and parity-check polynomial are introduced. The usage of CRC is illustrated in a few problems. BCH codes are illustrated as well. RS codes are analyzed in detail, especially the decoding algorithms (Peterson, Berlekamp-Massey, Gorenstein-Zierler, Forney). At the end of the chapter, a brief overview of the corresponding notions from abstract algebra is added (Galois field, primitive and minimal polynomial, ideal).

Convolutional codes and decoding algorithms are analyzed in the next chapter. The corresponding notions are explained (constraint length, transfer function matrix, state diagram and trellis, free distance). Majority logic decoding, sequential decoding and especially the Viterbi algorithm are illustrated, as well as the possibilities of hard (Hamming metric) and soft (Euclidean metric) decision. Punctured codes are explained as well. TCM is considered in detail.

Trellis decoding of linear block codes and turbo decoding are explained in chapter eight. It is shown how, from a suitably transformed (trellis-oriented) generator matrix, the corresponding parity-check matrix can be obtained, suitable for trellis construction. The corresponding decoding algorithms are analyzed (generalized Viterbi algorithm, BCJR, SOVA, log-MAP, max-log-MAP). At the end, turbo codes are briefly described.

The last chapter deals with LDPC codes. They provide iterative decoding with linear complexity. The Tanner interpretation of LDPC codes using bipartite graphs is explained. Various decoding algorithms are analyzed, with hard decision (majority logic decoding, bit-flipping) and with soft decision (belief propagation, sum-product, self-correcting min-sum). At the end, the algorithms are compared using Monte Carlo simulation over the BSC and the AWGN channel.

For the reader's convenience, a relatively brief theoretical introduction is given at the beginning of every chapter, including a few additional examples and explanations, but without any proofs. Also, a short overview of some parts of abstract algebra is given at the end of the corresponding chapters. This material is mainly based on the textbook An Introduction into Information Theory and Coding [1] (in Serbian) by the same authors.

Chapter 2 Information Sources

Brief Theoretical Overview

Generally, the information sources are discrete or continuous. A discrete source has a finite or countable number of messages, while the messages of a continuous source are from an uncountable set. This book deals mainly with discrete sources, especially those having a finite number of symbols. The further subdivision of sources is according to the memory they may have.

The sources can be without memory (zero-memory, memoryless) (Problems 2.1, 2.2, 2.3, 2.5, 2.6 and 2.8), emitting the symbols (messages) s_i according only to the corresponding probabilities P(s_i). Therefore, the zero-memory source is completely described by the list of symbols S (the source alphabet)

S = \{ s_1, s_2, \ldots, s_q \}

and by the corresponding symbol probabilities

P(s_i), \quad i = 1, 2, \ldots, q.

It is supposed as well that the symbols (i.e. their emission) form a complete set of mutually exclusive events, yielding

\sum_{i=1}^{q} P(s_i) = 1.

For sources with memory, where the emission of the next symbol depends on the m previously emitted symbols (m is the memory order) (Problems 2.4 and 2.7), the corresponding conditional probabilities are needed.
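As a small illustration of the difference between a zero-memory source and a source with memory of order m = 1, the following sketch is not from the book; the alphabet and all probabilities are invented. The memoryless source is fully described by P(s_i), while the source with memory needs conditional probabilities, arranged here as one conditional distribution per previous symbol, each row summing to 1.

```python
import random

random.seed(1)

symbols = ["a", "b", "c"]                  # hypothetical source alphabet S = {s1, s2, s3}

# Zero-memory source: the probabilities P(s_i) alone describe the emission.
p = {"a": 0.6, "b": 0.3, "c": 0.1}
assert abs(sum(p.values()) - 1.0) < 1e-12  # complete set of mutually exclusive events

def emit_memoryless(n):
    return random.choices(symbols, weights=[p[s] for s in symbols], k=n)

# Source with memory order m = 1: P(next symbol | previous symbol).
# Each row is a conditional distribution, so each row must also sum to 1.
p_cond = {
    "a": {"a": 0.8, "b": 0.1, "c": 0.1},
    "b": {"a": 0.3, "b": 0.6, "c": 0.1},
    "c": {"a": 0.1, "b": 0.1, "c": 0.8},
}
for prev, row in p_cond.items():
    assert abs(sum(row.values()) - 1.0) < 1e-12

def emit_with_memory(n, start="a"):
    out, state = [], start
    for _ in range(n):
        state = random.choices(symbols, weights=[p_cond[state][s] for s in symbols], k=1)[0]
        out.append(state)
    return out

print("".join(emit_memoryless(30)))    # symbol order carries no structure
print("".join(emit_with_memory(30)))   # long runs: the next symbol depends on the previous one
```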
