A Student's Guide to Coding and Information Theory
This easy-to-read guide provides a concise introduction to the engineering background of
modern communication systems, from mobile phones to data compression and storage.
Background mathematics and specific engineering techniques are kept to a minimum,
so that only a basic knowledge of high-school mathematics is needed to understand the
material covered. The authors begin with many practical applications in coding, includ-
ing the repetition code, the Hamming code, and the Huffman code. They then explain
the corresponding information theory, from entropy and mutual information to channel
capacity and the information transmission theorem. Finally, they provide insights into
the connections between coding theory and other fields. Many worked examples are
given throughout the book, using practical applications to illustrate theoretical defini-
tions. Exercises are also included, enabling readers to double-check what they have
learned and gain glimpses into more advanced topics, making this perfect for anyone
who needs a quick introduction to the subject.
Stefan M. Moser is an Associate Professor in the Department of Electrical Engi-
neering at the National Chiao Tung University (NCTU), Hsinchu, Taiwan, where he has
worked since 2005. He has received many awards for his work and teaching, including
the Best Paper Award for Young Scholars by the IEEE Communications Society and
IT Society (Taipei/Tainan Chapters) in 2009, the NCTU Excellent Teaching Award, and
the NCTU Outstanding Mentoring Award (both in 2007).
Po-Ning Chen is a Professor in the Department of Electrical Engineering at the
National Chiao Tung University (NCTU). Amongst his awards, he has received the
2000 Young Scholar Paper Award from Academia Sinica. He was also selected as
the Outstanding Tutor Teacher of NCTU in 2002, and he received the Distinguished
Teaching Award from the College of Electrical and Computer Engineering in 2003.
A Student’s Guide to Coding and
Information Theory
STEFAN M. MOSER
PO-NING CHEN
National Chiao Tung University (NCTU),
Hsinchu, Taiwan
Cambridge University Press
Cambridge, New York, Melbourne, Madrid, Cape Town,
Singapore, São Paulo, Delhi, Tokyo, Mexico City

Cambridge University Press
The Edinburgh Building, Cambridge CB2 8RU, UK

Published in the United States of America by Cambridge University Press, New York

www.cambridge.org
Information on this title: www.cambridge.org/9781107015838

© Cambridge University Press 2012

This publication is in copyright. Subject to statutory exception
and to the provisions of relevant collective licensing agreements,
no reproduction of any part may take place without the written
permission of Cambridge University Press.

First published 2012

Printed in the United Kingdom at the University Press, Cambridge

A catalog record for this publication is available from the British Library

ISBN 978-1-107-01583-8 Hardback
ISBN 978-1-107-60196-3 Paperback

Additional resources for this publication at www.cambridge.org/moser

Cambridge University Press has no responsibility for the persistence or
accuracy of URLs for external or third-party internet websites referred to
in this publication, and does not guarantee that any content on such
websites is, or will remain, accurate or appropriate.
Contents
List of contributors   page ix
Preface   xi

1 Introduction   1
1.1 Information theory versus coding theory   1
1.2 Model and basic operations of information processing systems   2
1.3 Information source   4
1.4 Encoding a source alphabet   5
1.5 Octal and hexadecimal codes   8
1.6 Outline of the book   9
References   11

2 Error-detecting codes   13
2.1 Review of modular arithmetic   13
2.2 Independent errors – white noise   15
2.3 Single parity-check code   17
2.4 The ASCII code   19
2.5 Simple burst error-detecting code   21
2.6 Alphabet plus number codes – weighted codes   22
2.7 Trade-off between redundancy and error-detecting capability   27
2.8 Further reading   30
References   30

3 Repetition and Hamming codes   31
3.1 Arithmetics in the binary field   33
3.2 Three-times repetition code   34
3.3 Hamming code   40
3.3.1 Some historical background   40
3.3.2 Encoding and error correction of the (7,4) Hamming code   42
3.3.3 Hamming bound: sphere packing   48
3.4 Further reading   52
References   53

4 Data compression: efficient coding of a random message   55
4.1 A motivating example   55
4.2 Prefix-free or instantaneous codes   57
4.3 Trees and codes   58
4.4 The Kraft Inequality   62
4.5 Trees with probabilities   65
4.6 Optimal codes: Huffman code   66
4.7 Types of codes   73
4.8 Some historical background   78
4.9 Further reading   78
References   79

5 Entropy and Shannon's Source Coding Theorem   81
5.1 Motivation   81
5.2 Uncertainty or entropy   86
5.2.1 Definition   86
5.2.2 Binary entropy function   88
5.2.3 The Information Theory Inequality   89
5.2.4 Bounds on the entropy   90
5.3 Trees revisited   92
5.4 Bounds on the efficiency of codes   95
5.4.1 What we cannot do: fundamental limitations of source coding   95
5.4.2 What we can do: analysis of the best codes   97
5.4.3 Coding Theorem for a Single Random Message   101
5.5 Coding of an information source   103
5.6 Some historical background   108
5.7 Further reading   110
5.8 Appendix: Uniqueness of the definition of entropy   111
References   112
6 Mutual information and channel capacity   115
6.1 Introduction   115
6.2 The channel   116
6.3 The channel relationships   118
6.4 The binary symmetric channel   119
6.5 System entropies   122
6.6 Mutual information   126
6.7 Definition of channel capacity   130
6.8 Capacity of the binary symmetric channel   131
6.9 Uniformly dispersive channel   134
6.10 Characterization of the capacity-achieving input distribution   136
6.11 Shannon's Channel Coding Theorem   138
6.12 Some historical background   140
6.13 Further reading   141
References   141

7 Approaching the Shannon limit by turbo coding   143
7.1 Information Transmission Theorem   143
7.2 The Gaussian channel   145
7.3 Transmission at a rate below capacity   146
7.4 Transmission at a rate above capacity   147
7.5 Turbo coding: an introduction   155
7.6 Further reading   159
7.7 Appendix: Why we assume uniform and independent data at the encoder   160
7.8 Appendix: Definition of concavity   164
References   165

8 Other aspects of coding theory   167
8.1 Hamming code and projective geometry   167
8.2 Coding and game theory   175
8.3 Further reading   180
References   182

References   183
Index   187
Contributors
Po-Ning Chen (Chapter 7)
Francis Lu (Chapters 3 and 8)
Stefan M. Moser (Chapters 4 and 5)
Chung-Hsuan Wang (Chapters 1 and 2)
Jwo-Yuh Wu (Chapter 6)