ISBN 978-0-333-36703-2

Information Theory for Information Technologists

Macmillan Computer Science Series

Consulting Editor: Professor F. H. Sumner, University of Manchester

S. T. Allworth, Introduction to Real-time Software Design
Ian O. Angell, A Practical Introduction to Computer Graphics
G. M. Birtwistle, Discrete Event Modelling on Simula
T. B. Boffey, Graph Theory in Operations Research
Richard Bornat, Understanding and Writing Compilers
J. K. Buckle, The ICL 2900 Series
J. K. Buckle, Software Configuration Management
J. C. Cluley, Interfacing to Microprocessors
Robert Cole, Computer Communications
Derek Coleman, A Structured Programming Approach to Data*
Andrew J. T. Colin, Fundamentals of Computer Science
Andrew J. T. Colin, Programming and Problem-solving in Algol 68*
S. M. Deen, Fundamentals of Data Base Systems*
P. M. Dew and K. R. James, Introduction to Numerical Computation in Pascal
K. C. E. Gee, Introduction to Local Area Computer Networks
J. B. Gosling, Design of Arithmetic Units for Digital Computers
David Hopkin and Barbara Moss, Automata*
Roger Hutty, Fortran for Students
Roger Hutty, Z80 Assembly Language Programming for Students
Roland N. Ibbett, The Architecture of High Performance Computers
H. Kopetz, Software Reliability
E. V. Krishnamurthy, Introductory Theory of Computer Science
Graham Lee, From Hardware to Software: an introduction to computers
A. M. Lister, Fundamentals of Operating Systems, second edition*
G. P. McKeown and V. J. Rayward-Smith, Mathematics for Computing
Brian Meek, Fortran, PL/1 and the Algols
Derrick Morris, An Introduction to System Programming - Based on the PDP11
Derrick Morris and Roland N. Ibbett, The MU5 Computer System
John Race, Case Studies in Systems Analysis
L. E. Scales, Introduction to Non-Linear Optimization
Colin J. Theaker and Graham R. Brookes, A Practical Course on Operating Systems
M. J. Usher, Information Theory for Information Technologists
B. S. Walker, Understanding Microprocessors
Peter J. L. Wallis, Portable Programming
I. R. Wilson and A. M. Addyman, A Practical Introduction to Pascal - with BS 6192, second edition

*The titles marked with an asterisk were prepared during the Consulting Editorship of Professor J. S. Rohl, University of Western Australia.

Information Theory for Information Technologists

M. J. Usher
Department of Cybernetics
University of Reading

MACMILLAN

© M. J. Usher 1984
All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, without permission.

First published 1984 by
Higher and Further Education Division
MACMILLAN PUBLISHERS LTD
London and Basingstoke
Companies and representatives throughout the world

Typeset in 10/12 Press Roman by RDL Artset Ltd, Sutton, Surrey

British Library Cataloguing in Publication Data
Usher, M. J.
Information theory for information technologists. - (Macmillan computer science series)
1. Information theory
I. Title
001.53'9  Q360
ISBN 978-0-333-36703-2
ISBN 978-1-349-17648-9 (eBook)
DOI 10.1007/978-1-349-17648-9

Contents

Preface ix
Acknowledgements xi

1 Information and its Quantification 1
1.1 Introduction 1
1.2 Quantity of information 2
1.3 Average information; entropy 4
1.4 Redundancy 6
1.5 Exercises on chapter 1 6

2 Information in Language 8
2.1 Basic probability theory 8
2.2 Conditional entropy 11
2.3 Redundancy in printed English 12
2.4 Exercises on chapter 2 14

3 Binary Coding in Noiseless Channels 16
3.1 Information channels 16
3.2 Binary codes 16
3.3 Compact instantaneous codes 18
3.4 Coding methods 19
3.5 Shannon's first theorem 20
3.6 Applications of compact codes 22
3.7 Exercises on chapter 3 23

4 Information Transfer in Noisy Channels 25
4.1 Information in noisy channels 25
4.2 General expression for information transfer 28
4.3 Equivocation 30
4.4 Summary of basic formulae by Venn diagram 33
4.5 Channel capacity 34
4.6 Exercises on chapter 4 36

5 Binary Coding in Noisy Channels 38
5.1 Introduction 38
5.2 The binomial probability distribution 38
5.3 Binary coding for error protection 39
5.4 Practical codes for error detection and correction 42
5.5 Recent developments in error-protection coding 48
5.6 Exercises on chapter 5 49

6 Introduction to Signal Theory 51
6.1 Classification of signals 51
6.2 Time domain averages 53
6.3 Frequency domain averages 57
6.4 Summary of time and frequency averages 59
6.5 Exercises on chapter 6 60

7 Electrical Noise 61
7.1 Types of electrical noise 61
7.2 Probability distributions 63
7.3 Time and ensemble averages 73
7.4 Summary of the properties of electrical noise 76
7.5 Exercises on chapter 7 80

8 Time Domain Properties of Signals 81
8.1 Correlation 81
8.2 Autocorrelation 82
8.3 Cross-correlation 84
8.4 Convolution 84
8.5 Applications of correlation and convolution 88
8.6 Exercises on chapter 8 89

9 Frequency Domain Representation of Signals 91
9.1 Fourier Theory 91
9.2 Finite power signals: Fourier Series 91
9.3 Finite energy signals: Fourier Transforms 101
9.4 Summary 121
9.5 Exercises on chapter 9 121

10 Sampling Theory 127
10.1 Sampled signals 127
10.2 The spectrum of sampled signals 128
10.3 Impulse sampling 131
10.4 Recovery of signal from samples 133
10.5 Exercises on chapter 10 136

11 Information in Continuous Signals 138
11.1 Continuous signals 138
11.2 Relative entropy of continuous signals 139
11.3 Information capacity of continuous signals 142
11.4 Deductions from the ideal theorem 144
11.5 Exercises on chapter 11 147

12 Communication Systems 149
12.1 Modulation 149
12.2 Analogue modulation 149
12.3 Pulse modulation 156
12.4 Binary communication systems 160
12.5 Exercises on chapter 12 165

13 Applications of Information Theory 167
13.1 Typical information channels 167
13.2 Speech processing, synthesis and recognition 173
13.3 Optical character recognition 181
13.4 Music synthesis 183
13.5 Exercises on chapter 13 185

14 Solutions to Exercises 188
14.1 Solutions to exercises on chapter 1 188
14.2 Solutions to exercises on chapter 2 188
14.3 Solutions to exercises on chapter 3 190
14.4 Solutions to exercises on chapter 4 193
14.5 Solutions to exercises on chapter 5 195
14.6 Solutions to exercises on chapter 6 198
14.7 Solutions to exercises on chapter 7 199
14.8 Solutions to exercises on chapter 8 203
14.9 Solutions to exercises on chapter 9 205
14.10 Solutions to exercises on chapter 10 215
14.11 Solutions to exercises on chapter 11 215
14.12 Solutions to exercises on chapter 12 218
14.13 Solutions to exercises on chapter 13 219

References and Bibliography 222
Index 224

Preface

Information Technology may be defined as the acquisition, storage, processing, communication and display of information. As a technology it is unique in that it refers to an abstract concept, information; all previous technologies have been linked to specific physical concepts. Information Technology is thus independent of the physical technologies used to implement it, and its principles will therefore continue unchanged despite the considerable technological developments expected in the next few years.

Information Theory, the science of quantification, coding and communication of information, is the basis of Information Technology. It is a precise, quantitative theory developed largely during and after the Second World War. Some of its theorems were at first considered impractical, but recent technological developments have changed this and strongly underlined the importance of fundamental theory as a basis for technology.

The aim of this book is to present the fundamentals of Information Theory in a manner suitable for persons working in or interested in the field of Information Technology. Quantity of information is first defined, and the concepts of entropy and redundancy introduced, with application to the English language. Information in noisy channels is then considered, and various interpretations of information transfer presented: this leads on to the requirement of coding information before transmission, either in noiseless or noisy channels, and to Shannon's fundamental theorem.

In order to consider more general signals than purely digital or discrete signals, a review of signal and noise theory is required, including probability distributions, electrical noise, the Fourier theory of signals and sampling theory. This paves the way for a study of the information capacity of a general channel and the ideal communication theorem. Finally a review of communication systems is presented in the light of Information Theory, together with examples of the applications of Information Theory to recent developments such as speech processing and recognition, sound generation, music synthesis and character recognition.

The book is based largely on a second-year course of about 30 lectures given to Cybernetics honours degree students. The approach adopted is one of intuitive understanding rather than rigorous formal theory. It is suitable both for specialist courses in Information Technology and for the new one-year conversion courses; arts-based graduates on these courses are advised to omit chapters 8 and 9 on first reading. Many of the worked examples and exercises are based on Reading University examination questions set by the author, who is pleased to acknowledge the University's permission to publish them. His only regret is that he has seriously depleted his stock of such questions for future years. Early retirement looms large indeed.

Reading, 1984
M. J. USHER
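
The quantities the preface refers to are defined formally in chapters 1 and 2. As a rough illustration of what they measure, the short Python sketch below (not taken from the book, with assumed symbol probabilities) computes quantity of information, entropy and redundancy for a small discrete source.

    # Illustrative sketch only: quantity of information, entropy and
    # redundancy of a discrete source; the probabilities are assumed.
    import math

    def information(p):
        # Information of an event of probability p, in bits: I = log2(1/p)
        return math.log2(1.0 / p)

    def entropy(probs):
        # Average information (entropy) H = -sum(p * log2 p), in bits/symbol
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def redundancy(probs):
        # Redundancy R = 1 - H/Hmax, with Hmax = log2(number of symbols)
        return 1.0 - entropy(probs) / math.log2(len(probs))

    source = [0.5, 0.25, 0.125, 0.125]   # hypothetical 4-symbol source
    print(information(0.5))    # 1.0 bit
    print(entropy(source))     # 1.75 bits/symbol
    print(redundancy(source))  # 0.125, since Hmax = 2 bits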
