
Handbook of Differential Entropy (PDF)

241 Pages·2013·8.07 MB·English

Preview Handbook of Differential Entropy

Unlike related books, Handbook of Differential Entropy brings together background material, derivations, and applications of differential entropy. Suitable for students and researchers in information theory, the book first reviews probability theory as it enables an understanding of the core building block of entropy. The authors then carefully explain the concept of entropy, introducing both discrete and differential entropy. They present detailed derivations of differential entropy for numerous probability models and discuss challenges with interpreting and deriving differential entropy. They also show how differential entropy varies as a function of the model variance.

Focusing on the application of differential entropy in several areas, the book describes common estimators of parametric and nonparametric differential entropy as well as properties of the estimators. It then uses the estimated differential entropy to estimate radar pulse delays when the corrupting noise source is non-Gaussian and to develop measures of coupling between dynamical system components.

Features
• Presents complete derivations of all the included differential entropies
• Provides comprehensive background material on probability theory
• Describes several important applications of differential entropy to engineering problems, such as diagnosing structural damage
• Sheds light on challenges associated with interpreting and deriving differential entropy for various probability distributions
• Includes a chapter on entropy as a function of variance
• Places differential entropy in a unique historical and scientific context

Authors: Joseph Victor Michalowicz, Jonathan M. Nichols, Frank Bucholtz
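For quick reference alongside the description above, these are the standard information-theoretic definitions the book is organized around (a reminder of textbook formulas, not an excerpt from the book itself); the Gaussian case makes the dependence on the variance explicit:

    H(X) = -\sum_{i} p_i \log p_i \qquad \text{(discrete entropy)}

    h(X) = -\int_{-\infty}^{\infty} f(x) \log f(x)\, dx \qquad \text{(differential entropy)}

    h(X) = \tfrac{1}{2} \log\!\left(2\pi e \sigma^{2}\right) \qquad \text{for } X \sim \mathcal{N}(\mu, \sigma^{2})

    I(X;Y) = h(X) + h(Y) - h(X,Y) \qquad \text{(mutual information)}

Unlike discrete entropy, differential entropy can be negative and changes under rescaling of X, which is one source of the interpretation challenges the description mentions.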
Handbook of Differential Entropy
Joseph Victor Michalowicz, Jonathan M. Nichols, Frank Bucholtz
CRC Press, Taylor & Francis Group, 6000 Broken Sound Parkway NW, Suite 300, Boca Raton, FL 33487-2742
© 2014 by Taylor & Francis Group, LLC. CRC Press is an imprint of Taylor & Francis Group, an Informa business. No claim to original U.S. Government works. Version Date: 20130830.
International Standard Book Number-13: 978-1-4665-8317-7 (eBook - PDF)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged, please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers. For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com and the CRC Press Web site at http://www.crcpress.com

Contents

List of Figures
List of Tables
Preface
1 Probability in Brief
  1.1 Probability Distributions
  1.2 Expectation and Moments
  1.3 Random Processes
  1.4 Probability Summary
2 The Concept of Entropy
  2.1 Discrete Entropy
  2.2 Differential Entropy
  2.3 Interpretation of Differential Entropy
  2.4 Historical and Scientific Perspective
3 Entropy for Discrete Probability Distributions
  3.1 Poisson Distribution
  3.2 Binomial Distribution
  3.3 Hypergeometric Distribution
  3.4 Geometric Distribution
  3.5 Negative Binomial Distribution
  3.6 Discrete Uniform Distribution
  3.7 Logarithmic Distribution
  3.8 Skellam Distribution
  3.9 Yule-Simon Distribution
  3.10 Zeta Distribution
4 Differential Entropies for Probability Distributions
  4.1 Beta Distribution
  4.2 Cauchy Distribution
  4.3 Chi Distribution
  4.4 Chi-Squared Distribution
  4.5 Dirac Delta Distribution
  4.6 Exponential Distribution
  4.7 F-Distribution
  4.8 Gamma Distribution
  4.9 Generalized Beta Distribution
  4.10 Generalized Normal Distribution
  4.11 Kumaraswamy Distribution
  4.12 Laplace Distribution
  4.13 Log-Normal Distribution
  4.14 Logistic Distribution
  4.15 Log-Logistic Distribution
  4.16 Maxwell Distribution
  4.17 Mixed-Gaussian Distribution
  4.18 Nakagami Distribution
  4.19 Normal Distribution
  4.20 Pareto Distribution
  4.21 Rayleigh Distribution
  4.22 Rice Distribution
  4.23 Simpson Distribution
  4.24 Sine Wave Distribution
  4.25 Student's t-Distribution
  4.26 Truncated Normal Distribution
  4.27 Uniform Distribution
  4.28 Weibull Distribution
5 Differential Entropy as a Function of Variance
6 Applications of Differential Entropy
  6.1 Estimation of Entropy
  6.2 Mutual Information
    6.2.1 Mutual Information for Random Processes
    6.2.2 Radar Pulse Detection
  6.3 Transfer Entropy
    6.3.1 Transfer Entropy for Second-Order Linear Systems
    6.3.2 Nonlinearity Detection
      6.3.2.1 Method of Surrogate Data
      6.3.2.2 Numerical Example
      6.3.2.3 Experimental Application: Diagnosing Rotor-Stator Rub
  6.4 Summary
7 Appendices
  7.1 Derivation of Maximum Entropy Distributions under Different Constraints
  7.2 Moments and Characteristic Function for the Sine Wave Distribution
  7.3 Moments, Mode, and Characteristic Function for the Mixed-Gaussian Distribution
  7.4 Derivation of Function L(α) Used in Derivation for Entropy of Mixed-Gaussian Distribution
  7.5 References to Formulae Used in This Text
  7.6 Useful Theorems
Bibliography
Index
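As a rough illustration of the kind of nonparametric entropy estimation outlined above (Section 6.1), here is a minimal histogram plug-in estimator in Python. It is a generic sketch for intuition only, not the estimators developed in the book; the function name and bin count are illustrative choices.

    import numpy as np

    def histogram_entropy(samples, n_bins=50):
        # Plug-in estimate of differential entropy (in nats) for a 1-D sample:
        # discrete entropy of the binned distribution plus the log bin width.
        counts, edges = np.histogram(samples, bins=n_bins)
        widths = np.diff(edges)
        p = counts / counts.sum()
        nz = p > 0  # skip empty bins (0 * log 0 is taken as 0)
        return -np.sum(p[nz] * np.log(p[nz])) + np.sum(p[nz] * np.log(widths[nz]))

    # Sanity check against the closed form for a unit-variance Gaussian,
    # h = 0.5 * ln(2 * pi * e) ≈ 1.4189 nats.
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=100_000)
    print(histogram_entropy(x), 0.5 * np.log(2 * np.pi * np.e))

Bin-based estimators like this are simple but sensitive to the choice of bin count; for small samples, nearest-neighbor estimators of the kind discussed in the entropy-estimation literature are usually preferred.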
