
Probability and Information Theory: Proceedings of the International Symposium at McMaster University, Canada, April, 1968

260 Pages·1969·3.22 MB·English


Lecture Notes in Mathematics
A collection of informal reports and seminars
Edited by A. Dold, Heidelberg and B. Eckmann, Zürich

89

Probability and Information Theory
Proceedings of the International Symposium at McMaster University, Canada, April, 1968

Edited by M. Behara, K. Krickeberg, and J. Wolfowitz

1969
Springer-Verlag Berlin · Heidelberg · New York

All rights reserved. No part of this book may be translated or reproduced in any form without written permission from Springer Verlag. © by Springer-Verlag Berlin · Heidelberg 1969. Library of Congress Catalog Card Number 76-80068. Printed in Germany. Title No. 5963

Preface

This volume contains the invited lectures presented at the First International Symposium on Probability and Information Theory which was held at McMaster University, Hamilton, Ontario, Canada, April 4th and 5th, 1968. The purpose of the Symposium was to bring together for discussion workers in probability theory and information theory, and to provide relatively quick publication for their contributions.

We would like to thank Professor H.G. Thode, M.B.E., F.R.S.C., F.R.S., President of McMaster University, for his address of welcome, and Professor A.N. Bourns, F.R.S.C., Vice-President, Academic (Science) of McMaster, for acting as host to the participants in the International Symposium. We also thank Professor Husain, Chairman of the Mathematics Department, and Professor G. Bruns for their financial and overall support, without which this Symposium would never have taken place. We gratefully acknowledge major financial support from the National Research Council of Canada. Finally, we take great pleasure in thanking Professor I.Z. Chorneyko for his valuable help in organizing the Symposium.

M. Behara, K. Krickeberg, J. Wolfowitz

Contents

Aczél, J.: On Different Characterizations of Entropies ... 1
Ahlswede, R., and J. Wolfowitz: The Structure of Capacity Functions for Compound Channels ... 12
Sarahamihir, M., and D. Behara: Boolean Algebraic Methods in Markov Chains ... 55
Billingsley, P.: Maxima of Partial Sums ... 64
Campbell, L. L.: Series Expansions for Random Processes ... 77
Csörgő, M.: Glivenko-Cantelli Type Theorems for Distance Functions based on the Modified Empirical Distribution Function of M. Kac and for the Empirical Process with Random Sample Size in General ... 96
Husain, T.: On the Continuity of Markov Processes ... 99
Kac, M.: Some Mathematical Problems in Statistical Mechanics ... 106
Kambo, N. S., and S. Kotz: Asymptotic Behaviour of the Average Probability of Error for Low Rates of Information Transmission ... 125
Kemperman, J. H. B.: On the Optimum Rate of Transmitting Information ... 126
Krengel, U.: A Necessary and Sufficient Condition for the Validity of the Local Ergodic Theorem ... 170
Krickeberg, K.: Recent Results on Mixing in Topological Measure Spaces ... 178
Padmanabhan, A. R.: Convergence in Probability and Allied Results ... 186
Pyke, R.: Applications of Almost Surely Convergent Constructions of Weakly Convergent Processes ... 187
Spitzer, F.: Random Processes Defined through the Interaction of an Infinite Particle System ... 201
Strassen, V., and R. M. Dudley: The Central Limit Theorem and ε-Entropy ... 224
Weiss, L., and J. Wolfowitz: Maximum Probability Estimators with a General Loss Function ... 232

On Different Characterizations of Entropies

J. Aczél, University of Waterloo, Ont.
PROPERTIES

When defining the entropy $H_n(p_1,p_2,\ldots,p_n)$ of $n$ mutually exclusive events (outcomes of an experiment, possible messages, etc.) with probabilities $p_1,p_2,\ldots,p_n$ ($p_k \ge 0$, $k=1,2,\ldots,n$; $0 \le p_1+p_2+\cdots+p_n \le 1$; if we allow only $p_1+p_2+\cdots+p_n=1$, which is the case for complete systems of events, we may emphasize this by writing $K_n(p_1,p_2,\ldots,p_n)$) as a measure of uncertainty or of information, and the information-gain $I_n(p_1,p_2,\ldots,p_n \mid r_1,r_2,\ldots,r_n)$ (resp. $J_n(p_1,p_2,\ldots,p_n \mid r_1,r_2,\ldots,r_n)$ when $p_1+p_2+\cdots+p_n=r_1+r_2+\cdots+r_n=1$), we may expect more or less naturally that they have many or all of the following properties:

1. Algebraic properties.

111. Unit: $H_1(\tfrac12) = 1$, or
112. $K_2(\tfrac12,\tfrac12) = 1$.

12. Symmetry: $H_n(p_1,p_2,\ldots,p_n)=H_n(p_{k(1)},p_{k(2)},\ldots,p_{k(n)})$, where $k(1),k(2),\ldots,k(n)$ is an arbitrary permutation of $1,2,\ldots,n$; in particular ($n=3$):
121. $H_3(p_1,p_2,p_3)$ is symmetric in $p_1,p_2,p_3$, and ($n=2$, $p_1=1$, $p_2=0$):
122. $H_2(1,0)=H_2(0,1)$.

13. Null-entropy: $H_n(0,0,\ldots,0,1) = 0$, in particular ($n=1$ or $n=2$, respectively):
131. $H_1(1) = 0$,
132. $H_2(0,1) = 0$.

14. Null-probabilities: $H_{n+1}(p_1,p_2,\ldots,p_n,0) = H_n(p_1,p_2,\ldots,p_n)$.

151. Strong additivity:
$$K_{mn}(p_1q_{11},p_1q_{12},\ldots,p_1q_{1n},p_2q_{21},p_2q_{22},\ldots,p_2q_{2n},\ldots,p_mq_{m1},p_mq_{m2},\ldots,p_mq_{mn}) = K_m(p_1,p_2,\ldots,p_m)+\sum_{j=1}^m p_j K_n(q_{j1},q_{j2},\ldots,q_{jn}),$$
in particular ($n \ge m \ge 1$; $q_{jj}=1$ and $q_{j1}=\cdots=q_{j,j-1}=q_{j,j+1}=\cdots=q_{jn}=0$ for $j=1,2,\ldots,m-1$; $q_{m1}=q_{m2}=\cdots=q_{m,m-1}=0$; taking also 12, 13, 14 into consideration):
1511. $K_n(p_1,p_2,\ldots,p_{m-1},p_mq_{mm},p_mq_{m,m+1},\ldots,p_mq_{mn})=K_m(p_1,p_2,\ldots,p_m)+p_m K_{n-m+1}(q_{mm},q_{m,m+1},\ldots,q_{mn})$.
Or, more specially ($n=m+1$, $q_{mm}=q$; in particular $n=3$):
15111. $K_{m+1}(p_1,p_2,\ldots,p_{m-1},p_mq,p_m(1-q)) = K_m(p_1,p_2,\ldots,p_m) + p_m K_2(q,1-q)$,
151111. $K_3(p_1,p_2q,p_2(1-q)) = K_2(p_1,p_2) + p_2 K_2(q,1-q)$.
Again, more generally, there exists a two-place function $L$ (in 15111, $L(p_m,q)=p_m K_2(q,1-q)$) such that
152. $K_{m+1}(p_1,p_2,\ldots,p_{m-1},p_mq,p_m(1-q))-K_m(p_1,p_2,\ldots,p_m)=L(p_m,q)$.
In the case where $q_{1k}=q_{2k}=\cdots=q_{mk}=q_k$ ($k=1,2,\ldots,n$), 151 becomes true for all $H_m$ (not only for the $K_m$):
153.
Additivity:
$$H_{mn}(p_1q_1,p_1q_2,\ldots,p_1q_n,p_2q_1,p_2q_2,\ldots,p_2q_n,\ldots,p_mq_1,p_mq_2,\ldots,p_mq_n) = H_m(p_1,p_2,\ldots,p_m)+H_n(q_1,q_2,\ldots,q_n),$$
of which we mention the special cases $n=2$, $q_1=q$, $q_2=1-q$:
1531. $H_{2m}(p_1q,p_1(1-q),p_2q,p_2(1-q),\ldots,p_mq,p_m(1-q))=H_m(p_1,p_2,\ldots,p_m)+H_2(q,1-q)$,
or $m=2$, $n=1$, $q_1=q$:
1532. $H_2(p_1q,p_2q)=H_2(p_1,p_2)+H_1(q)$,
or $m=n=1$, $p_1=p$, $q_1=q$:
15321. $H_1(pq)=H_1(p)+H_1(q)$,
and $m=n$:
1533. $H_{n^2}(p_1q_1,p_1q_2,\ldots,p_1q_n,p_2q_1,p_2q_2,\ldots,p_2q_n,\ldots,p_nq_1,p_nq_2,\ldots,p_nq_n) = H_n(p_1,p_2,\ldots,p_n)+H_n(q_1,q_2,\ldots,q_n)$.

For the information-gain we have similarly, among others:
I 11. Unit: $I_1(1 \mid \tfrac12)=1$.
I 12. Symmetry: $I_n(p_1,p_2,\ldots,p_n \mid r_1,r_2,\ldots,r_n)=I_n(p_{k(1)},p_{k(2)},\ldots,p_{k(n)} \mid r_{k(1)},r_{k(2)},\ldots,r_{k(n)})$, where $k(1),k(2),\ldots,k(n)$ is an arbitrary permutation of $1,2,\ldots,n$.
I 153. Additivity: $I_{mn}(p_1q_1,p_1q_2,\ldots,p_1q_n,p_2q_1,\ldots,p_mq_n \mid r_1s_1,r_1s_2,\ldots,r_1s_n,r_2s_1,\ldots,r_ms_n) = I_m(p_1,\ldots,p_m \mid r_1,\ldots,r_m)+I_n(q_1,\ldots,q_n \mid s_1,\ldots,s_n)$.

2. Inequalities.

21. Nonnegativity: $H_n(p_1,p_2,\ldots,p_n) \ge 0$, in particular ($n=1$):
211. $H_1(p_1) \ge 0$.

A common generalization of 151 and 153 is ($p_{jk}=p_jq_{jk} \ge 0$, $\sum_{j=1}^m \sum_{k=1}^n p_{jk}=1$):
22. Generalized additivity (subadditivity):
$$K_{mn}(p_{11},p_{12},\ldots,p_{1n},p_{21},p_{22},\ldots,p_{2n},\ldots,p_{m1},p_{m2},\ldots,p_{mn}) \le K_m\Bigl(\sum_{k=1}^n p_{1k},\sum_{k=1}^n p_{2k},\ldots,\sum_{k=1}^n p_{mk}\Bigr) + K_n\Bigl(\sum_{j=1}^m p_{j1},\sum_{j=1}^m p_{j2},\ldots,\sum_{j=1}^m p_{jn}\Bigr).$$

23. Maximum-entropy for equal probabilities: $H_n(p_1,p_2,\ldots,p_n) \le H_n\bigl(\sum_{k=1}^n p_k/n,\ \sum_{k=1}^n p_k/n,\ \ldots,\ \sum_{k=1}^n p_k/n\bigr)$, in particular:
231. $K_n(p_1,p_2,\ldots,p_n) \le K_n(1/n,1/n,\ldots,1/n)$,
which, with 14, implies also
2311. $K_n(1/n,1/n,\ldots,1/n) \le K_{n+1}\bigl(1/(n+1),1/(n+1),\ldots,1/(n+1)\bigr)$.

For the information-gains we note only:
I 211. Nonnegativity for complete systems of events: $J_n(p_1,p_2,\ldots,p_n \mid q_1,q_2,\ldots,q_n) \ge 0$, and for incomplete systems of events:
I 212. $I_n(p_1,p_2,\ldots,p_n \mid r_1,r_2,\ldots,r_n) \ge 0$ if $p_k \ge r_k$ ($k=1,2,\ldots,n$), while $I_n(p_1,p_2,\ldots,p_n \mid r_1,r_2,\ldots,r_n) \le 0$ if $p_k \le r_k$ ($k=1,2,\ldots,n$).
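Strong additivity 151, subadditivity 22 and the maximum-entropy inequality 231 can all be checked numerically for the Shannon entropy $K_n(p_1,\ldots,p_n)=-\sum_k p_k \operatorname{lb} p_k$. The following is a minimal modern sketch in Python; the particular distributions are chosen arbitrarily for illustration:

```python
import math

def K(ps):
    """Shannon entropy (base 2) of a complete probability distribution."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Strong additivity 151: a two-outcome experiment (probabilities p_j),
# each outcome followed by a three-outcome conditional experiment q_jk.
p = [0.4, 0.6]
q = [[0.2, 0.3, 0.5],
     [0.1, 0.1, 0.8]]
joint = [p[j] * q[j][k] for j in range(2) for k in range(3)]
lhs = K(joint)                                    # K_{mn}(p_j q_jk)
rhs = K(p) + sum(p[j] * K(q[j]) for j in range(2))
print(abs(lhs - rhs) < 1e-12)                     # 151 holds

# Subadditivity 22: joint entropy vs. entropies of the two marginals.
rows = [sum(p[j] * q[j][k] for k in range(3)) for j in range(2)]
cols = [sum(p[j] * q[j][k] for j in range(2)) for k in range(3)]
print(K(joint) <= K(rows) + K(cols) + 1e-12)      # 22 holds

# Maximum entropy 231: equal probabilities maximize K_n.
n = len(joint)
print(K(joint) <= K([1 / n] * n) + 1e-12)         # 231 holds
```

All three lines print `True`; 151 is the chain rule of Shannon entropy, and 22 reduces to 153 with equality exactly when the two component experiments are independent.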
3. Representation.

31. Simple sum: $K_n(p_1,p_2,\ldots,p_n) = \sum_{k=1}^n f(p_k)$.

32. Weighted quasiarithmetic mean: $H_n(p_1,p_2,\ldots,p_n)=g^{-1}\Bigl(\sum_{k=1}^n w(p_k)\,g(H_1(p_k))\Big/\sum_{k=1}^n w(p_k)\Bigr)$.
More specially ($w(p)=p$):
321. Quasiarithmetic mean: $H_n(p_1,p_2,\ldots,p_n) = g^{-1}\Bigl(\sum_{k=1}^n p_k\,g(H_1(p_k))\Big/\sum_{k=1}^n p_k\Bigr)$,
in particular ($n=2$):
3211. $H_2(p_1,p_2)=g^{-1}\Bigl(\dfrac{p_1\,g(H_1(p_1))+p_2\,g(H_1(p_2))}{p_1+p_2}\Bigr)$.
For the $K$-s ($\operatorname{lb}$ is the logarithm with basis 2):
3212. $K_n(p_1,p_2,\ldots,p_n) = g^{-1}\Bigl(\sum_{k=1}^n p_k\,g(-\operatorname{lb} p_k)\Bigr)$.
Even more specially ($g(x)=x$):
3213. Arithmetic mean: $H_n(p_1,p_2,\ldots,p_n) = \sum_{k=1}^n p_k H_1(p_k)\Big/\sum_{k=1}^n p_k$.

33. Partial sum combination: $K_n(p_1,p_2,\ldots,p_n) = \sum_{k=1}^n (p_1+p_2+\cdots+p_k)\,h\Bigl(\dfrac{p_k}{p_1+p_2+\cdots+p_k}\Bigr)$,
in particular ($n=2$ and $n=3$, respectively):
331. $K_2(p_1,p_2)=h(p_2)$,
332. $K_3(p_1,p_2,p_3)=(1-p_3)\,h\Bigl(\dfrac{p_1}{1-p_3}\Bigr)+h(p_3)$.

For information-gains we mention only:
I 321. Quasiarithmetic mean: $I_n(p_1,\ldots,p_n \mid r_1,\ldots,r_n)=g^{-1}\Bigl(\sum_{k=1}^n p_k\,g(I_1(p_k \mid r_k))\Big/\sum_{k=1}^n p_k\Bigr)$.
I 34. Difference combination: $J_n(p_1,p_2,\ldots,p_n \mid r_1,r_2,\ldots,r_n)=\sum_{k=1}^n p_k\bigl(\varphi(p_k)-\varphi(r_k)\bigr)$.
I 35. Quotient combination: $J_n(p_1,p_2,\ldots,p_n \mid r_1,r_2,\ldots,r_n)= c\,\log\Bigl(\sum_{k=1}^n p_k\,\varphi(p_k)/\varphi(r_k)\Bigr)$.

We observe that the algebraic properties 151, 1511, 15111, 151111, 152 also contain representation statements, and similarly the above representation properties, for instance 321, 3212, I 321, could be formulated as "algebraic" conditions.

4. Regularity.

41. $p_k \mapsto H_n(p_1,p_2,\ldots,p_n)$ ($k=1,2,\ldots,n$) are continuous in $[0,1]$. In particular:
411. $H_1$ is continuous.
412. Null-probabilities: $\lim_{p_{n+1}\to 0} H_{n+1}(p_1,p_2,\ldots,p_n,p_{n+1})=H_n(p_1,p_2,\ldots,p_n)$,
in particular ($n=1$):
4121. $\lim_{p_2\to 0} H_2(p_1,p_2)=H_1(p_1)$.
For the $K$-s also (cf. 331):
42. $p_2 \mapsto K_2(1-p_2,p_2)$ is continuous, which by 132 implies
421. $\lim_{p_2\to 0} h(p_2)=\lim_{p_2\to 0} K_2(1-p_2,p_2)=K_1(1)=0$,
and further properties of $h$ in 331:
43. $h$ is increasing in $(0,\tfrac12]$,
431. $h$ is monotonic in $(0,\tfrac12]$,
4311. $h$ is Lebesgue integrable in $(0,1)$,
43111. $h$ is Lebesgue measurable in $(0,1)$.
About $L$ in 152:
44. $L$ is continuous in $\{(p_1,p_2) \mid p_1 \ge 0,\ p_2 \ge 0,\ p_1+p_2 \le 1\}$.
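The partial sum combination 33 is not merely an abstract representation: taken with $h(q)=K_2(1-q,q)$, the binary Shannon entropy, it reproduces the ordinary Shannon entropy exactly (the sum telescopes, since $s_k h(p_k/s_k) = -p_k\operatorname{lb} p_k + s_k\operatorname{lb} s_k - s_{k-1}\operatorname{lb} s_{k-1}$ with $s_k=p_1+\cdots+p_k$). A small Python check, with an arbitrarily chosen distribution:

```python
import math

def lb(x):
    """Logarithm with basis 2, as in property 3212."""
    return math.log2(x)

def h(q):
    """h(q) = K_2(1-q, q), the binary Shannon entropy (property 331)."""
    return 0.0 if q in (0.0, 1.0) else -q * lb(q) - (1 - q) * lb(1 - q)

def K_partial_sum(ps):
    """Representation 33: sum_k (p_1+...+p_k) h(p_k / (p_1+...+p_k))."""
    total, s = 0.0, 0.0
    for p in ps:
        s += p
        total += s * h(p / s)   # the k = 1 term is p_1 * h(1) = 0
    return total

def K_shannon(ps):
    """The Shannon entropy -sum_k p_k lb p_k, for comparison."""
    return -sum(p * lb(p) for p in ps if p > 0)

ps = [0.1, 0.2, 0.3, 0.4]
print(abs(K_partial_sum(ps) - K_shannon(ps)) < 1e-9)  # the two forms agree
```

This is exactly the grouping recursion behind 331 and 332: each step splits off the last outcome and pays $s_k\,h(p_k/s_k)$ for the binary split.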
About $f$ in 31:
45. $f$ is continuous in $[0,1]$.
For $w$ in 32:
46. $w$ is continuous and positive in $[0,1]$.
For $g$ in 32, 321, 3211, 3212, I 321:
47. $x \mapsto \begin{cases} x\,g(-\operatorname{lb} x) & (x>0) \\ 0 & (x=0) \end{cases}$ is strictly concave in $[0,1]$,
which implies
471. $\lim_{x\to 0} x\,g(-\operatorname{lb} x)=0$
and
472. $g$ is increasing in $[0,\infty)$.
48. $g$ is continuous in $[0,\infty)$.
For $\varphi$ in I 34 and I 35:
I 49. $\varphi$ is differentiable in $(0,1)$.

Again these regularity properties mix with the inequalities; for instance, 211 or 2311 could also be considered as regularity statements, and 43, 431, 47, 472 as inequalities.

CHARACTERIZATIONS

C.E. Shannon [29], who was the founder of information theory, has already given the characterization consisting of properties 112, 12, 1511, 2311 and 41 for the "Shannon entropy"

(1) $K_n(p_1,p_2,\ldots,p_n) = -\sum_{k=1}^n p_k \operatorname{lb} p_k$

(that is, 112, 12, 1511, 2311 and 41 imply (1)). A.J. Khinchin [24] has proved that (1) follows also from 112, 12, 14, 151, 21, 231, and 41. D.K. Faddeev [17] reduced the postulates further to 112, 12, 15111, 42 and showed that they already imply (1). Further reductions of Faddeev's system of postulates characterizing (1) were made by H. Tverberg [30], who replaced 42 by 4311, by D.G. Kendall [23] with 43, by P.M. Lee [25] with 43111, and by Z. Daróczy [16] (cf. R. Borges [7]) with 431 or with 421. Z. Daróczy [16] has also proved that 112, 121, 122, 33 (in particular, 331 and 332) and 421 imply (1), and that 112, 121, 151111 and 421 imply (1) for $n=2,3$. Also, Z. Daróczy [13, 15] (cf. N. Pintacuda [26]) deduced (1) from 112, 12, 14, 152, 1531, 44. A seemingly quite different characterization, which however proved to be related, was given for (1) by T.W. Chaundy and J.B. McLeod [10]: 112, 153, 31 and 45 imply (1). Here 153 can be replaced by 1533, see J. Aczél and Z. Daróczy [2], or by 1531, see Z. Daróczy [15].
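The postulates in Faddeev's system are easy to test against the Shannon entropy (1) itself: the branching property 151111 says that splitting the last outcome $p_2$ into $p_2q$ and $p_2(1-q)$ adds exactly $p_2 K_2(q,1-q)$ to the entropy. A brief Python sketch, with arbitrarily chosen probabilities:

```python
import math

def K(ps):
    """The Shannon entropy (1): K_n(p_1,...,p_n) = -sum_k p_k lb p_k."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Branching property 151111: split the outcome p_2 into p_2*q, p_2*(1-q).
p1, p2, q = 0.3, 0.7, 0.25
lhs = K([p1, p2 * q, p2 * (1 - q)])
rhs = K([p1, p2]) + p2 * K([q, 1 - q])
print(abs(lhs - rhs) < 1e-12)        # 151111 holds for (1)

# Unit property 112: K_2(1/2, 1/2) = 1.
print(abs(K([0.5, 0.5]) - 1.0) < 1e-12)
```

Both checks print `True`; the cited characterization theorems go in the converse direction, showing that such postulates force the form (1).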
