DMV Seminar, Band 9

P. Gaenssler, W. Stute
Seminar on Empirical Processes
Springer Basel AG, 1987

Authors:
Peter Gaenssler, Mathematisches Institut, Universität München, Theresienstrasse 39, D-8000 München 2
Winfried Stute, Mathematisches Institut, Universität Giessen, Arndtstrasse 2, D-6300 Giessen

The seminar was made possible through the support of the Stiftung Volkswagenwerk.

CIP-Kurztitelaufnahme der Deutschen Bibliothek: Gaenssler, Peter: Seminar on Empirical Processes / P. Gaenssler; W. Stute. - Basel; Boston: Birkhäuser, 1987. (DMV-Seminar; Bd. 9) NE: Stute, Winfried; Seminar on Empirical Processes <1985, Düsseldorf>; Deutsche Mathematiker-Vereinigung: DMV-Seminar

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the copyright owner.

© Springer Basel AG 1987. Originally published by Birkhäuser Verlag Basel in 1987.
ISBN 978-3-7643-1921-2
ISBN 978-3-0348-6269-1 (eBook)
DOI 10.1007/978-3-0348-6269-1

CONTENTS
I. Foundations 1
II. Local and global structure of empirical processes 11
III. Goodness of fit 17
IV. Conditional empirical processes 40
V. Copula processes 49
VI. Empirical processes for censored data 58
VII. Parameter estimation in smooth empirical processes 75
VIII. Bootstrapping 82
IX. Vapnik-Chervonenkis theory 87
List of Symbols 110

These notes are based on lectures given in the "Seminar on Empirical Processes" held at Schloss Mickeln, Düsseldorf, from September 8-13, 1985. According to the intention of the DMV seminar series, we organized the material so as to give (as we hope) a fresh approach to empirical processes, thereby elaborating some of the main streams of the theory, being aware that this is necessarily very subjective. We are grateful to the Düsseldorf people for their assistance and hospitality during this wonderful week. Many thanks also to the participants of the seminar for all their interest and the lively discussions. Finally, we heartily thank Mrs. Lenk (Giessen), who was responsible for the excellent layout of the present booklet.

P. Gaenssler, W. Stute

I. Foundations.

Throughout this text we shall assume that $\xi_1, \xi_2, \ldots, \xi_n, \ldots$ is a finite or infinite sequence of independent identically distributed (i.i.d.) random elements in some sample space $\mathcal{X}$, defined on some probability space $(\Omega, \mathcal{A}, \mathbb{P})$. In most cases $\mathcal{X}$ will be a Euclidean space $\mathbb{R}^k$. For such $\xi$'s, write
$$F(t) = \mathbb{P}(\xi \le t), \qquad t \in \mathbb{R}^k,$$
for the corresponding (unknown) distribution function (d.f.). The best studied non-parametric estimate of $F$ is, for sample size $n \in \mathbb{N}$, the empirical d.f.
$$F_n(t) = n^{-1} \#\{1 \le i \le n : \xi_i \le t\}, \qquad t \in \mathbb{R}^k.$$
In other words, $F_n$ is the d.f. of the empirical measure
$$\mu_n = n^{-1} \sum_{i=1}^{n} \delta_{\xi_i},$$
where $\delta_x$ is the Dirac measure in $x \in \mathcal{X}$. The theory of empirical d.f.'s is best elaborated for $k = 1$, i.e. for real data. In this Foundations section we shall recall some basic facts about univariate empirical d.f.'s. See Gaenssler and Stute (1979) and Shorack and Wellner (1986) for more details.

Roughly speaking, $F_n$ may be viewed either as
(a) a discrete random measure, or as
(b) a stochastic process with nondecreasing paths taking their values in the space $D[-\infty,\infty]$ of all right-continuous functions on $[-\infty,\infty]$ with left-hand limits (cf. Pollard (1984)).

Viewed as a measure, we may compute integrals w.r.t. $F_n$; e.g., for a given score function $\varphi$,
$$\int \varphi(t)\, F_n(dt) = n^{-1} \sum_{i=1}^{n} \varphi(\xi_i).$$
For integrable $\varphi$, $\int \varphi(t)\, F_n(dt) \to \int \varphi(t)\, F(dt)$ with probability one.
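As a purely illustrative aside (not part of the original text), the two facts just stated, namely that $\int \varphi\, dF_n$ is simply the sample mean $n^{-1}\sum_{i=1}^{n} \varphi(\xi_i)$ and that it converges to $\int \varphi\, dF$ with probability one, can be checked numerically. The following minimal Python sketch assumes NumPy; the standard normal sample and the score function $\varphi(t) = t^2$ are arbitrary choices made only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    xi = rng.normal(size=n)            # i.i.d. standard normal sample xi_1, ..., xi_n

    def F_n(t):
        # empirical d.f.: fraction of the observations that are <= t
        return np.mean(xi <= t)

    def integral_wrt_Fn(phi):
        # integral of phi w.r.t. the empirical measure mu_n = n^{-1} sum_i delta_{xi_i}
        return np.mean(phi(xi))

    print(F_n(0.0))                    # close to F(0) = 1/2 for the standard normal
    print(integral_wrt_Fn(np.square))  # close to E[phi(xi)] = 1, by the strong law

Increasing $n$ brings both printed values closer to their population counterparts, in line with the almost sure convergence noted above.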
Integrating a function $\varphi = \varphi(x,y)$ of two variates, say, w.r.t. the product of $F_n$ with itself, we get
$$\iint \varphi(x,y)\, F_n(dx)\, F_n(dy) = n^{-2} \sum_{1 \le i,j \le n} \varphi(\xi_i, \xi_j),$$
which is closely related to a U-statistic.

Under (b) we may investigate inverses of $F_n$,
$$F_n^{-1}(u) = \inf\{t : F_n(t) \ge u\}, \qquad 0 < u < 1.$$
For $u = i/n$, $F_n^{-1}(i/n) = \xi_{i:n}$, the $i$-th order statistic of the sample. On the other hand, evaluating $F_n$ at the data, we obtain ranks: $nF_n(\xi_i)$ is the rank of $\xi_i$ in the sample (in the absence of ties). In summary, we see that many basic statistics may be easily expressed in terms of $F_n$.

From elementary probability theory there are lots of finite sample and asymptotic results which apply to empirical distribution functions. First of all,
$$(1.1)\qquad \mathbb{E}[F_n(t)] = F(t),$$
i.e. $F_n(t)$ is an unbiased estimator of $F(t)$. Furthermore,
$$(1.2)\qquad \mathrm{Var}(F_n(t)) = n^{-1} F(t)\,(1 - F(t)),$$
which implies $F_n(t) \to F(t)$ in quadratic mean. By the strong law of large numbers,
$$(1.3)\qquad F_n(t) \to F(t) \quad \text{with probability one}.$$
The Glivenko-Cantelli theorem asserts that even uniform convergence holds. Finally, the CLT yields
$$\alpha_n(t) := n^{1/2}\,[F_n(t) - F(t)] \to N\bigl(0,\, F(t)(1 - F(t))\bigr)$$
in distribution. The collection of random variables $\{\alpha_n(t) : t \in \mathbb{R}\}$ is the empirical process of sample size $n \ge 1$. Donsker's invariance principle states that $\alpha_n \to B^0 \circ F$ in distribution in the space $D[-\infty,\infty]$, where $B^0$ is a Brownian Bridge. For finite $n$, the exact distribution of $nF_n(t)$ is binomial with parameters $n$ and $F(t)$.
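As a closing illustration (again not from the original text), the following Python sketch numerically checks three of the statements above: that $F_n^{-1}(i/n)$ is the $i$-th order statistic, that $nF_n(\xi_i)$ gives the rank of $\xi_i$, and that $\sup_t |F_n(t) - F(t)|$ shrinks as $n$ grows, as the Glivenko-Cantelli theorem asserts. NumPy, the uniform samples, and the sample sizes are assumptions made only for the sake of the example.

    import numpy as np

    rng = np.random.default_rng(1)

    def empirical_df(sample, t):
        # F_n(t): fraction of the observations that are <= t
        return np.mean(sample <= t)

    def empirical_quantile(sample, u):
        # F_n^{-1}(u) = inf{t : F_n(t) >= u}, read off the sorted sample
        s = np.sort(sample)
        k = int(np.ceil(u * len(s)))          # smallest k with k/n >= u
        return s[max(k, 1) - 1]

    n = 10
    xi = rng.uniform(size=n)                  # continuous F, so no ties (a.s.)
    i = 4
    print(empirical_quantile(xi, i / n) == np.sort(xi)[i - 1])   # True: the i-th order statistic
    print([int(round(n * empirical_df(xi, x))) for x in xi])     # the ranks of xi_1, ..., xi_n

    # Glivenko-Cantelli: the maximum of |F_n(t) - F(t)| over a fine grid shrinks with n.
    for m in (100, 1000, 10000):
        sample = rng.uniform(size=m)          # true d.f. is F(t) = t on [0, 1]
        grid = np.linspace(0.0, 1.0, 2001)
        Fn = np.array([empirical_df(sample, t) for t in grid])
        print(m, np.max(np.abs(Fn - grid)))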
