
Gaussian and Non-Gaussian Linear Time Series and Random Fields

250 Pages · 2000 · 8.38 MB · English

Springer Series in Statistics

Advisors: P. Bickel, P. Diggle, S. Fienberg, K. Krickeberg, I. Olkin, N. Wermuth, S. Zeger

Springer-Science+Business Media, LLC

Springer Series in Statistics

Andersen/Borgan/Gill/Keiding: Statistical Models Based on Counting Processes.
Berger: Statistical Decision Theory and Bayesian Analysis, 2nd edition.
Bolfarine/Zacks: Prediction Theory for Finite Populations.
Borg/Groenen: Modern Multidimensional Scaling: Theory and Applications.
Brockwell/Davis: Time Series: Theory and Methods, 2nd edition.
Efromovich: Nonparametric Curve Estimation: Methods, Theory, and Applications.
Fahrmeir/Tutz: Multivariate Statistical Modelling Based on Generalized Linear Models.
Farebrother: Fitting Linear Relationships: A History of the Calculus of Observations 1750-1900.
Federer: Statistical Design and Analysis for Intercropping Experiments, Volume I: Two Crops.
Federer: Statistical Design and Analysis for Intercropping Experiments, Volume II: Three or More Crops.
Fienberg/Hoaglin/Kruskal/Tanur (Eds.): A Statistical Model: Frederick Mosteller's Contributions to Statistics, Science and Public Policy.
Fisher/Sen: The Collected Works of Wassily Hoeffding.
Good: Permutation Tests: A Practical Guide to Resampling Methods for Testing Hypotheses.
Gourieroux: ARCH Models and Financial Applications.
Grandell: Aspects of Risk Theory.
Haberman: Advanced Statistics, Volume I: Description of Populations.
Hall: The Bootstrap and Edgeworth Expansion.
Härdle: Smoothing Techniques: With Implementation in S.
Hart: Nonparametric Smoothing and Lack-of-Fit Tests.
Hartigan: Bayes Theory.
Hedayat/Sloane/Stufken: Orthogonal Arrays: Theory and Applications.
Heyde: Quasi-Likelihood and its Application: A General Approach to Optimal Parameter Estimation.
Huet/Bouvier/Gruet/Jolivet: Statistical Tools for Nonlinear Regression: A Practical Guide with S-PLUS Examples.
Kolen/Brennan: Test Equating: Methods and Practices.
Kotz/Johnson (Eds.): Breakthroughs in Statistics Volume I.
Kotz/Johnson (Eds.): Breakthroughs in Statistics Volume II.
Kotz/Johnson (Eds.): Breakthroughs in Statistics Volume III.
Küchler/Sørensen: Exponential Families of Stochastic Processes.
Le Cam: Asymptotic Methods in Statistical Decision Theory.
Le Cam/Yang: Asymptotics in Statistics: Some Basic Concepts.
Longford: Models for Uncertainty in Educational Testing.
Miller, Jr.: Simultaneous Statistical Inference, 2nd edition.
Mosteller/Wallace: Applied Bayesian and Classical Inference: The Case of the Federalist Papers.
Parzen/Tanabe/Kitagawa: Selected Papers of Hirotugu Akaike.
Politis/Romano/Wolf: Subsampling.

(continued after index)

Murray Rosenblatt

Gaussian and Non-Gaussian Linear Time Series and Random Fields

Springer

Murray Rosenblatt
Department of Mathematics
University of California, San Diego
La Jolla, CA 92093-0112
USA

Library of Congress Cataloging-in-Publication Data

Rosenblatt, Murray.
Gaussian and non-Gaussian linear time series and random fields / Murray Rosenblatt.
p. cm. - (Springer series in statistics)
Includes bibliographical references and index.
ISBN 978-1-4612-7067-6
ISBN 978-1-4612-1262-1 (eBook)
DOI 10.1007/978-1-4612-1262-1
1. Time-series analysis. 2. Random fields. 3. Gaussian processes. I. Title. II. Series.
QA280.R667 2000
519.5'5-dc21 99-42811

Printed on acid-free paper.

© 2000 Springer Science+Business Media New York
Originally published by Springer-Verlag New York, Inc. in 2000

All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.
The use of general descriptive names, trade names, trademarks, etc., in this publication, even if the former are not especially identified, is not to be taken as a sign that such names, as understood by the Trade Marks and Merchandise Marks Act, may accordingly be used freely by anyone.

Production managed by Timothy Taylor; manufacturing supervised by Joe Quatela.
Photocomposed copy prepared from the author's TEX files.

9 8 7 6 5 4 3 2 1

ISBN 978-1-4612-7067-6

With distant but warm memories of Mark Kac and Emil Post

Preface

Much of this book is concerned with autoregressive and moving average linear stationary sequences and random fields. These models are part of the classical literature in time series analysis, particularly in the Gaussian case. There is a large literature on probabilistic and statistical aspects of these models, to a great extent in the Gaussian context. In the Gaussian case best predictors are linear, and there is an extensive study of the asymptotics of asymptotically optimal estimators. Some discussion of these classical results is given to provide a contrast with what may occur in the non-Gaussian case. There the prediction problem may be nonlinear, and problems of estimation can have a certain complexity due to the richer structure that non-Gaussian models may have.

Gaussian stationary sequences have a reversible probability structure; that is, the probability structure with time increasing in the usual manner is the same as that with time reversed. Chapter 1 considers the question of reversibility for linear stationary sequences and gives necessary and sufficient conditions for reversibility. A neat result of Breidt and Davis on reversibility is presented. A simple but elegant result of Cheng is also given that specifies conditions for the identifiability of the filter coefficients that specify a linear non-Gaussian random field.
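Reversibility can also be probed empirically through odd-order moments: for a time-reversible stationary sequence, E[X_t² X_{t+k}] = E[X_t X_{t+k}²] for every lag k, so the sample difference gives a simple diagnostic. A minimal sketch (the function name and the choice of this particular moment pair are illustrative, not taken from the book):

```python
import numpy as np

def reversibility_gap(x, lag):
    """Sample estimate of E[X_t^2 X_{t+lag}] - E[X_t X_{t+lag}^2],
    which is zero in expectation for any time-reversible stationary
    sequence; a clearly nonzero value flags irreversibility."""
    a, b = x[:-lag], x[lag:]
    return float(np.mean(a**2 * b) - np.mean(a * b**2))

# i.i.d. symmetric noise (and any Gaussian stationary sequence) is reversible,
# so the gap should hover near zero up to sampling fluctuation
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
gap = reversibility_gap(x, 1)
```

For a nonminimum phase non-Gaussian moving average, by contrast, such gaps are typically bounded away from zero, which is one route to the identifiability questions Chapter 1 takes up.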
A stationary autoregressive moving average sequence is called minimum phase if the process has a one-sided linear representation in terms of the present and the past of the independent sequence generating the process, and if there is also a one-sided linear representation for the independent sequence in terms of the present and the past of the process. This is the usual assumption in the classical literature, and it can always be assumed to be the case for Gaussian autoregressive moving average sequences. The interesting new phenomena for these processes arise in the nonminimum phase non-Gaussian case. However, Chapter 2 gives a discussion of parameter estimation in the Gaussian case. The same estimates (which we call quasi-Gaussian estimates) can also be used in the minimum phase non-Gaussian case, though they are not asymptotically optimal. Our discussion follows much of that given in Brockwell and Davis, who give a somewhat more detailed presentation.

Homogeneous Gaussian random fields are considered in Chapter 3. The presentation generally follows that of Rosanov and is focused on the interpolation problem. One of the primary objects is to lay out the difference between the structure of finite parameter Gaussian sequences and random fields. There are consequences for the character of certain types of parameter estimates, as one will see later on.

A modification of the method used in the one-dimensional case for parameter estimates of Gaussian autoregressive moving average schemes can be used for Gaussian schemes that are fields in a low-dimensional environment. The classical approximation due to Whittle as well as tapering are employed. This as well as related material is considered in Chapter 4.

If a non-Gaussian one-dimensional scheme is minimum phase, the best predictor in mean square is still linear. However, if the non-Gaussian scheme is nonminimum phase, in most cases the best predictor in mean square is nonlinear.
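Concretely, for an ARMA scheme the minimum phase condition amounts to requiring that both the autoregressive and moving average polynomials have all of their roots strictly outside the unit circle, which is easy to check numerically. A hedged sketch (the function names and the coefficient-list convention are mine, not the book's):

```python
import numpy as np

def roots_of(poly):
    # poly = [c0, c1, ..., cp], the polynomial c0 + c1*z + ... + cp*z^p;
    # np.roots wants the highest-degree coefficient first, hence the reversal
    return np.roots(poly[::-1])

def is_minimum_phase(ar_poly, ma_poly):
    """True when every root of both the AR polynomial phi(z) and the MA
    polynomial theta(z) lies strictly outside the unit circle, i.e. the
    scheme is both causal and invertible."""
    roots = np.concatenate([roots_of(ar_poly), roots_of(ma_poly)])
    return bool(np.all(np.abs(roots) > 1.0))

# AR(1): X_t = 0.5 X_{t-1} + e_t, so phi(z) = 1 - 0.5 z has root z = 2
print(is_minimum_phase([1, -0.5], [1]))   # True
# X_t = 2 X_{t-1} + e_t: phi(z) = 1 - 2 z has root 1/2 inside the circle
print(is_minimum_phase([1, -2.0], [1]))   # False
```

In the Gaussian case one may always pass to an equivalent minimum phase representation; in the non-Gaussian case the root configuration is identifiable, which is what makes the nonminimum phase setting genuinely different.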
Some results of Rosenblatt are given here. Some particular examples are discussed in detail. An inequality comparing the mean square error of prediction for the best linear predictor and the best (possibly nonlinear) predictor, based on entropy, is derived. These topics are discussed in Chapter 5.

Chapter 6 deals with the quasi-Gaussian likelihood (formally, the expression for a possibly non-Gaussian minimum phase process computed as if it were the likelihood of a Gaussian process). Most computational schemes are based on maximizing this formal likelihood. The fluctuation of the likelihood is considered as a random process. The results do give some insight into the moderate sample behavior of estimates based on this likelihood. These results are due to Michael Kramer (his Ph.D. thesis at the University of California, San Diego), and the derivation is basically that given in his unpublished thesis.

Chapter 7 introduces concepts that relate to random fields of a possibly non-Gaussian character. Markov fields and Markov chains are considered. A limit theorem on entropy that has some relevance in the discussion of maximum likelihood for non-Gaussian autoregressive schemes is derived.

In the final chapter approximate maximum likelihood is discussed for nonminimum phase autoregressive and autoregressive moving average schemes under the assumption that the density of the independent random variables generating the scheme is known and satisfies appropriate smoothness and positivity conditions. Asymptotic results of Lii and Rosenblatt are presented. A nonparametric scheme suggested by Wiggins and commented on by Donoho and Gassiat is developed. Finally, a simple example of what might be termed superefficiency is given. This is an example of what can occur when one has additional information of consequence beyond that usually available.
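A standard frequency-domain form of the quasi-Gaussian likelihood is Whittle's approximation: up to constants, the negative log-likelihood is a sum over the Fourier frequencies of log f(λ; θ) + Iₙ(λ)/f(λ; θ), where Iₙ is the periodogram and f the model spectral density. A minimal sketch for a pure AR model with unit innovation variance (the function name and these normalization choices are illustrative assumptions, not the book's notation):

```python
import numpy as np

def whittle_objective(x, ar):
    """Negative Whittle (quasi-Gaussian) log-likelihood, up to constants,
    for the AR model X_t = sum_j ar[j-1] * X_{t-j} + e_t with Var(e_t) = 1."""
    n = len(x)
    k = np.arange(1, (n - 1) // 2 + 1)
    freqs = 2 * np.pi * k / n                      # Fourier frequencies in (0, pi)
    periodogram = np.abs(np.fft.fft(x)[k]) ** 2 / (2 * np.pi * n)
    j = np.arange(1, len(ar) + 1)
    # AR transfer function phi(e^{-i*lambda}) = 1 - sum_j phi_j e^{-i*j*lambda}
    phi = 1.0 - np.exp(-1j * np.outer(freqs, j)) @ np.asarray(ar, dtype=float)
    spec = 1.0 / (2 * np.pi * np.abs(phi) ** 2)    # model spectral density f(lambda)
    return float(np.sum(np.log(spec) + periodogram / spec))

# Illustrative check on simulated AR(1) data with phi = 0.5
rng = np.random.default_rng(1)
e = rng.standard_normal(512)
x = np.empty(512)
x[0] = e[0]
for t in range(1, 512):
    x[t] = 0.5 * x[t - 1] + e[t]
```

In practice one would minimize this objective over the coefficient vector (e.g. with a generic numerical optimizer); the quasi-Gaussian estimate is the minimizer. Since the objective depends on the coefficients only through |φ(e^{-iλ})|², it cannot separate minimum from nonminimum phase root configurations, which is exactly why it is tied to the minimum phase case.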
Various of the questions dealt with in the book relate to problems that arise in a number of applied areas: questions of prediction, parameter estimation, and deconvolution. They have interest from both theory and application. Deconvolution problems of this type arise in certain types of seismic investigations (Wiggins 1978 and Donoho 1981). Related questions arise in "speckle masking" in astronomy, where methods are used to overcome the degradation of telescopic images caused by atmospheric turbulence (Lohmann et al. 1983). There are quite a number of open questions even for this apparently simple class of models. Remarks are also made in notes relating questions here to those on nonlinear representations of stationary sequences.

It is worthwhile to mention how this book compares to the excellent treatises of Brockwell and Davis 1991 on the one hand and Tong 1990 on the other. Brockwell and Davis discuss the classical linear models of time series analysis and hence models which are basically minimum phase. Tong's book is concerned with simple but interesting nonlinear models from a dynamical system perspective. This dynamical system perspective is what we might consider a nonlinear orientation analogous to the minimum phase conditions. As already noted, the novelty here is that of nonminimum phase processes, a structure that is especially persuasive in the case of random fields of dimension two or higher. Some references to related literature are given in the text, but more extensive referencing is to be found in the notes at the end of the book. However, the referrals cannot claim to be complete.

The names of Mark Kac and Emil Post are noted in affectionate memory of undergraduate and graduate school. I had taken a course in real variables with Post at City College of New York as an undergraduate, and as noted in Martin Davis' introduction to The Collected Works of Emil Post 1994, it could be a terse and tense set of sessions.
To my amazement, a reading course I later took with him based on de la Vallée Poussin's Intégrales de Lebesgue 1916 was relaxed and insightful, and showed him to be a rather warm human being. Mark Kac was my advisor when I was working on a doctoral topic at Cornell University later on. Though I took most of my courses in probability theory and statistics with William Feller, an entertaining and at times amusingly dogmatic lecturer, I am thankful that I had Mark as an advisor. He suggested a thesis topic, was available for consultation, but didn't press too firmly and left one to make one's own way. He was a delightful person, with great power as an analyst and strong interests in statistical physics. I feel that I owe much to both Kac and Post, as well as I can recall through the filter of time.

Last of all I should like to thank Richard Bradley, Richard Davis, Peter Lewis, and Keh-shin Lii, who read through parts of the manuscript and made helpful comments, and Judy Gregg, who typed the manuscript and made many sensible suggestions. I appreciate the assistance of the University of California, San Diego in making a grant that supported the typing of the manuscript.

La Jolla, California 1999
Murray Rosenblatt

Contents

Preface vii

1 Reversibility and Identifiability 1
1.1 Linear Sequences and the Gaussian Property 1
1.2 Reversibility 4
1.3 Identifiability 8
1.4 Minimum and Nonminimum Phase Sequences 10

2 Minimum Phase Estimation 15
2.1 The Minimum Phase Case and the Quasi-Gaussian Likelihood 15
2.2 Consistency 18
2.3 The Asymptotic Distribution 22

3 Homogeneous Gaussian Random Fields 27
3.1 Regular and Singular Fields 27
3.2 An Isometry 29
3.3 L-Fields and L-Markov Fields 34

4 Cumulants, Mixing and Estimation for Gaussian Fields 41
4.1 Moments and Cumulants 41
