
Time Series Models PDF

213 pages · 2022 · 2.466 MB · English

Preview: Time Series Models

Lecture Notes in Statistics 224

Manfred Deistler · Wolfgang Scherrer

Time Series Models

Lecture Notes in Statistics, Volume 224
Series Editors: Peter Diggle, Department of Mathematics, Lancaster University, Lancaster, UK; Scott Zeger, Baltimore, MD, USA; Ursula Gather, Dortmund, Germany; Peter Bühlmann, Seminar for Statistics, ETH Zürich, Zürich, Switzerland

Lecture Notes in Statistics (LNS) includes research work on topics that are more specialized than volumes in Springer Series in Statistics (SSS). The series editors are currently Peter Bühlmann, Peter Diggle, Ursula Gather, and Scott Zeger. Peter Bickel, Ingram Olkin, and Stephen Fienberg were editors of the series for many years.

Manfred Deistler, Institute of Statistics and Mathematical Methods in Economics, TU Wien, Vienna, Austria, and Institute for Statistics and Mathematics, WU Wien, Vienna, Austria
Wolfgang Scherrer, Institute of Statistics and Mathematical Methods in Economics, TU Wien, Vienna, Austria

ISSN 0930-0325; ISSN 2197-7186 (electronic)
Lecture Notes in Statistics
ISBN 978-3-031-13212-4; ISBN 978-3-031-13213-1 (eBook)
https://doi.org/10.1007/978-3-031-13213-1
Mathematics Subject Classification: 60G10, 62M10, 62M15, 62M20, 93E12, 62P20, 62P30, 62H25

Translation from the German language edition "Modelle der Zeitreihenanalyse" by Manfred Deistler and Wolfgang Scherrer, © Springer International Publishing AG 2018. Published by Birkhäuser. All Rights Reserved.

© Springer Nature Switzerland AG 2022

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This Springer imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland.

We both want to thank our families for their understanding and support.

Preface

A time series consists of observations or measurements ordered in time. Thus the information is contained not only in the observed values, but also in their ordering in time. This is contrary to the case of "classical" i.i.d. statistics, where a permutation of the data leaves the results unchanged. Time series analysis is concerned with the extraction of information from time series data and hence is a part of statistics. As in statistics in general, the focus of time series analysis is on data-driven modeling, where the data are assumed to be generated by a stochastic model. In time series analysis, these models are naturally often dynamic, i.e. they describe the evolution in time of the variables considered. The models obtained in this way can be used, for instance, for analysis, forecasting, filtering or control. However, data-driven modeling is not the only branch of time series analysis; for instance, (not model-based) denoising of signals or extraction of features, such as hidden cycles, are important parts of time series analysis as well.
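The point that the ordering carries the information can be illustrated numerically: permuting a series leaves every "static" summary, such as the histogram, unchanged, but destroys the autocovariance structure. A minimal sketch in Python (numpy is an assumption; the example is ours, not taken from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate an AR(1) series: strongly dependent over time
T = 2000
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal()

def acov1(y):
    """Sample autocovariance at lag 1."""
    y = y - y.mean()
    return (y[:-1] @ y[1:]) / len(y)

x_perm = rng.permutation(x)  # same values, time order destroyed

print(acov1(x))       # large: close to 0.9 * var(x)
print(acov1(x_perm))  # close to 0
```

The same values in a different order give an entirely different dependence structure, which is precisely why permutation-invariant "classical" methods are insufficient here.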
The history of time series analysis, or, to be more precise, the development and application of methods of time series analysis (which transcend analysis by the "naked eye"), goes back to the turn of the eighteenth to the nineteenth century and was triggered by the question whether deviations of the orbits of planets from the elliptic form are detectable. The deviations from elliptic orbits described by Kepler are a consequence of the fact that the planets and the sun form a multi-body system rather than a two-body system. The so-called periodogram was developed in this context. The periodogram was already in use for the analysis of business cycle data in the nineteenth century. Moving average (MA) and autoregressive (AR) processes were introduced by G. U. Yule in the 1920s. The theory of (univariate) stationary processes was pioneered in the 1930s and 1940s by A. N. Kolmogorov, H. Cramér, N. Wiener and K. Karhunen. The multivariate case was developed in particular by Y. A. Rozanov (see Rozanov (1967) and the references given therein). This theory is still an essential foundation for the analysis of time series. The development of time series analysis took place in a number of different fields such as econometrics, system and control theory, signal processing and statistics. The main advancements in time series analysis over the last 75 years were as follows:

- The analysis of the problem of identifiability and the maximum likelihood estimation in multivariate, "structural" ARX systems, pioneered by the Cowles Commission.
- The development of non-parametric spectral estimation, by J. Tukey in particular.
- The analysis of AR and ARMA systems (ARX and ARMAX systems, respectively), in particular by T. W. Anderson and E. J. Hannan.
The book Box and Jenkins (1976) triggered the widespread use of these systems in various fields of application. Subsequently, a generalization to the multivariate case has been carried out, as documented in the books Caines (1988), Hannan and Deistler (2012), Ljung (1999), Lütkepohl (1993) and Reinsel (1997).
- The analysis of state-space systems and the related Kalman filter, mainly by R. E. Kalman.
- The introduction and analysis of methods for order estimation, e.g. by H. Akaike and J. Rissanen.
- Over the last thirty years, the analysis of integrated and cointegrated processes, i.e. of a particular class of non-stationary processes, has received great attention in econometrics. Important contributions in this context are due to C. W. J. Granger, R. F. Engle, P. C. B. Phillips and S. Johansen.
- Models for forecasting conditional variances for risk assessment for financial time series, such as ARCH or GARCH models, were introduced by R. F. Engle.
- Dynamic factor models have become popular in the last decades, since they can deal with high-dimensional time series.
- The huge area of non-linear time series models and of their estimation (see, e.g. the book Pötscher and Prucha (1997)) has undergone a rapid development over the last 25 years.
- Recently, neural nets have been successfully used for the analysis and prediction of time series; see e.g. the book Goodfellow et al. (2016).

This book by no means covers all the important parts of time series analysis. We focus on models for time series, and in particular on the most important class of linear models. We discuss in detail weakly stationary processes as well as important subclasses, including AR and ARMA processes. Our analysis focuses on the multivariate case. Both the "linear" theory of weakly stationary processes and linear dynamical systems still form the main part of the foundations of time series analysis, despite the fact that non-stationarity and non-linearity are of major importance.
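The periodogram mentioned in the historical overview above is easy to compute from a finite record. A minimal sketch in Python (numpy; an illustration of the standard definition, not code from the book):

```python
import numpy as np

def periodogram(x):
    """Periodogram I(lambda_k) of a univariate series at the Fourier
    frequencies lambda_k = 2*pi*k/T (the standard definition)."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    x = x - x.mean()                           # remove the sample mean
    dft = np.fft.rfft(x)                       # DFT at k = 0, ..., floor(T/2)
    freqs = 2 * np.pi * np.arange(len(dft)) / T
    return freqs, (np.abs(dft) ** 2) / (2 * np.pi * T)

# a noisy cycle of period 25: the periodogram peaks near 2*pi/25
rng = np.random.default_rng(0)
T = 500
t = np.arange(T)
x = np.cos(2 * np.pi * t / 25) + rng.normal(size=T)
freqs, I = periodogram(x)
print(freqs[np.argmax(I[1:]) + 1])             # near 2*pi/25
```

Detecting hidden cycles this way is exactly the use the periodogram was first put to; Tukey's non-parametric spectral estimators, mentioned in the list above, essentially smooth this raw periodogram, which by itself is not a consistent estimator of the spectral density.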
It is a specific feature of time series analysis, as opposed to other areas of statistics, that an accurate analysis of the models is important for statistical analysis in the narrow sense. We aim to provide readers with a solid basis that allows them to understand large parts of the current literature in the field of time series analysis. In a certain sense, this book is meant to convey essential core knowledge. Compared to other textbooks in this area, this book focuses on the very core of the structure theory for multivariate linear time series models. We try to give a comprehensible and detailed presentation of this material.

Primarily, this book addresses mathematicians and advanced students of mathematics. However, we believe that it is equally accessible and useful for researchers in the fields of econometrics, financial mathematics, control engineering or signal processing. Knowledge of measure and probability theory and of linear algebra is a prerequisite. Moreover, the reader should be familiar with the basics of functional analysis (the theory of Hilbert spaces) and of complex analysis.

The subject matter is broken down into several chapters: Chapter 1 provides basic definitions of (weakly) stationary processes, their embedding in the Hilbert space of square integrable random variables, as well as the definition of the corresponding covariance functions; the latter, for many problems, contain essential information about the underlying stationary process. Important classes of models for stationary processes are discussed at the end of this chapter. Chapter 2 deals with the linear least squares prediction of stationary processes. The central result here is Wold's decomposition, which gives substantial insight into the structure of general stationary processes. While the description of stationary processes in Chaps. 1 and 2 is done in the time domain, Chap. 3 covers the frequency domain.
Key results are the spectral representation of covariance functions and of stationary processes, both being Fourier representations. From the spectral representations we obtain the spectral distribution function and the spectral density, respectively, which contain the same information about the underlying process as the covariance function. Due to these Fourier representations, linear dynamical transformations of stationary processes correspond to a multiplication of functions (in the frequency domain) and therefore are often easier to represent and interpret. This chapter is the most challenging one in terms of mathematics, and its results will be used in subsequent chapters. However, the remaining chapters are accessible even if the proofs in this chapter have been skipped on a first reading.

Chapter 4 describes linear dynamical transformations of stationary processes in the time and frequency domain, as well as the corresponding transformation of the second moments. Such linear transformations are important models for real-world systems and are used to construct classes of stationary processes such as AR and ARMA processes. In this context, we also discuss the solutions of linear stochastic difference equations. At the end of this chapter, we discuss the Wiener filter, which gives the best approximation, in a mean squares sense, of a stationary process by a linear transformation of a second process.

Chapter 5 deals with AR systems and AR processes, which are the most important model class in time series analysis. This class allows for an arbitrarily close approximation of regular processes. Estimation and prediction of these processes are very simple. Beyond the stationary case, AR systems also serve as models for integrated and cointegrated processes, which have gained great importance in econometrics.

Chapter 6 treats ARMA models and ARMA processes. We show that the class of ARMA processes is exactly the class of stationary processes with rational spectral density.
As is the case with AR processes, any regular, stationary process can be approximated by an ARMA process to any degree of accuracy. ARMA processes are more flexible, which means that in many cases fewer parameters are needed for approximation. However, the structure of the class of ARMA processes is much more complex compared to the AR case. There is a so-called problem of identifiability, and the relationship between the population second moments of the observations and the ARMA parameters is not given by a linear system of equations, contrary to the AR case. Hence, estimation of ARMA parameters (which is not discussed here) is much more difficult than in the AR case.

Chapter 7 discusses state-space systems, which are of central importance in control engineering. We will show that state-space systems with white noise inputs, like ARMA systems, describe the class of all processes with rational spectra. In this sense, ARMA and state-space systems are equivalent. Under suitable conditions, these two representations lead directly to the Wold representation. In the last section of this chapter, we will discuss the Kalman filter, which is an extremely important algorithm, particularly for the approximation of the unobserved state as well as for prediction and filtering.

Chapter 8 deals with linear models with exogenous variables, in particular the so-called ARX and ARMAX models and the corresponding state-space models. In addition, we discuss in this context the identifiability of models with additional prior information on the parameters.

Chapter 9 presents Granger causality, a concept that formalizes causal relations or dependencies. In the subsequent Chap. 10, dynamic factor models are introduced. Such models are of particular importance for high-dimensional time series, since they avoid the so-called "curse of dimensionality". The basic idea is to decompose the observed variables into a part generated by a "small" number of hidden factors and some "residual" noise.
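The contrast drawn above, that the second moments determine the AR parameters through a linear system of equations while the ARMA case is non-linear, can be made concrete by the Yule-Walker equations. A minimal univariate sketch in Python (numpy; our illustration, not code from the book):

```python
import numpy as np

def yule_walker(x, p):
    """Estimate a_1, ..., a_p in x_t = a_1 x_{t-1} + ... + a_p x_{t-p} + e_t
    by solving the linear Yule-Walker system Gamma a = gamma."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    T = len(x)
    # sample autocovariances gamma(0), ..., gamma(p)
    gamma = np.array([x[: T - k] @ x[k:] / T for k in range(p + 1)])
    # Toeplitz matrix Gamma with entries gamma(|i - j|)
    Gamma = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(Gamma, gamma[1:])

# simulate a stable AR(2) process and recover its coefficients
rng = np.random.default_rng(1)
a_true = np.array([0.5, -0.3])
x = np.zeros(5000)
for t in range(2, len(x)):
    x[t] = a_true[0] * x[t - 1] + a_true[1] * x[t - 2] + rng.normal()

print(yule_walker(x, 2))  # close to [0.5, -0.3]
```

Solving this small Toeplitz system is essentially all that AR estimation requires; for ARMA parameters no such linear relation to the autocovariances exists, which is one reason their estimation is so much harder.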
The last chapter deals with (multivariate) GARCH models, which describe the conditional variance and thus are often used to describe risk in finance. This is the only chapter where non-linear time series models are treated.

As mentioned above, important areas are not addressed in this book. As far as linear model classes are concerned, the book does not deal, for example, with non-stationary processes (with the exception of the integrated case), such as locally stationary processes, or with models for functional time series. Nor do we consider models for continuous-time observations, which are important for high-frequency financial time series or for applications in systems and control engineering, where the physical models are often in continuous time. We do not treat the large class of non-linear models, with the exception of GARCH models.

This book is restricted to the discussion of model classes and the relation between internal model parameters and the population second moments of the observations (structure theory), which is of great importance for time series analysis. However, it does not cover estimation and inference in the narrow sense. In particular, we will not discuss the estimation of expected values, covariance functions, spectral densities, or AR, ARMA or state-space systems. We also do not deal with the large field of Bayesian time series analysis.
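As a small illustration of the recurring notion of a rational spectral density (Chaps. 6 and 7), consider the simplest case: a stationary AR(1) process x_t = a x_{t-1} + ε_t with |a| < 1 and white noise variance σ². Its spectral density is the standard textbook formula (stated here for illustration, not an excerpt from the book):

```latex
f(\lambda)
  = \frac{\sigma^2}{2\pi}\,\frac{1}{\left|1 - a\,e^{-i\lambda}\right|^{2}}
  = \frac{\sigma^2}{2\pi}\,\frac{1}{1 - 2a\cos\lambda + a^{2}},
\qquad \lambda \in [-\pi, \pi],
```

a rational function of e^{-iλ}. ARMA and state-space processes with white noise inputs generalize exactly this form: their spectral densities are (matrix-valued) rational functions of e^{-iλ}, which is the sense in which the two model classes are equivalent.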
