
Filtering and Control of Random Processes: Proceedings of the E.N.S.T.-C.N.E.T. Colloquium Paris, France, February 23–24, 1983 PDF

330 Pages·1984·3.4 MB·English

Preview: Filtering and Control of Random Processes: Proceedings of the E.N.S.T.-C.N.E.T. Colloquium, Paris, France, February 23–24, 1983

Lecture Notes in Control and Information Sciences, 61
Edited by A.V. Balakrishnan and M. Thoma

Filtering and Control of Random Processes
Proceedings of the E.N.S.T.-C.N.E.T. Colloquium, Paris, France, February 23-24, 1983
Edited by H. Korezlioglu, G. Mazziotto and J. Szpirglas
Springer-Verlag Berlin Heidelberg New York Tokyo 1984

Series Editors: A.V. Balakrishnan, M. Thoma
Advisory Board: L.D. Davisson, A.G.J. MacFarlane, H. Kwakernaak, J.L. Massey, Ya.Z. Tsypkin, A.J. Viterbi

Editors:
Hayri Korezlioglu, E.N.S.T., 46, Rue Barrault, 75634 Paris Cedex 13, France
Gérald Mazziotto and Jacques Szpirglas, C.N.E.T.-PAA/TIM/MTI, 38-40, Rue du Général Leclerc, 92131 Issy les Moulineaux, France

Library of Congress Cataloging in Publication Data:
E.N.S.T.-C.N.E.T. Colloquium (1983 : Paris, France). Filtering and control of random processes. (Lecture notes in control and information sciences; 61) 1. Control theory--Congresses. 2. Stochastic processes--Congresses. 3. Filters (Mathematics)--Congresses. I. Korezlioglu, H. (Hayri). II. Mazziotto, G. (Gérald). III. Szpirglas, J. (Jacques). IV. École nationale supérieure des télécommunications (France). V. Centre national d'études des télécommunications (France). VI. Title. VII. Series. QA402.3.E15 1983 519.2 84-1420

AMS Subject Classifications (1980): 60G35, 60G40, 93E11, 93E20
ISBN 3-540-13270-8 Springer-Verlag Berlin Heidelberg New York Tokyo
ISBN 0-387-13270-8 Springer-Verlag New York Heidelberg Berlin Tokyo

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically those of translation, reprinting, re-use of illustrations, broadcasting, reproduction by photocopying machine or similar means, and storage in data banks. Under § 54 of the German Copyright Law, where copies are made for other than private use, a fee is payable to "Verwertungsgesellschaft Wort", Munich.
© Springer-Verlag Berlin, Heidelberg 1984
Printed in Germany
Offsetprinting: Mercedes-Druck, Berlin. Binding: Lüderitz und Bauer, Berlin.

FOREWORD

The present volume comprises the papers presented at the ENST-CNET Colloquium on "Filtering and Control of Random Processes", held in Paris on 23-24 February 1983 and sponsored by the Centre National d'Études des Télécommunications (CNET) and the École Nationale Supérieure des Télécommunications (ENST). The papers cover the following areas: diffusion processes in bounded regions and their control; approximations for diffusion processes and for their filtering and control; study of the unnormalized filtering equation; control of partially observed diffusions; stochastic games; optimal stopping; and various topics related to the subject. Many of the papers overlap several of these areas. Since a classification by subject would seem artificial, we have chosen to present them in the alphabetical order of the authors' names.

We would like to express our gratitude to the CNET and the ENST. Particular thanks go to J. LE MEZEC, M. URIEN, B. AYRAULT and C. GUEGUEN for their encouragement and material support.

H. KOREZLIOGLU, G. MAZZIOTTO, J. SZPIRGLAS

TABLE OF CONTENTS

J. AGUILAR-MARTIN: Projective Markov processes ..... 1
A. BENSOUSSAN: On the stochastic maximum principle for infinite dimensional equations and applications to the control of Zakai equation ..... 13
R.K. BOEL: Some comments on control and estimation problems for diffusions in bounded regions ..... 24
N. CHRISTOPEIT and K. HELMES: The separation principle for partially observed linear control systems: a general framework ..... 36
G.B. DI MASI and W.J. RUNGGALDIER: Approximations for discrete time partially observable stochastic control problems ..... 61
M. HAZEWINKEL, S.I. MARCUS and H.J. SUSSMANN: Nonexistence of finite dimensional filters for conditional statistics of the cubic sensor problem ..... 76
D.P. KENNEDY: An extension of the prophet inequality ..... 104
H. KOREZLIOGLU and C. MARTIAS: Martingale representation and nonlinear filtering equation for distribution-valued processes ..... 111
J.P. LEPELTIER and M.A. MAINGUENEAU: Jeu de Dynkin avec coût dépendant d'une stratégie continue ..... 138
P.L. LIONS: Optimal control of reflected diffusion processes ..... 157
E. MAYER-WOLF and M. ZAKAI: On a formula relating the Shannon information to the Fisher information for the filtering problem ..... 164
G. MAZZIOTTO: Optimal stopping of bi-Markov processes ..... 172
E. PARDOUX: Équations du lissage non linéaire ..... 206
J. PICARD: Approximation of nonlinear filtering problems and order of convergence ..... 219
G. PICCI and J.H. VAN SCHUPPEN: On the weak finite stochastic realization problem ..... 237
M. PONTIER and J. SZPIRGLAS: Contrôle linéaire sous contrainte avec observation partielle ..... 243
C. STRICKER: Quelques remarques sur les semimartingales gaussiennes et le problème de l'innovation ..... 260
J. SZPIRGLAS: Sur les propriétés markoviennes du processus de filtrage ..... 277
D. TALAY: Efficient numerical schemes for the approximation of expectations of functionals of the solution of an S.D.E., and applications ..... 294
A.S. USTUNEL: Distributions-valued semimartingales and applications to control and filtering ..... 314

PROJECTIVE MARKOV PROCESSES

J. AGUILAR-MARTIN
Laboratoire d'Automatique et d'Analyse des Systèmes du C.N.R.S.
7, avenue du Colonel Roche, 31400 TOULOUSE, France

0. GENERAL COMMENTS

We give here the fundamentals of what could be called "optimal polynomial regression", that is, the orthogonal projection of a given random variable onto the space of polynomial combinations of a group of possibly observable random variables. The vector case is studied from the outset, so we use contracted tensor notation (Einstein's convention).

The optimal polynomial regression estimator, or polynomial estimator for short, is a direct extension of the well-known linear least squares estimator. Similarly to Doob [1953], we define a Markov property based on the independence of the past conditionally on the present, giving rise to wide-sense N-polynomial projective Markov processes, or Projective Markov Processes in the N-polynomial sense (PMPN). The special case N = 2, the Projective Markov Process in the Quadratic sense (PMPQ), receives special attention. It gives a useful dynamical model for diffusion processes frequently encountered when flows interact, as in thermal, biological or ecological processes.
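As a computational illustration of this idea (a sketch of my own, not part of the paper), the quadratic polynomial regression of a scalar Y on the components of an observation vector can be approximated from samples by ordinary least squares on the monomials of degree one and two. All names and the toy data below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (illustrative): Y depends nonlinearly on a 2-dimensional observation X.
n_samples, m = 5000, 2
X = rng.normal(size=(n_samples, m))
Y = X[:, 0] * X[:, 1] + 0.5 * X[:, 0] ** 2 + rng.normal(scale=0.1, size=n_samples)

def quadratic_features(X):
    """Monomials of degree 1 and 2 of the components of X (no constant term,
    matching the expansions used later in the paper)."""
    i, j = np.triu_indices(X.shape[1])          # index pairs (i, j) with i <= j
    return np.hstack([X, X[:, i] * X[:, j]])    # [X_i] and [X_i * X_j]

Z = quadratic_features(X)
# Empirical orthogonal projection of Y onto the span of the monomials,
# i.e. a sample version of the quadratic (N = 2) polynomial estimator.
coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
Y_hat = Z @ coef

print("RMS error, quadratic estimator:", np.sqrt(np.mean((Y - Y_hat) ** 2)))
lin_coef, *_ = np.linalg.lstsq(X, Y, rcond=None)   # linear least squares, for comparison
print("RMS error, linear estimator:   ", np.sqrt(np.mean((Y - X @ lin_coef) ** 2)))
```

With sample averages in place of expectations, the least squares solution is exactly the empirical counterpart of the orthogonal projection described above; on nonlinear data it improves on the purely linear estimator.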
I. POLYNOMIAL ESTIMATION

I.1 Basic definitions and theorems

Let $\mathcal{E}$ be the space of square integrable random variables defined on $(\Omega, \mathcal{A}, P)$, and let $\mathcal{E}^n$ be the space of $n$-dimensional random vectors whose components are in $\mathcal{E}$. We distinguish between a collection of possibly observable random variables $\{X_i\}_{i=1}^{m}$, $X_i \in \mathcal{E}^{n_i}$, and the random variable to be estimated, $Y \in \mathcal{E}^{n}$. We first recall two well-known fundamental results on probabilistic estimation.

THEOREM 1 (Optimality of conditional expectation). Consider the measurable functions $F(X)$ such that $F \in L^2[(\Omega, \mathcal{B}_X, P), \mathbb{R}^n]$, where $\mathcal{B}_X$ is the $\sigma$-algebra generated by $\{X_i\}$. For any $\Lambda \in \mathbb{R}^n$,
$$E\big[(\Lambda^T (Y - F))^2\big] \;\geq\; E\big[(\Lambda^T (Y - \hat{Y}))^2\big],$$
where $\hat{Y} = E[\,Y \mid \mathcal{B}_X\,]$.

COROLLARY 1 (Stochastic orthogonality of the estimation error). Under the same conditions as in the previous theorem,
$$E\big[(Y - \hat{Y})^T V\big] = 0$$
for every $V \in L^2[(\Omega, \mathcal{B}_X, P), \mathbb{R}^n]$. Therefore $\hat{Y}$ is the orthogonal projection of $Y$ onto $L^2[(\Omega, \mathcal{B}_X, P), \mathbb{R}^n]$.

Proofs of Theorem 1 and its corollary can be found in all elementary books on probability.

DEFINITION 1. Let $H_N$ be the Hilbert space of all $\mathbb{R}^n$-valued polynomial functions of degree up to $N$ of the components of $\{X_i\}_{i=1}^{m}$ (it is supposed that these polynomial functions belong to $\mathcal{E}^n$).

LEMMA 1. Denote by $H$ the closure of $\bigcup_{N \geq 0} H_N$. Suppose that there is a positive number $a$ such that, for every $\Lambda$ of norm not greater than $a$,
$$\int e^{|\Lambda^T X|}\, dP < \infty. \qquad (G_a)$$
Then $H = L^2[(\Omega, \mathcal{B}_X, P), \mathbb{R}^n]$.

Proof: Condition $(G_a)$ implies that the characteristic function $\varphi(\Lambda) = \int e^{i \Lambda^T X}\, dP$ is such that, for each $\Lambda$, the function $\varphi(t\Lambda)$ is analytic in $t$ in a certain neighborhood of zero. According to a lemma of GIHMAN and SKOROHOD [197-], the stated equality of spaces holds. $\square$

A fundamental limit theorem concerning the polynomial estimators is stated later. A random vector $Z^N \in H_N$ can be written as a linear combination of tensor products, so that its $i$-th component is
$$z^{N,i} \;=\; \sum_{k=1}^{N} \sum_{r_1,\dots,r_k=1}^{m} \alpha^{\,i}_{\,j_1 \dots j_k}(r_1,\dots,r_k)\; X_{r_1}^{j_1} \cdots X_{r_k}^{j_k},$$
with summation over the repeated component indices $j_1, \dots, j_k$ understood. Orthogonality, strictly written, involves the scalar product; it is convenient to widen this concept to the tensor product in order to develop simpler equations.

I.2 Orthogonality and projections

THEOREM 2. Let $Z^N \in H_N$ and $W \in \mathcal{E}^n$. The following propositions are equivalent:
A) $W$ is orthogonal to $H_N$, that is, for all $Z^N \in H_N$, $E[W^T Z^N] = 0$ (componentwise, $E[w^i z^i] = 0$ summed over $i$);
B) $W$ is such that $E[W \otimes Z^N] = 0$ (componentwise, $E[w^i z^j] = 0$).

Proof: B is equivalent to $E[\,w^i\, X_{r_1}^{j_1} \cdots X_{r_k}^{j_k}\,] = 0$ for all $r_1, \dots, r_k = 1, \dots, m$ and $k = 1, \dots, N$. Let $(e_i)$ be the canonical basis of $\mathbb{R}^n$; then, for $i = 1, \dots, n$, $j_p = 1, \dots, n_p$, $p = 1, \dots, m$, the family $(X_{r_1}^{j_1} \cdots X_{r_k}^{j_k})\, e_i$ is a basis of $H_N$. Therefore $W$ is orthogonal to $H_N$. $\square$

This theorem leads to a constructive theorem for the optimal polynomial estimator, analogous to the Wiener-Hopf equation in linear estimation.

THEOREM 3 (Generalized Wiener-Hopf equation). The optimal N-polynomial estimate of $Y$ based on the observation of $\{X_i\}$ is $\hat{Y}^N$, the orthogonal projection of $Y$ onto $H_N$: for all $Z^N \in H_N$,
$$E\big[(Y - \hat{Y}^N) \otimes Z^N\big] = 0. \qquad (*)$$
Therefore, if it exists, it is a solution of the following system of equations:
$$E\big[\,Y \otimes X_{r_1} \otimes \cdots \otimes X_{r_k}\,\big] \;=\; E\big[\,\hat{Y}^N \otimes X_{r_1} \otimes \cdots \otimes X_{r_k}\,\big], \qquad (**)$$
for $r_1, \dots, r_k = 1, \dots, m$ and $k = 1, \dots, N$.

Proof: A well-known result, found in all elementary books on geometry, states that the orthogonal projection minimizes all quadratic errors, similarly to Theorem 1. Equation (**) follows from Theorem 2. Writing it for the tensor components, with
$$\hat{y}^{N,i} \;=\; \sum_{k=1}^{N} \sum_{r_1,\dots,r_k=1}^{m} \gamma^{\,i}_{\,j_1 \dots j_k}(r_1,\dots,r_k)\; X_{r_1}^{j_1} \cdots X_{r_k}^{j_k},$$
one obtains
$$E\big[\,Y^i\, X_{s_1}^{l_1} \cdots X_{s_q}^{l_q}\,\big] \;=\; \sum_{k=1}^{N} \sum_{r_1,\dots,r_k=1}^{m} \gamma^{\,i}_{\,j_1 \dots j_k}(r_1,\dots,r_k)\; E\big[\,X_{r_1}^{j_1} \cdots X_{r_k}^{j_k}\, X_{s_1}^{l_1} \cdots X_{s_q}^{l_q}\,\big], \qquad (***)$$
for $s_1, \dots, s_q = 1, \dots, m$ and $q = 1, \dots, N$. Equation (***) is a square system of equations for $\gamma$, and therefore its solution depends only on the moments up to order $N+1$ between $Y$ and $\{X_i\}$ and up to order $2N$ among the $\{X_i\}$. We shall use the notation
$$\hat{Y}^N = N\text{-proj}\,[\,Y \mid X_1, \dots, X_m\,].$$
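To make the system (***) concrete, here is a worked scalar instance (an illustration of mine, not taken from the paper): take a single scalar observation $X$ ($m = 1$, $n_1 = 1$) and $N = 2$, so that $\hat{Y}^2 = \lambda_1 X + \lambda_2 X^2$ with scalar coefficients. The orthogonality conditions $E[(Y - \hat{Y}^2)\,X^k] = 0$ for $k = 1, 2$ reduce (***) to the $2 \times 2$ moment system
$$\begin{pmatrix} E[X^2] & E[X^3] \\ E[X^3] & E[X^4] \end{pmatrix} \begin{pmatrix} \lambda_1 \\ \lambda_2 \end{pmatrix} = \begin{pmatrix} E[YX] \\ E[YX^2] \end{pmatrix},$$
which involves moments of $X$ up to order $2N = 4$ and cross-moments with $Y$ up to order $N + 1 = 3$, and which collapses to the usual Wiener-Hopf (linear least squares) equation $E[X^2]\,\lambda_1 = E[YX]$ when $N = 1$.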
I.3 Limit property

THEOREM 4. Consider the sequence of spaces $\{H_N\}$ as well as the sequence of projections $\{\hat{Y}^N\}$, and assume that condition $(G_a)$ holds. Then
$$\lim_{N \to \infty} \| \hat{Y} - \hat{Y}^N \|_2 = 0.$$

Proof: Lemma 1 states that we may define the estimation error as
$$\delta = \| Y - \hat{Y} \|_2 = \inf_{Z \in H} \| Y - Z \|_2.$$
Then $\delta = \lim_{N \to \infty} \| Y - \hat{Y}^N \|_2$, since the infima over the increasing spaces $H_N$ decrease to $\delta$, and as $(Y - \hat{Y})$ is orthogonal to $(\hat{Y} - \hat{Y}^N)$ by construction of $\hat{Y}^N$,
$$\| Y - \hat{Y}^N \|_2^2 \;=\; \| Y - \hat{Y} \|_2^2 + \| \hat{Y} - \hat{Y}^N \|_2^2,$$
and the result follows. $\square$

Remarks:
1) The linear projection is $\hat{Y}^1$ and gives the well-known linear least squares estimator.
2) Wide-sense Markov processes in Doob [1953] derive from a Markov property that will be written here as
$$1\text{-proj}\,[\,X_{t+1} \mid X_t, X_{t-1}, \dots, X_0\,] \;=\; 1\text{-proj}\,[\,X_{t+1} \mid X_t\,].$$
This property involves second-order conditions on the sequence $\{X_i\}$.

II. PROJECTIVE MARKOV PROCESSES IN THE N-POLYNOMIAL SENSE (PMPN)

II.1 General definition

DEFINITION 2. Let $X(t) \in \mathcal{E}^n$ be a discrete time random process for $t = 1, 2, \dots, T$, such that all moments up to order $2N$ are finite. $\{X(t)\}$ is a PMPN if and only if, for all $s \le t$,
$$N\text{-proj}\,[\,\underbrace{X(t) \otimes \cdots \otimes X(t)}_{m \text{ factors}} \mid \{X(r)\}_{r \le s}\,] \;=\; N\text{-proj}\,[\,\underbrace{X(t) \otimes \cdots \otimes X(t)}_{m \text{ factors}} \mid X(s)\,]$$
for $m = 1, 2, \dots, N$.

Notice that the projective Markov condition involves not only the process itself ($m = 1$) but all its polynomial combinations up to degree $N$. The "state" of a PMPN is not $X(t)$ but the augmented vector
$$S^T(t) = \big[\,X(t), \;\dots,\; \underbrace{X(t) \otimes \cdots \otimes X(t)}_{N \text{ factors}}\,\big],$$
and any recursive model, or internal representation of it, shall be a recursive vector equation on $S(t)$.

II.2 Projective Markov processes in the quadratic sense (PMPQ)

From now on we deal with $N = 2$, that is, with PMPQ's, and we use tensor component notation.

DEFINITION 3. An $n$-vector valued random process $\{X(t)\}_{t \ge 0}$ is a PMPQ if and only if, for all $\ell, c = 1, \dots, n$ and $s \le t$,
$$2\text{-proj}\,[\,x^{\ell}(t) \mid \{x^{j}(r)\}_{r \le s,\; j=1,\dots,n}\,] \;=\; 2\text{-proj}\,[\,x^{\ell}(t) \mid \{x^{j}(s)\}_{j=1,\dots,n}\,]$$
and
$$2\text{-proj}\,[\,x^{\ell}(t)\, x^{c}(t) \mid \{x^{j}(r)\}_{r \le s,\; j=1,\dots,n}\,] \;=\; 2\text{-proj}\,[\,x^{\ell}(t)\, x^{c}(t) \mid \{x^{j}(s)\}_{j=1,\dots,n}\,].$$
Let us denote the moments up to the 4th order as follows.
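As a computational illustration of the augmented state $S(t)$ introduced in Section II.1 (a sketch of my own with arbitrary toy dynamics, not code or a procedure from the paper), one can stack $X(t)$ with its tensor square $X(t) \otimes X(t)$ and fit a one-step linear recursion on the augmented vector by least squares, which is the kind of recursive vector equation on $S(t)$ referred to above.

```python
import numpy as np

rng = np.random.default_rng(1)

def augmented_state(x):
    """S(t) = [X(t), X(t) (x) X(t)]: the process stacked with its tensor square."""
    return np.concatenate([x, np.outer(x, x).ravel()])

# Illustrative 2-dimensional process with a mild quadratic interaction (toy dynamics).
T, n, n_paths = 50, 2, 200
paths = np.zeros((n_paths, T, n))
for p in range(n_paths):
    x = rng.normal(size=n)
    for t in range(T):
        paths[p, t] = x
        x = 0.8 * x + 0.1 * np.array([x[0] * x[1], x[0] ** 2]) + 0.1 * rng.normal(size=n)

# Fit a one-step linear recursion S(t+1) ~ A S(t) by least squares over all
# observed transitions: a recursive vector equation on the augmented state.
S_now = np.array([augmented_state(x) for path in paths for x in path[:-1]])
S_next = np.array([augmented_state(x) for path in paths for x in path[1:]])
W, *_ = np.linalg.lstsq(S_now, S_next, rcond=None)
A = W.T                                        # S(t+1) is approximated by A @ S(t)
print("fitted recursion matrix A:", A.shape)   # (n + n*n, n + n*n) = (6, 6)
```

The point of the sketch is only to show the bookkeeping: the recursion acts on the $n + n^2$ components of $S(t)$, not on $X(t)$ alone.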
