
Best Approximation in Inner Product Spaces PDF

343 Pages·2001·12.51 MB·English

Preview Best Approximation in Inner Product Spaces

Canadian Mathematical Society / Société mathématique du Canada
Editors-in-Chief / Rédacteurs-en-chef: Jonathan Borwein, Peter Borwein

Springer: New York Berlin Heidelberg Barcelona Hong Kong London Milan Paris Singapore Tokyo

CMS Books in Mathematics / Ouvrages de mathématiques de la SMC
1 HERMAN/KUČERA/ŠIMŠA Equations and Inequalities
2 ARNOLD Abelian Groups and Representations of Finite Partially Ordered Sets
3 BORWEIN/LEWIS Convex Analysis and Nonlinear Optimization
4 LEVIN/LUBINSKY Orthogonal Polynomials for Exponential Weights
5 KANE Reflection Groups and Invariant Theory
6 PHILLIPS Two Millennia of Mathematics
7 DEUTSCH Best Approximation in Inner Product Spaces

Frank Deutsch
Best Approximation in Inner Product Spaces
Springer

Frank Deutsch
Department of Mathematics
Pennsylvania State University
417 McAllister Bldg.
University Park, PA 16802-6401
USA
deutsch@math.psu.edu

Editors-in-Chief / Rédacteurs-en-chef
Jonathan Borwein
Peter Borwein
Centre for Experimental and Constructive Mathematics
Department of Mathematics and Statistics
Simon Fraser University
Burnaby, British Columbia V5A 1S6
Canada

Mathematics Subject Classification (2000): 41A50, 41A65

Library of Congress Cataloging-in-Publication Data
Deutsch, F. (Frank), 1936-
Best approximation in inner product spaces / Frank Deutsch.
p. cm. - (CMS books in mathematics ; 7)
Includes bibliographical references and index.
1. Inner product spaces. 2. Approximation theory. I. Title. II. Series.
QA322.4 .D48 2001
515'.733-dc21    00-047092

Printed on acid-free paper.
ISBN 978-1-4419-2890-0    ISBN 978-1-4684-9298-9 (eBook)
DOI 10.1007/978-1-4684-9298-9

© 2001 Springer-Verlag New York, Inc.
Softcover reprint of the hardcover 1st edition 2001

All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer-Verlag New York, Inc., 175 Fifth Avenue, New York, NY 10010, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.
The use of general descriptive names, trade names, trademarks, etc., in this publication, even if the former are not especially identified, is not to be taken as a sign that such names, as understood by the Trade Marks and Merchandise Marks Act, may accordingly be used freely by anyone.

Production managed by Michael Koy; manufacturing supervised by Jeffrey Taub.
Photocomposed copy prepared using the author's AMS-TeX files.

9 8 7 6 5 4 3 2 1

SPIN 10784583

Springer-Verlag New York Berlin Heidelberg
A member of BertelsmannSpringer Science+Business Media GmbH

To Mary: a wonderful wife, mother, and mema

An approximate answer to the right problem is worth a good deal more than an exact answer to an approximate problem.
-John Tukey

What is written without effort is in general read without pleasure.
-Samuel Johnson

PREFACE

This book evolved from notes originally developed for a graduate course, "Best Approximation in Normed Linear Spaces," that I began giving at Penn State University more than 25 years ago. It soon became evident that many of the students who wanted to take the course (including engineers, computer scientists, and statisticians, as well as mathematicians) did not have the necessary prerequisites such as a working knowledge of Lp-spaces and some basic functional analysis. (Today such material is typically contained in the first-year graduate course in analysis.) To accommodate these students, I usually ended up spending nearly half the course on these prerequisites, and the last half was devoted to the "best approximation" part. I did this a few times and determined that it was not satisfactory: Too much time was being spent on the presumed prerequisites. To be able to devote most of the course to "best approximation," I decided to concentrate on the simplest of the normed linear spaces, the inner product spaces, since the theory in inner product spaces can be taught from first principles in much less time, and also since one can give a convincing argument that inner product spaces are the most important of all the normed linear spaces anyway. The success of this approach turned out to be even better than I had originally anticipated: One can develop a fairly complete theory of best approximation in inner product spaces from first principles, and such was my purpose in writing this book.

Because of the rich geometry that is inherent in inner product spaces, most of the fundamental results have simple geometric interpretations. That is, one can "draw pictures," and this makes the theorems easier to understand and recall. Several of these pictures are scattered throughout the book. This geometry also suggests the important role played by "duality" in the theory of best approximation. For example, in the Euclidean plane, it is very easy to convince yourself (draw a picture!) that the distance from a point to a convex set is the maximum of the distances from the point to all lines that separate the point from the convex set.
This suggests the conjecture (by extrapolating to any inner product space) that the distance from a point to a convex set is the maximum of the distances from the point to "hyperplanes" that separate the point from the set. In fact, this conjecture is true (see Theorem 6.25)! Moreover, when this result is formulated analytically, it states that a certain minimization problem in an inner product space has an equivalent formulation as a maximization problem in the dual space. This is a classic example of the role played by duality in approximation theory.

Briefly, this is a book about the "theory and application of best approximation in inner product spaces" and, in particular, in "Hilbert space" (i.e., a complete inner product space). In this book geometric considerations play a prominent role in developing and understanding the theory. The only prerequisites for reading it are some "advanced calculus" and a little "linear algebra," where, for the latter subject, a first course is more than sufficient. Every author knows that it is impossible to write a book that proceeds at the correct pace for every reader. It will invariably be too slow for some and too fast for others. In writing this book I have tried to err on the side of including too much detail, rather than too little, especially in the early chapters, so that the book might prove valuable for self-study as well. It is my hope that the book proves to be useful to mathematicians, statisticians, engineers, computer scientists, and to any others who need to use best approximation principles in their work.

What do we mean by "best approximation in inner product spaces"? To explain this, let X be an inner product space (the simplest model is Euclidean n-space) and let K be a nonempty subset of X. An element x0 in K is called a best approximation to x from K if x0 is closest to x from among all the elements of K. That is, ||x - x0|| = inf{||x - y|| : y ∈ K}. The theory of best approximation is mainly concerned with the following fundamental questions: (1) Existence of best approximations: When is it true that each x in X has a best approximation in K? (2) Uniqueness of best approximations: When is it true that each x in X has a unique best approximation in K? (3) Characterization of best approximations: How does one recognize which elements in K are best approximations to x? (4) Continuity of best approximations: How do best approximations in K to x vary as a function of x? (5) Computation of best approximations: What algorithms are available for the actual computation of best approximations? (6) Error of approximation: Can one compute the distance from x to K, i.e., inf{||x - y|| : y ∈ K}, or at least get good upper and/or lower bounds on this distance?

These are just a few of the more basic questions that one can ask concerning best approximation. In this book we have attempted to answer these questions, among others, in a systematic way. Typically, one or more general theorems valid in any inner product space are first proved, and this is then followed by deducing specific applications or examples of these theorems. The theory is the richest and most complete when the set K is a closed and convex set, and this is the situation that the bulk of the book is concerned with. It is well known, for example, that if K is a (nonempty) closed convex subset of a Hilbert space X, then each point x in X has a unique best approximation P_K(x) in K.
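To make the definition above concrete, here is a minimal numerical sketch (my illustration, not material from the book) of best approximation in Euclidean n-space. The two sets used, the closed unit ball and a line through the origin, and the helper-function names are assumptions chosen for the example; both are closed convex sets, so the nearest point exists and is unique.

```python
# Illustrative sketch of best approximation (the metric projection) in Euclidean n-space.
import numpy as np

def project_onto_unit_ball(x):
    """Best approximation to x from K = {y : ||y|| <= 1}: scale x back to the ball."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def project_onto_subspace(x, basis):
    """Best approximation to x from the span of the columns of `basis`;
    for a linear subspace the metric projection is the orthogonal projection."""
    Q, _ = np.linalg.qr(basis)          # orthonormal basis of the subspace
    return Q @ (Q.T @ x)

x = np.array([3.0, 4.0])

p = project_onto_unit_ball(x)
print(p, np.linalg.norm(x - p))         # nearest point (0.6, 0.8); distance ||x|| - 1 = 4

line = np.array([[1.0], [1.0]])         # the line y = x in the plane
q = project_onto_subspace(x, line)
print(q, np.linalg.norm(x - q))         # nearest point (3.5, 3.5); distance 1/sqrt(2)
```

In both runs the printed distance equals inf{||x - y|| : y ∈ K}, which is exactly the quantity that questions (1)-(6) above concern in general.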
Sets that always have unique best approximations to each point in the space are called Chebyshev sets. Perhaps the major unsolved problem in (abstract) best approximation theory today is whether or not the converse is true. That is, must every Chebyshev set in a Hilbert space be convex? If X is finite-dimensional, the answer is yes. Much of what is known about this question is assembled in Chapter 12.

The book is organized as follows. In Chapter 1, the motivation for studying best approximation in inner product spaces is provided by listing five basic problems. These problems all appear quite different on the surface. But after defining inner product spaces and Hilbert spaces, and giving several examples of such spaces, we observe that each of these five problems is a special case of the problem of best approximation from a certain convex subset of a particular inner product space. The general problem of best approximation is discussed in Chapter 2. Existence and uniqueness theorems for best approximations are given in Chapter 3. In Chapter 4 a characterization of best approximations from convex sets is given, along with several improvements and refinements when the convex set K has a special form (e.g., a convex cone or a linear subspace).

When K is a Chebyshev set, the mapping x ↦ P_K(x) that associates to each x in X its unique best approximation in K is called the metric projection onto K. In Chapter 5 a thorough study is made of the metric projection. In particular, P_K is always nonexpansive and, if K is a linear subspace, P_K is just the so-called orthogonal projection onto K. In Chapter 6 the bounded linear functionals on an inner product space are studied. Representation theorems for such functionals are given, and these are applied to deduce detailed results in approximation by hyperplanes or half-spaces. In Chapter 7 a general duality theorem for the error of approximation is given. A new elementary proof of the Weierstrass approximation theorem is established, and explicit formulas are obtained for the distance from any monomial to a polynomial subspace. These are then used to establish an elegant approximation theorem of Müntz.

In Chapter 8 we seek solutions to the operator equation Ax = b, where A is a bounded linear operator from the inner product space X to the inner product space Y, and b ∈ Y. If X and Y are Euclidean n-space and Euclidean m-space respectively, then this operator equation reduces to m linear equations in n unknowns. To include the possible situation when no solution x exists, we reformulate the problem as follows: Minimize ||Ax - b|| over all x in X. This study gives rise to the notions of "generalized solutions" of equations, "generalized inverses" of operators, etc.

The theory of Dykstra's cyclic projections algorithm is developed in Chapter 9. This algorithm is an iterative scheme for computing best approximations from the intersection K = ∩ K_i, where each of the sets K_i is a closed convex set, in terms of computing best approximations from the individual sets K_i. When all the K_i are linear subspaces, this algorithm is called von Neumann's method of alternating projections. It is now known that there are at least fifteen different areas of mathematics for which this algorithm has proven useful. They include linear equations and linear inequalities, linear prediction theory, linear regression, computed tomography, and image restoration. The chapter concludes with several representative applications of the algorithm.
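As a rough illustration of the cyclic-projections idea described above (a generic sketch with sets of my own choosing, not the book's Chapter 9 development), the following code runs Dykstra's algorithm for two closed convex sets in the plane. As noted above, when the sets are linear subspaces the scheme reduces to von Neumann's method of alternating projections.

```python
# Sketch of Dykstra's cyclic projections algorithm for two closed convex sets K1, K2.
# The sets here (closed unit ball and nonnegative orthant) are illustrative choices;
# the iteration uses only the individual projections onto K1 and K2.
import numpy as np

def proj_ball(x):                       # projection onto K1 = {y : ||y|| <= 1}
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def proj_orthant(x):                    # projection onto K2 = {y : y >= 0}
    return np.maximum(x, 0.0)

def dykstra(x0, iters=200):
    """Approximate the best approximation to x0 from K1 ∩ K2."""
    x = np.asarray(x0, dtype=float)
    p = np.zeros_like(x)                # correction term attached to K1
    q = np.zeros_like(x)                # correction term attached to K2
    for _ in range(iters):
        y = proj_ball(x + p)
        p = x + p - y
        x = proj_orthant(y + q)
        q = y + q - x
    return x

print(dykstra([-2.0, 3.0]))             # approaches (0, 1), the nearest point of K1 ∩ K2
```

The point of the scheme is the one made in the paragraph above: the best approximation from the intersection is obtained using only the far simpler projections onto the individual sets.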
In Chapter 10 we consider the problem of best approximation from a set of the type K = C ∩ A⁻¹(b), where C is a closed convex set, A is a bounded linear mapping from X to Y, and b ∈ Y. This problem includes as a special case the general shape-preserving interpolation problem, which is another one of the more important applications of best approximation theory. It is shown that one can always replace the problem of determining best approximations to x from K by the (generally easier) problem of determining best approximations to a certain perturbation of x from the set C (or from a specified extremal subset of C).

In Chapter 11 the general problem of (finite) interpolation is considered. Also studied are the problems of simultaneous approximation and interpolation; simultaneous interpolation and norm-preservation; simultaneous approximation, interpolation, and norm-preservation; and the relationship among these properties. The last chapter (Chapter 12) examines the question of whether every Chebyshev set in a Hilbert space must be convex.

Throughout each chapter, examples and applications have been interspersed with the theoretical results. At the end of each chapter there are numerous exercises. They vary in difficulty from the almost trivial to the rather challenging. I have also occasionally given as exercises certain results that are proved in later chapters. My purpose in doing this is to allow the students to discover the proofs of some of these important results for themselves, since for each of these exercises all the necessary machinery is already at hand. Following each set of exercises there is a section called "Historical Notes," in which I have tried to put the results of that chapter into a historical perspective. The absence of a citation for a particular result should not, however, be interpreted as a claim that the result is new or my own. While I believe that some of the material included in this book is new, it was not always possible for me to determine just who proved what or when.

Much of the material of the book has not yet appeared in book form, for example, most of Chapters 9-12. Surprisingly, I have not seen even some of the more basic material on inner product spaces in any of the multitude of books on Hilbert space theory. As an example, the well-known Fréchet-Riesz representation theorem states that every bounded linear functional on a Hilbert space has a "representer" in the space (Theorem 6.10), and this result is included in just about every book on Hilbert space theory. In contrast to this, not every bounded linear functional on an (incomplete) inner product space has such a representation. Nevertheless, one can specify exactly which functionals do have such representations (Theorem 6.12), and this condition is especially useful in approximation by hyperplanes (Theorem 6.17) or half-spaces (Theorem 6.31).

If we had worked only in Hilbert spaces, parts of the book could have been shortened and simplified. The lengthy Chapter 6, for example, could have been considerably abbreviated. Thus we feel an obligation to explain why we spent the extra time, effort, and pages to develop the theory in arbitrary inner product spaces. In many applications of the theory of best approximation the natural space to work in is a space of continuous or piecewise continuous functions defined on an interval, and endowed with the L2-norm (i.e., the norm of a function is the square root of the integral of the square of the function).
Such spaces are inner product spaces that are not Hilbert spaces. Moreover, for a variety of reasons, it is not always satisfactory to work in the Hilbert space completion of the given inner product space. For example, a certain physical problem might demand that a solution, if any exists, be a piecewise continuous function. Thus it seemed important to me to develop the theory of best approximation in any inner product space and not just in a Hilbert space.

In what way does this book differ from the collection of books available today on approximation theory? In most of these books, if any best approximation theory in a Hilbert space setting was included, the sum total rarely amounted to more than one or two chapters. Here we have attempted to produce the first systematic study of best approximation theory in inner product spaces, and without elaborate prerequisites. While the choice of topics that one includes in a book is always a subjective matter, we have tried to include a fairly complete study of the fundamental questions concerning best approximation in inner product spaces.

Each of the first six chapters of the book depends on the material of the chapter preceding it. However (with the exception of Chapters 9 and 10, each of which depends upon a few basic facts concerning adjoint operators given in Chapter 8, specifically, Theorems 8.25-8.33), each of the remaining six chapters (7-12) depends only on the first six. In particular, Chapters 7-12 are essentially independent of one another, and any one of these may be studied after one has digested the first six chapters. The reader already familiar with the basic fundamentals of Hilbert space theory can skim the first six chapters to obtain the relevant notation and terminology, and then delve immediately into any of the remaining chapters. My own experience with teaching courses based on this book has shown that in a one-semester course it is possible to cover essentially all of the first seven chapters and
