Geometric Methods and Applications For Computer Science and Engineering [draft] PDF

708 Pages·2011·3.018 MB·English

Jean Gallier
Geometric Methods and Applications For Computer Science and Engineering, Second Edition
February 28, 2011
Springer

To my wife, Anne, my children, Mia, Philippe, and Sylvie, and my grandchildren, Bahari and Demetrius

Preface

This book is an introduction to fundamental geometric concepts and tools needed for solving problems of a geometric nature with a computer. Our main goal is to present a collection of tools that can be used to solve problems in computer vision, robotics, machine learning, computer graphics, and geometric modeling.

During the ten years following the publication of the first edition of this book, optimization techniques have made a huge comeback, especially in the fields of computer vision and machine learning. In particular, convex optimization and its special incarnation, semidefinite programming (SDP), are now widely used techniques in computer vision and machine learning, as one can verify by looking at the proceedings of any conference in these fields. Therefore, we felt that it would be useful to include some material (especially on convex geometry) to prepare the reader for more comprehensive expositions of convex optimization, such as Boyd and Vandenberghe [2], a masterly and encyclopedic account of the subject. In particular, we added Chapter 7, which covers separating and supporting hyperplanes.

We also realized that the importance of the SVD (Singular Value Decomposition) and of the pseudo-inverse had not been sufficiently stressed in the first edition of this book, and we rectified this situation in the second edition. In particular, we added sections on PCA (Principal Component Analysis) and on best affine approximations, and showed how they are efficiently computed using the SVD. We also added a section on quadratic optimization and a section on the Schur complement showing the usefulness of the pseudo-inverse.

In this second edition, many typos and small mistakes have been corrected, some proofs have been shortened, some problems have been added, and some references have been added. Here is a list containing brief descriptions of the chapters that have been modified or added.
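The claim above that PCA is efficiently computed using the SVD can be made concrete with a short NumPy sketch. This is an editor's illustration, not code from the book; the data matrix `X` is hypothetical. The principal directions of a data set are the right singular vectors of the centered data matrix, and they diagonalize the sample covariance matrix.

```python
import numpy as np

# Editor's illustration (not from the book): PCA via the SVD.
# The principal directions of a data set are the right singular
# vectors of the centered data matrix.
rng = np.random.default_rng(42)
X = rng.standard_normal((100, 3)) @ np.diag([3.0, 1.0, 0.1])  # hypothetical data

Xc = X - X.mean(axis=0)                        # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
principal_directions = Vt                      # rows are the principal directions
variances = S**2 / (len(X) - 1)                # variance along each direction

# The same directions diagonalize the sample covariance matrix.
C = Xc.T @ Xc / (len(X) - 1)
assert np.allclose(Vt @ C @ Vt.T, np.diag(variances), atol=1e-8)
```

Because `np.linalg.svd` returns the singular values in decreasing order, the first row of `Vt` is the direction of greatest variance.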
- Chapter 3 on the basic properties of convex sets has been expanded. In particular, we state a version of Carathéodory's theorem for convex cones (Theorem 3.2) and a version of Radon's theorem for pointed cones (Theorem 3.6), we state Tverberg's theorem (Theorem 3.7), and we define centerpoints and prove their existence (Theorem 3.9).

- Chapter 7 is new. This chapter deals with separating hyperplanes, versions of the Farkas lemma, and supporting hyperplanes. Following Berger [1], various versions of the separation of open or closed convex subsets by hyperplanes are proved as consequences of a geometric version of the Hahn-Banach theorem (Theorem 7.1). We also show how various versions of the Farkas lemma (Lemmas 7.3, 7.4, and 7.5) can be easily deduced from separation results (Corollary 7.4 and Proposition 7.3). The Farkas lemma plays an important role in linear programming. Indeed, it can be used to give a quick proof of so-called strong duality in linear programming. We also prove the existence of supporting hyperplanes for boundary points of closed convex sets (Minkowski's lemma, Proposition 7.4). Unfortunately, lack of space prevents us from discussing polytopes and polyhedra. The reader will find a masterly exposition of these topics in Ziegler [3].

- Chapter 14 is a major revision of Chapter 13 (Applications of Euclidean Geometry to Various Optimization Problems) from the first edition of this book and has been renamed "Applications of SVD and Pseudo-Inverses." Section 14.1, about least squares problems and the pseudo-inverse, has not changed much, but we have added the fact that AA⁺ is the orthogonal projection onto the range of A and that A⁺A is the orthogonal projection onto Ker(A)⊥, the orthogonal complement of Ker(A). We have also added Proposition 14.1, which shows how the pseudo-inverse of a normal matrix, A, can be obtained from a block diagonalization of A (see Theorem 12.7). Sections 14.2, 14.3, and 14.4 are new. In Section 14.2, we define various matrix norms, including operator norms, and we prove Proposition 14.4 showing how a matrix can be best approximated by a rank k matrix (in the ∥·∥₂ norm).

  Section 14.3 is devoted to Principal Component Analysis (PCA). PCA is a very important statistical tool, yet, in our experience, most presentations of this concept lack a crisp definition. Most presentations identify the notion of principal components with the result of applying SVD and do not prove why SVD does yield the principal components and directions. To rectify this situation, we give a precise definition of PCAs (Definition 14.3) and we prove rigorously how SVD yields PCA (Theorem 14.3), using the Rayleigh-Ritz ratio (Lemma 14.2). In Section 14.4, it is shown how to best approximate a set of data with an affine subspace in the least squares sense. Again, SVD can be used to find solutions.

- Chapter 15 is new, except for Section 15.1, which reproduces Section 13.2 from the first edition of this book. We added the definition of the positive semidefinite cone ordering, ⪰, on symmetric matrices, since it is extensively used in convex optimization. In Section 15.2, we find a necessary and sufficient condition (Proposition 15.2) for the quadratic function f(x) = (1/2)x⊤Ax + x⊤b to have a minimum, in terms of the pseudo-inverse of A (where A is a symmetric matrix). We also show how to accommodate linear constraints of the form C⊤x = 0 or affine constraints of the form C⊤x = t (where t ≠ 0). In Section 15.3, we consider the problem of maximizing f(x) = x⊤Ax on the unit sphere, x⊤x = 1, or, more generally, on the ellipsoid x⊤Bx = 1, where A is a symmetric matrix and B is symmetric, positive definite. We show that these problems are completely solved by diagonalizing A with respect to an orthogonal matrix. We also briefly consider the effect of adding linear constraints of the form C⊤x = 0 or affine constraints of the form C⊤x = t (where t ≠ 0).

- Chapter 16 is new. In this chapter, we define the notion of Schur complement and we use it to characterize when a symmetric 2 × 2 block matrix is either positive semidefinite or positive definite (Proposition 16.1, Proposition 16.2, and Theorem 16.1).

- Chapter 17 is also brand new.
  In this chapter, we show how a computer vision problem, contour grouping, can be formulated as a quadratic optimization problem involving a Hermitian matrix. Because of the extra dependency on an angle, this optimization problem leads to finding the derivative of eigenvalues and eigenvectors of a normal matrix, X. We derive explicit formulae for these derivatives (in the case of eigenvectors, the formula involves the pseudo-inverse of X) and we prove their correctness. It appears to be difficult to find these formulae, together with a clean and correct proof, in the literature. Our optimization problem leads naturally to the consideration of the field of values (or numerical range), F(A), of a complex matrix, A. A remarkable property of the field of values is that it is a convex subset of the plane, a theorem due to Toeplitz and Hausdorff, for which we give a short proof using a deformation argument (Theorem 17.1). Properties of the field of values can be exploited to solve our optimization problem. This chapter describes current and exciting research in computer vision.

- Chapter 18 (which used to be Chapter 14 in the first edition) has been slightly expanded and improved. Our experience while teaching the material of this chapter, an introduction to manifolds and Lie groups, is that it is helpful to review carefully the notion of the derivative of a function, f: E → F, where E and F are normed vector spaces. Thus, we added Section 18.7, which provides such a review. We also state the inverse function theorem and define immersions and submersions. Section 18.8 has also been slightly expanded. We added Proposition 18.6 and Theorem 18.7, which are often useful in proving that various spaces are manifolds; we defined critical and regular values; we defined Morse functions; and we made a few cosmetic improvements in the paragraphs following Definition 18.20. A number of new problems on manifolds have been added.

- The only change to Chapter 19 (Chapter 15 in the first edition) is the inclusion of a more complete treatment of the Frenet frame for nD curves in Section 19.10.
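As a concrete illustration of the Chapter 15 result mentioned above (maximizing f(x) = x⊤Ax on the unit sphere is completely solved by diagonalizing A), here is a minimal NumPy sketch. It is an editor's illustration, not code from the book; the random symmetric matrix A is hypothetical. The maximum of f over the unit sphere is the largest eigenvalue of A, attained at a corresponding unit eigenvector.

```python
import numpy as np

# Editor's illustration (not from the book): maximizing f(x) = x^T A x
# subject to x^T x = 1 for a symmetric A. Diagonalizing A solves the
# problem: the maximum is the largest eigenvalue, attained at a
# corresponding unit eigenvector.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                       # a hypothetical symmetric matrix

eigvals, eigvecs = np.linalg.eigh(A)    # eigenvalues in ascending order
x_star = eigvecs[:, -1]                 # unit eigenvector of the top eigenvalue

f = lambda x: x @ A @ x
assert np.isclose(f(x_star), eigvals[-1])

# No other unit vector does better than the top eigenvalue.
for _ in range(1000):
    v = rng.standard_normal(4)
    v /= np.linalg.norm(v)
    assert f(v) <= eigvals[-1] + 1e-9
```

The same diagonalization handles the ellipsoid constraint x⊤Bx = 1 after a change of variables y = B^{1/2}x, which reduces it to the spherical case.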
- Similarly, the only change to Chapter 20 (Chapter 16 in the first edition) is the addition of Section 20.12 on covariant derivatives and the parallel transport.

Besides adding problems to all the chapters listed above, we added one more problem to Chapter 2.

As in the first edition, there is some additional material on the web site

http://www.cis.upenn.edu/~jean/gbooks/geom2.html.

This material has not changed, and the chapter and section numbers are those of the first edition. A graph showing the dependencies of chapters is shown in Figure 0.1.

[Fig. 0.1: Dependency of chapters (graph not reproduced here)]

Acknowledgments

Since the publication of the first edition of this book, I received valuable comments from Kostas Daniilidis, Marcelo Siqueira, Jianbo Shi, Ben Taskar, CJ Taylor, Mickey Brautbar, Katerina Fragiadaki, Ryan Kennedy, Oleg Naroditsky, and Weiyu Zhang.

References

1. Marcel Berger. Géométrie 2. Nathan, 1990. English edition: Geometry 2, Universitext, Springer-Verlag.
2. Stephen Boyd and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, first edition, 2004.
3. Günter Ziegler. Lectures on Polytopes. GTM No. 152. Springer-Verlag, first edition, 1997.

Philadelphia, February 2011
Jean Gallier
