Low-Rank and Sparse Modeling for Visual Analysis PDF

240 pages · 2014 · 7.058 MB · English

Preview Low-Rank and Sparse Modeling for Visual Analysis

Low-Rank and Sparse Modeling for Visual Analysis
Yun Fu, Editor

Editor
Yun Fu
Northeastern University
Boston, MA, USA

ISBN 978-3-319-11999-1
ISBN 978-3-319-12000-3 (eBook)
DOI 10.1007/978-3-319-12000-3
Library of Congress Control Number: 2014951660

Springer Cham Heidelberg New York Dordrecht London
© Springer International Publishing Switzerland 2014

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. Exempted from this legal reservation are brief excerpts in connection with reviews or scholarly analysis or material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work. Duplication of this publication or parts thereof is permitted only under the provisions of the Copyright Law of the Publisher's location, in its current version, and permission for use must always be obtained from Springer. Permissions for use may be obtained through RightsLink at the Copyright Clearance Center. Violations are liable to prosecution under the respective Copyright Law.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.
While the advice and information in this book are believed to be true and accurate at the date of publication, neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or omissions that may be made. The publisher makes no warranty, express or implied, with respect to the material contained herein.

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)

Preface

Visual analysis has become prevalent lately, since large-scale data are generated every day by advanced vision and imaging devices. Low-rank and sparse modeling are emerging mathematical tools for dealing with the uncertainties of real-world visual data. This book provides a unique view of low-rank and sparse computing, especially approximation, recovery, representation, scaling, coding, embedding, and learning among unconstrained visual data. These techniques can significantly advance existing methodologies of image and video analysis and understanding by taking advantage of low-rank and sparse modeling. Visual analysis under uncertainty stands at the core of numerous real-world applications that have broad impact and generate significant academic and industrial value. As a professional reference book and research monograph, this book covers, across several chapters, popular research topics from fields such as pattern recognition, computer vision, big data, social media, image and video classification, and machine learning. These chapters, contributed by the editor and by top experts and practitioners, complement each other from various perspectives and compose a solid overview of low-rank and sparse modeling techniques for visual analysis. Readers from different backgrounds may all benefit from the well-balanced contents, which cover both theoretical analysis and real-world applications.

The book is composed of ten chapters in a coherent manner.
"Nonlinearly Structured Low-Rank Approximation" presents an Adjusted Least-Squares Estimation method for polynomially structured low-rank approximation, which is computationally cheap and statistically consistent. "Latent Low-Rank Representation" presents the formulation of the Latent Low-Rank Representation (LatLRR), which constructs the dictionary using both observed and unobserved (hidden) data and seamlessly integrates subspace clustering and feature extraction into a unified framework. "Scalable Low-Rank Representation" addresses the problem of solving nuclear-norm-regularized optimization problems for Low-Rank Representation in large-scale settings through a transformation that factors the large solution matrix into the product of a small orthonormal matrix (the active subspace) and another small matrix. "Low-Rank and Sparse Dictionary Learning" introduces low-rank and sparse dictionary learning methods, which learn discriminative dictionaries with low-rank and sparse constraints for modeling. "Low-Rank Transfer Learning" discusses transfer learning in a generalized subspace where each target sample can be represented by some combination of source samples under a low-rank constraint. "Sparse Manifold Subspace Learning" presents a linear dimensionality reduction algorithm called Sparse Manifold Subspace Learning, based on sparse eigendecomposition, which considers the locality of samples and their neighbors. "Low Rank Tensor Manifold Learning" presents a supervised low-rank tensor manifold learning model for learning the intrinsic structure and dimensionality of tensors embedded in a high-dimensional Euclidean space. "Low-Rank and Sparse Multi-task Learning" proposes to correlate multiple tasks using a low-rank representation and formulates multi-task learning approaches as mathematical optimization problems that minimize the empirical loss regularized by a low-rank structure and a separate sparse structure. "Low-Rank Outlier Detection" presents a low-rank outlier detection approach, which incorporates a low-rank constraint into the
support vector data description model. "Low-Rank Online Metric Learning" presents an online metric learning model that incorporates a low-rank constraint to address the online image classification and scene recognition problem via adaptive similarity measurement.

This book is aimed at broad groups of audiences, such as professional researchers, graduate students, and university faculty, particularly those with backgrounds in computer science/engineering, statistics, and mathematics. Other potential audiences may be drawn from broad fields of science and engineering, since this topic is interdisciplinary and the topics covered synergize cross-domain knowledge.

I would like to sincerely thank all the contributors to this book for presenting their most recent research advances in an easily accessible manner. I would also like to sincerely thank editors Brett Kurzman, Rebecca R. Hytowitz, and Mary James from Springer for their support of this book project.

Boston, MA
Yun Fu

Contents

Nonlinearly Structured Low-Rank Approximation .............. 1
  Ivan Markovsky and Konstantin Usevich
Latent Low-Rank Representation ............................. 23
  Guangcan Liu and Shuicheng Yan
Scalable Low-Rank Representation ........................... 39
  Guangcan Liu and Shuicheng Yan
Low-Rank and Sparse Dictionary Learning .................... 61
  Sheng Li, Liangyue Li and Yun Fu
Low-Rank Transfer Learning ................................. 87
  Ming Shao, Dmitry Kit and Yun Fu
Sparse Manifold Subspace Learning .......................... 117
  Ming Shao, Mingbo Ma and Yun Fu
Low Rank Tensor Manifold Learning .......................... 133
  Guoqiang Zhong and Mohamed Cheriet
Low-Rank and Sparse Multi-task Learning .................... 151
  Jianhui Chen, Jiayu Zhou and Jieping Ye
Low-Rank Outlier Detection ................................. 181
  Sheng Li, Ming Shao and Yun Fu
Low-Rank Online Metric Learning ............................ 203
  Yang Cong, Ji Liu, Junsong Yuan and Jiebo Luo
Index ...................................................... 235

Nonlinearly Structured Low-Rank Approximation

Ivan Markovsky and Konstantin Usevich

Abstract  Polynomially structured low-rank approximation problems occur in
• algebraic curve fitting, e.g., conic section fitting,
• subspace clustering (generalized principal component analysis), and
• nonlinear and parameter-varying system identification.
The maximum likelihood estimation principle applied to these nonlinear models leads to nonconvex optimization problems and yields inconsistent estimators in the errors-in-variables (measurement errors) setting. We propose a computationally cheap and statistically consistent estimator based on a bias correction procedure, called Adjusted Least-Squares Estimation. The method has been used successfully for conic section fitting and was recently generalized to algebraic curve fitting. The contribution of this book chapter is the application of the polynomially structured low-rank approximation problem and, in particular, of the adjusted least-squares method to subspace clustering and to nonlinear and parameter-varying system identification. The classical input-output notion of a dynamical model in system identification is replaced by the behavioral definition of a model as a set, represented by implicit nonlinear difference equations.
Keywords  Structured low-rank approximation · Conic section fitting · Subspace clustering · Nonlinear system identification

I. Markovsky · K. Usevich
Department ELEC, Vrije Universiteit Brussel, Pleinlaan 2, Building K, B-1050 Brussels, Belgium
e-mail: [email protected]

K. Usevich
e-mail: [email protected]

© Springer International Publishing Switzerland 2014
Y. Fu (ed.), Low-Rank and Sparse Modeling for Visual Analysis, DOI 10.1007/978-3-319-12000-3_1

1 Introduction

Data modeling, missing data estimation, and dimensionality reduction problems are closely related to the problem of approximating a given matrix by another matrix of reduced rank. Apart from the approximation criterion and the desired rank, the low-rank approximation problem involves additional constraints that represent prior knowledge about the to-be-estimated "true" data generating system. Common examples are non-negativity and structure (e.g., Hankel, Toeplitz, and Sylvester) of the approximation matrix.

The reduced rank of the approximation matrix corresponds to the reduction of dimensionality, as well as to the reduction of the model complexity in data modeling. In linear time-invariant system identification, for example, the rank of the data matrix is related to the order of the model. By the Eckart-Young-Mirsky theorem [6], the unstructured reduced-rank approximation that is optimal in the spectral and Frobenius norms is obtained from the truncated singular value decomposition of the matrix. With a few exceptions, this result has not been generalized to structured approximation problems and weighted approximation criteria. For structured weighted approximation problems, convex relaxations as well as local optimization methods have been developed, see [18].

In this book chapter, we consider the low-rank approximation problem with the constraint that the rank-deficient matrix is polynomially structured. Formally, the polynomially structured low-rank approximation problem is defined as follows.
Given a data matrix D, an approximation criterion ‖D − D̂‖, a polynomial mapping Φ : D̂ ↦ D̂_ext, and an upper bound r on the rank,

    minimize over D̂    ‖D − D̂‖
    subject to    rank(Φ(D̂)) ≤ r.        (1)

The polynomially structured low-rank approximation problem (1) has applications in
• curve fitting [18, Chap. 6],
• manifold learning [16, 31],
• subspace clustering [28], and
• nonlinear system identification [26, 27].

The simplest special case of nonlinear curve fitting is conic section fitting, which leads to low-rank approximation with a quadratic structure constraint, see Sects. 2.1 and 3. More involved is the application to subspace clustering, which is low-rank approximation with Veronese structure of the approximation and an additional (factorizability) condition on the kernel.

As an optimization problem, (1) is nonconvex. Contrary to affine structured low-rank approximation problems (see [17, 19] for an overview of recent results on this problem), (1) does not allow the approximation matrix D̂ to be eliminated analytically via the variable projection method [9]. Therefore, the number of optimization variables is of the order of magnitude of the number of data points. This makes the use of local optimization methods infeasible for medium- to large-scale polynomially structured low-rank approximation problems.

The principal component analysis method [11, 12] is closely related to low-rank approximation. Principal component analysis gives a stochastic interpretation of deterministic low-rank approximation; vice versa, low-rank approximation is the deterministic optimization problem resulting from the principal component analysis method. Nonlinearly structured low-rank approximation problems are considered in the principal component analysis context under the names of principal curves [10] and kernel principal component analysis [1, Chap. 12], [23].
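To make the conic-section special case of problem (1) concrete, here is a minimal NumPy sketch (an illustration of mine, not code from the chapter): each point d = (x, y) is lifted to the monomial vector φ(d) = (x², xy, y², x, y, 1), a conic θᵀφ(d) = 0 corresponds to a kernel vector of the lifted matrix Φ(D), and exact data makes Φ(D) rank deficient, i.e., rank(Φ(D)) ≤ 5.

```python
import numpy as np

def phi(points):
    """Quadratic lifting: one row [x^2, xy, y^2, x, y, 1] per point."""
    x, y = points[:, 0], points[:, 1]
    return np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])

# Exact data: 20 points on the unit circle x^2 + y^2 - 1 = 0.
t = np.linspace(0.0, 2 * np.pi, 20, endpoint=False)
D = np.column_stack([np.cos(t), np.sin(t)])

P = phi(D)
# The lifted matrix is rank deficient on exact data ...
assert np.linalg.matrix_rank(P) == 5
# ... and its kernel vector (last right singular vector) recovers the
# conic coefficients theta = (1, 0, 1, 0, 0, -1) up to scale.
_, _, Vt = np.linalg.svd(P)
theta = Vt[-1] / Vt[-1][0]
assert np.allclose(theta, [1, 0, 1, 0, 0, -1], atol=1e-6)
```

With noisy points, Φ(D) generically has full rank, and enforcing the rank constraint while keeping the quadratic structure of the lifted matrix is exactly the nonconvex problem (1).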
The kernel principal component analysis method is unstructured low-rank approximation of the matrix Φ(D), i.e., it does not impose the polynomial structure on the approximating matrix.

We adopt the errors-in-variables stochastic model, i.e., the given data is obtained from true data, satisfying a true data generating model, plus additive noise, see [3]. The following noise properties are assumed throughout the chapter:
• zero mean,
• independent samples,
• Gaussian distribution, and
• known covariance matrix up to a scaling factor.

The solution of the polynomially structured low-rank approximation problem (1) is a maximum likelihood estimator in the errors-in-variables setting. It is well known, see, e.g., [15], that the maximum likelihood estimator is inconsistent in nonlinear errors-in-variables estimation problems.

The method proposed in this book chapter is a generalization of the adjusted least squares method of [14, 20], developed for ellipsoid fitting. The adjustment procedure is motivated by the idea of correcting for the bias of the unstructured low-rank approximation method. The bias correction is given explicitly in terms of the noise variance, and a procedure for estimating the noise variance is proposed. A generalization of the adjusted least squares method to algebraic curve fitting is described in [18, Chap. 6]. In this contribution, we show that polynomially structured low-rank approximation problems appear naturally in subspace clustering and nonlinear system identification, so that the adjusted least squares algorithm is a promising estimation method in these application areas as well.

1.1 Outline

In Sect. 2, we start with an overview of the application of polynomially structured low-rank approximation to conic section fitting, subspace clustering, and nonlinear
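The bias-correction idea behind adjusted least squares can be sketched for the conic case, assuming i.i.d. Gaussian noise of known variance σ² in each coordinate. The naive (unadjusted) method works with the moment matrix Φ(D̃)ᵀΦ(D̃) of the noisy data, whose expectation is biased; replacing each power t^p by an adjusted polynomial h_p with E[h_p(t + ε)] = t^p removes the bias entry by entry. The code below is my hedged illustration of this construction, not the chapter's implementation:

```python
import numpy as np

# Exponents (a, b) of the monomials x^a y^b in
# phi(d) = [x^2, xy, y^2, x, y, 1].
EXPO = [(2, 0), (1, 1), (0, 2), (1, 0), (0, 1), (0, 0)]

def h(p, t, s2):
    """Adjusted powers: E[h_p(t + e)] = t^p for e ~ N(0, s2).
    Degrees up to 4 suffice for products of quadratic monomials."""
    return {0: np.ones_like(t),
            1: t,
            2: t**2 - s2,
            3: t**3 - 3 * s2 * t,
            4: t**4 - 6 * s2 * t**2 + 3 * s2**2}[p]

def als_matrix(points, s2):
    """Bias-corrected moment matrix: entry (j, k) is an unbiased
    estimate of sum_i phi_j(d_i) phi_k(d_i) for the noise-free points,
    using independence of the noise in the x and y coordinates."""
    x, y = points[:, 0], points[:, 1]
    Psi = np.empty((6, 6))
    for j, (a1, b1) in enumerate(EXPO):
        for k, (a2, b2) in enumerate(EXPO):
            Psi[j, k] = np.sum(h(a1 + a2, x, s2) * h(b1 + b2, y, s2))
    return Psi

def phi(points):
    x, y = points[:, 0], points[:, 1]
    return np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])

# Sanity check: with zero noise variance the correction vanishes,
# so the adjusted matrix equals the plain moment matrix Phi^T Phi.
rng = np.random.default_rng(1)
D = rng.standard_normal((30, 2))
assert np.allclose(als_matrix(D, 0.0), phi(D).T @ phi(D))
```

The adjusted conic estimate is then the eigenvector of the corrected matrix for its smallest eigenvalue; because the correction cancels the noise-induced bias in expectation, the estimator is consistent as the number of points grows, unlike the maximum likelihood estimator discussed above.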
