Sparse Optimization Theory and Methods
297 pages · 2018 · 2.455 MB · English
by Yun-Bin Zhao

Sparse Optimization Theory and Methods
Yun-Bin Zhao
School of Mathematics, University of Birmingham, Edgbaston, UK

A SCIENCE PUBLISHERS BOOK

CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742

© 2018 by Taylor & Francis Group, LLC
CRC Press is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works
Printed on acid-free paper
Version Date: 20180514
International Standard Book Number-13: 978-1-138-08094-2 (Hardback)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users.
For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com and the CRC Press Web site at http://www.crcpress.com

To my daughter Jessie and my wife Lily for their constant support, and to the memory of my mother Suzhi Liu

Preface

The theory and algorithms for many practical problems, ranging from signal and image processing, compressed sensing, statistical regression and computer vision to machine learning, have witnessed significant new developments under the following hypothesis: the unknown data to be processed is sparse or can be sparsely approximated. Under this fundamental hypothesis, many problems can be formulated as a sparse optimization problem, which seeks the sparsest solution of an underdetermined linear system or the sparsest point in a convex set. Sparsity has long been exploited in signal and image processing as well as in the statistics and learning communities. However, the rapid development of these fields through the exploitation of sparsity started only around 2004. Nowadays, seeking sparsity has become a common request in many scientific areas. Lying at the crossroads of applied mathematics, operations research, electrical engineering, computer science and information science, sparse optimization has attracted considerable cross-disciplinary attention and has stimulated a plethora of new applications of sparsity in numerous fields, such as geophysical data analysis, medical image processing, communications, sensor networks, and computational biology, to name a few. The aim of this book is to introduce the theory and algorithms for typical sparse optimization problems arising in compressed sensing and Big Data processing.
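The core problem the preface describes, finding the sparsest solution of an underdetermined system Ax = b, is commonly attacked through its ℓ1-relaxation ("basis pursuit"), which becomes a linear program after splitting x = u - v with u, v ≥ 0, so that ||x||_1 = sum(u) + sum(v). The following is a minimal sketch, not taken from the book, using SciPy's linprog; the problem sizes and random data are illustrative choices only:

```python
# Sketch: l1-minimization (basis pursuit) for an underdetermined system
#   min ||x||_1  subject to  Ax = b
# recast as a linear program over (u, v) with x = u - v, u >= 0, v >= 0.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 20, 50, 3               # m < n (underdetermined), k-sparse ground truth
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
b = A @ x_true

c = np.ones(2 * n)                # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])         # equality constraint: A(u - v) = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))

x_hat = res.x[:n] - res.x[n:]     # recover x from the split variables
print("residual:", np.linalg.norm(A @ x_hat - b))
print("l1 norm of solution:", np.abs(x_hat).sum())
```

The computed x_hat is a feasible point whose ℓ1 norm is no larger than that of the sparse ground truth; conditions under which it coincides with the sparsest solution are exactly the subject of Chapters 2 and 3.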
Sparse optimization is a fast-moving and much-researched area. It is almost impossible for a single monograph to cover every aspect and every piece of work in this rapidly growing field. This book is an introduction to sparse signal recovery from a mathematical optimization perspective. Although compressed sensing was developed with a number of mathematical tools, including linear algebra, optimization, approximation, random matrix theory, probability, functional analysis, harmonic analysis and graph theory, the only prior knowledge required for this book is basic linear algebra, convex analysis and optimization. The reader could be a researcher, graduate or postgraduate student in the areas of operations research, applied mathematics, numerical analysis, computer science, data science, signal and image processing, and machine learning.

This book contains eight chapters, and each chapter covers a relatively independent topic. However, the classic optimality conditions, including the duality and complementary slackness properties of convex optimization, are used as the common tool throughout the book to carry out both the theoretical and algorithmic developments for sparse optimization problems.
Yun-Bin Zhao
Birmingham, United Kingdom

Contents

Dedication
Preface

1 Uniqueness of the Sparsest Solution of Linear Systems
1.1 Introduction
1.2 Spark
1.3 Uniqueness via Mutual Coherence
1.4 Improved Uniqueness Criteria via Coherence Rank
1.4.1 Sub-Mutual Coherence and Coherence Rank
1.4.2 Improved Lower Bounds of Spark(A)
1.4.3 Improved Coherence Conditions
1.5 Babel Function and Sub-Babel Function
1.6 Notes

2 Uniqueness of Solutions to ℓ1-Minimization Problems
2.1 Strict Complementary Slackness Property (SCSP)
2.2 Least ℓ1-Norm Solution
2.2.1 Preliminary
2.2.2 Necessary Condition (I): Range Space Property of A^T
2.2.3 Necessary Condition (II): Full-Column-Rank Property
2.2.4 Sufficient Condition
2.2.5 Uniqueness Characterization
2.3 Least ℓ1-Norm Non-negative Solution
2.4 Least ℓ1-Norm Points in Polyhedra
2.4.1 Restricted Range Space Property of A^T
2.4.2 Proof of Necessity
2.4.3 Proof of Sufficiency
2.5 Notes

3 Equivalence of ℓ0- and ℓ1-Minimization
3.1 Equivalence and Strong Equivalence
3.2 Standard ℓ0- and ℓ1-Minimization Problems
3.3 Problems with Non-negative Constraints
3.4 Application to Linear Programming
3.5 Equivalence of ℓ0-Problem and Weighted ℓ1-Problem
3.6 Sparse Vector Recovery
3.6.1 Uniform Recovery: RSP-Based Analysis
3.6.2 Beyond Uniform Recovery
3.7 Sparse Non-negative Vector Recovery
3.7.1 Uniform Recovery: RSP+-Based Analysis
3.7.2 Non-uniform Recovery of Non-negative Vectors
3.8 Notes

4 1-Bit Compressed Sensing
4.1 Introduction
4.2 Sign Measurements and Recovery Criteria
4.3 Relaxation Models
4.4 Consistency Condition
4.4.1 Nonstandard Sign Function
4.4.2 Standard Sign Function
4.5 Reformulation of 1-Bit Compressed Sensing
4.6 Non-uniform Sign Recovery
4.7 Uniform Sign Recovery
4.8 Notes

5 Stability of Linear Sparse Optimization Methods
5.1 Introduction
5.2 Hoffman's Error Bound for Linear Systems
5.3 Weak RSP of Order k of A^T
5.4 Stability of Standard ℓ1-Minimization
5.5 Linear Dantzig Selector
5.6 Special Cases
5.6.1 Standard Dantzig Selector
5.6.2 Weighted Dantzig Selector
5.6.3 ℓ1-Minimization with ℓ∞-Norm Constraints
5.6.4 ℓ1-Minimization with ℓ1-Norm Constraints
5.7 Notes

6 Stability of Nonlinear Sparse Optimization Methods
6.1 Introduction
6.2 Orthogonal Projection Operator
6.3 Polytope Approximation of Unit Balls
6.3.1 ℓ2-Ball
6.3.2 General Unit Ball
6.4 A Necessary Condition for Stability
6.5 ℓ1-Minimization with ℓ2-Norm Constraints
6.6 Nonlinear Dantzig Selector
6.7 The LASSO Problem
6.8 Summary
6.9 Notes

7 Reweighted ℓ1-Algorithms
7.1 Merit Function for Sparsity
7.1.1 M(1)-Class Merit Functions
7.1.2 M(2)-Class Merit Functions
7.2 Reweighted ℓ1-Methods
7.2.1 Examples of Weights
7.3 Numerical Experiments
7.4 Theoretical Analysis
7.4.1 Well-Definedness of Algorithms
7.4.2 Convergence to Sparse Points
7.5 Summary
7.6 Notes

8 Sparsity via Dual Density
8.1 Introduction
8.2 ℓ0-Minimization with Non-negative Constraints
8.2.1 Optimal Weight via Dual Density
8.2.2 Dual-Density-Based Reweighted ℓ1-Algorithm
8.2.3 Numerical Experiments
8.2.4 Theoretical Performance
8.3 DDRW for Standard ℓ0-Minimization
8.4 Sparsity Enhancement for Weighted ℓ1-Minimizers
8.5 Notes

References
Index
