Introductory Lectures on Convex Optimization: A Basic Course PDF

253 Pages·2004·15.901 MB·English

INTRODUCTORY LECTURES ON CONVEX OPTIMIZATION
A Basic Course

Applied Optimization, Volume 87
Series Editors: Panos M. Pardalos, University of Florida, U.S.A.; Donald W. Hearn, University of Florida, U.S.A.

By Yurii Nesterov
Center for Operations Research and Econometrics (CORE), Université Catholique de Louvain (UCL), Louvain-la-Neuve, Belgium

Springer Science+Business Media, LLC

Library of Congress Cataloging-in-Publication
Nesterov, Yurii. Introductory Lectures on Convex Optimization: A Basic Course
ISBN 978-1-4613-4691-3
ISBN 978-1-4419-8853-9 (eBook)
DOI 10.1007/978-1-4419-8853-9

Copyright © 2004 by Springer Science+Business Media New York. Originally published by Kluwer Academic Publishers in 2004. Softcover reprint of the hardcover 1st edition 2004.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, microfilming, recording, or otherwise, without the prior written permission of the publisher, with the exception of any material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work.

Permissions for books published in the USA: permissions@wkap.com
Permissions for books published in Europe: [email protected]

Printed on acid-free paper.

Contents

Preface
Acknowledgments
Introduction

1. NONLINEAR OPTIMIZATION
   1.1 World of nonlinear optimization
      1.1.1 General formulation of the problem
      1.1.2 Performance of numerical methods
      1.1.3 Complexity bounds for global optimization
      1.1.4 Identity cards of the fields
   1.2 Local methods in unconstrained minimization
      1.2.1 Relaxation and approximation
      1.2.2 Classes of differentiable functions
      1.2.3 Gradient method
      1.2.4 Newton method
   1.3 First-order methods in nonlinear optimization
      1.3.1 Gradient method and Newton method: What is different?
      1.3.2 Conjugate gradients
      1.3.3 Constrained minimization

2. SMOOTH CONVEX OPTIMIZATION
   2.1 Minimization of smooth functions
      2.1.1 Smooth convex functions
      2.1.2 Lower complexity bounds for F_L^{∞,1}(R^n)
      2.1.3 Strongly convex functions
      2.1.4 Lower complexity bounds for S_{μ,L}^{∞,1}(R^n)
      2.1.5 Gradient method
   2.2 Optimal Methods
      2.2.1 Optimal methods
      2.2.2 Convex sets
      2.2.3 Gradient mapping
      2.2.4 Minimization methods for simple sets
   2.3 Minimization problem with smooth components
      2.3.1 Minimax problem
      2.3.2 Gradient mapping
      2.3.3 Minimization methods for minimax problem
      2.3.4 Optimization with functional constraints
      2.3.5 Method for constrained minimization

3. NONSMOOTH CONVEX OPTIMIZATION
   3.1 General convex functions
      3.1.1 Motivation and definitions
      3.1.2 Operations with convex functions
      3.1.3 Continuity and differentiability
      3.1.4 Separation theorems
      3.1.5 Subgradients
      3.1.6 Computing subgradients
   3.2 Nonsmooth minimization methods
      3.2.1 General lower complexity bounds
      3.2.2 Main lemma
      3.2.3 Subgradient method
      3.2.4 Minimization with functional constraints
      3.2.5 Complexity bounds in finite dimension
      3.2.6 Cutting plane schemes
   3.3 Methods with complete data
      3.3.1 Model of nonsmooth function
      3.3.2 Kelley method
      3.3.3 Level method
      3.3.4 Constrained minimization

4. STRUCTURAL OPTIMIZATION
   4.1 Self-concordant functions
      4.1.1 Black box concept in convex optimization
      4.1.2 What the Newton method actually does?
      4.1.3 Definition of self-concordant function
      4.1.4 Main inequalities
      4.1.5 Minimizing the self-concordant function
   4.2 Self-concordant barriers
      4.2.1 Motivation
      4.2.2 Definition of self-concordant barriers
      4.2.3 Main inequalities
      4.2.4 Path-following scheme
      4.2.5 Finding the analytic center
      4.2.6 Problems with functional constraints
   4.3 Applications of structural optimization
      4.3.1 Bounds on parameters of self-concordant barriers
      4.3.2 Linear and quadratic optimization
      4.3.3 Semidefinite optimization
      4.3.4 Extremal ellipsoids
      4.3.5 Separable optimization
      4.3.6 Choice of minimization scheme

Bibliography
References
Index

Preface

It was in the middle of the 1980s when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization. The importance of this paper, containing a new polynomial-time algorithm for linear optimization problems, was not only in its complexity bound. At that time, the most surprising feature of this algorithm was that the theoretical prediction of its high efficiency was supported by excellent computational results. This unusual fact dramatically changed the style and directions of the research in nonlinear optimization. Thereafter it became more and more common that the new methods were provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments. In a new rapidly developing field, which got the name "polynomial-time interior-point methods", such a justification was obligatory. After almost fifteen years of intensive research, the main results of this development started to appear in monographs [12, 14, 16, 17, 18, 19].
Approximately at that time the author was asked to prepare a new course on nonlinear optimization for graduate students. The idea was to create a course which would reflect the new developments in the field. Actually, this was a major challenge. At the time only the theory of interior-point methods for linear optimization was polished enough to be explained to students. The general theory of self-concordant functions had appeared in print only once in the form of research monograph [12]. Moreover, it was clear that the new theory of interior-point methods represented only a part of a general theory of convex optimization, a rather involved field with the complexity bounds, optimal methods, etc. The majority of the latter results were published in different journals in Russian. The book you see now is a result of an attempt to present serious things in an elementary form.

As is always the case with a one-semester course, the most difficult problem is the selection of the material. For us the target notions were the complexity of the optimization problems and a provable efficiency of numerical schemes supported by complexity bounds. In view of a severe volume limitation, we had to be very pragmatic. Any concept or fact included in the book is absolutely necessary for the analysis of at least one optimization scheme. Surprisingly enough, none of the material presented requires any facts from duality theory. Thus, this topic is completely omitted. This does not mean, of course, that the author neglects this fundamental concept. However, we hope that for the first treatment of the subject such a compromise is acceptable.

The main goal of this course is the development of a correct understanding of the complexity of different optimization problems. This goal was not chosen by chance. Every year I meet Ph.D.
students of different specializations who ask me for advice on reasonable numerical schemes for their optimization models. And very often they seem to have come too late. In my experience, if an optimization model is created without taking into account the abilities of numerical schemes, the chances that it will be possible to find an acceptable numerical solution are close to zero. In any field of human activity, if we create something, we know in advance why we are doing so and what we are going to do with the result. And only in numerical modelling is the situation still different.

This course was given during several years at Université Catholique de Louvain (Louvain-la-Neuve, Belgium). The course is self-contained. It consists of four chapters (Nonlinear optimization, Smooth convex optimization, Nonsmooth convex optimization and Structural optimization (Interior-point methods)). The chapters are essentially independent and can be used as parts of more general courses on convex analysis or optimization. In our experience each chapter can be covered in three two-hour lectures. We assume a reader to have a standard undergraduate background in analysis and linear algebra. We provide the reader with short bibliographical notes which should help in a closer examination of the subject.

YURII NESTEROV
Louvain-la-Neuve, Belgium
May, 2003
