Johannes Jahn
Introduction to the Theory of Nonlinear Optimization
With 30 Figures
Springer-Verlag Berlin Heidelberg GmbH

Prof. Dr. Johannes Jahn
Universität Erlangen-Nürnberg
Institut für Angewandte Mathematik
Martensstraße 3
D-91058 Erlangen

ISBN 978-3-662-02987-9
ISBN 978-3-662-02985-5 (eBook)
DOI 10.1007/978-3-662-02985-5

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer-Verlag Berlin Heidelberg GmbH. Violations are liable for prosecution under the German Copyright Law.

© Springer-Verlag Berlin Heidelberg 1994
Originally published by Springer-Verlag Berlin Heidelberg New York Tokyo in 1994
Softcover reprint of the hardcover 1st edition 1994

The use of registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Printed on acid-free paper

Preface

This book presents an application-oriented introduction to the theory of nonlinear optimization. It describes basic notions and conceptions of optimization in the setting of normed or even Banach spaces. Various theorems are applied to problems in related mathematical areas. For instance, the Euler-Lagrange equation in the calculus of variations, the generalized Kolmogorov condition and the alternation theorem in approximation theory as well as the Pontryagin maximum principle in optimal control theory are derived from general results of optimization.
Because of the introductory character of this text it is not intended to give a complete description of all approaches in optimization. For instance, investigations on conjugate duality, sensitivity, stability, recession cones and other concepts are not included in the book. The bibliography gives a survey of books in the area of nonlinear optimization and related areas like approximation theory and optimal control theory. Important papers are cited as footnotes in the text.

I am grateful to S. Geuß, S. Gmeiner, S. Keck, Prof. Dr. E.W. Sachs and H. Winkler for their support, and I am especially indebted to D.G. Cunningham, Dr. F. Hettlich, Dr. J. Klose, Prof. Dr. E.W. Sachs and Dr. T. Staib for fruitful discussions.

Johannes Jahn

Contents

Preface
1 Introduction and Problem Formulation
2 Existence Theorems for Minimal Points
   2.1 Problem Formulation
   2.2 Existence Theorems
   2.3 Set of Minimal Points
   2.4 Application to Approximation Problems
   2.5 Application to Optimal Control Problems
   Exercises
3 Generalized Derivatives
   3.1 Directional Derivative
   3.2 Gâteaux and Fréchet Derivatives
   3.3 Subdifferential
   3.4 Quasidifferential
   3.5 Clarke Derivative
   Exercises
4 Tangent Cones
   4.1 Definition and Properties
   4.2 Optimality Conditions
   4.3 A Lyusternik Theorem
   Exercises
5 Generalized Lagrange Multiplier Rule
   5.1 Problem Formulation
   5.2 Necessary Optimality Conditions
   5.3 Sufficient Optimality Conditions
   5.4 Application to Optimal Control Problems
   Exercises
6 Duality
   6.1 Problem Formulation
   6.2 Duality Theorems
   6.3 Saddle Point Theorems
   6.4 Linear Problems
   6.5 Application to Approximation Problems
   Exercises
7 Direct Treatment of Special Optimization Problems
   7.1 Linear Quadratic Optimal Control Problems
   7.2 Time Minimal Control Problems
   Exercises
A Weak Convergence
B Reflexivity of Banach Spaces
C Hahn-Banach Theorem
D Partially Ordered Linear Spaces
Bibliography
Index

Chapter 1
Introduction and Problem Formulation

In optimization one investigates problems of the determination of a minimal point of a functional on a nonempty subset of a real linear space. To be more specific this means: Let X be a real linear space, let S be a nonempty subset of X, and let f : S → ℝ be a given functional. We ask for the minimal points of f on S. An element x̄ ∈ S is called a minimal point of f on S if

   f(x̄) ≤ f(x) for all x ∈ S.

The set S is also called the constraint set, and the functional f is called the objective functional.

In order to introduce optimization we present various typical optimization problems from applied mathematics. First we discuss a design problem from structural engineering.

Example 1.1. As a simple example consider the design of a beam with a rectangular cross-section and a given length l (see Fig. 1.1 and 1.2). The height x₁ and the width x₂ have to be determined.

[Figure 1.1: Longitudinal section. Figure 1.2: Cross-section.]

The design variables x₁ and x₂ have to be chosen in an area which makes sense in practice. A certain stress condition must be satisfied, i.e. the arising stresses cannot exceed a feasible stress. This leads to the inequality (1.1). Moreover, a certain stability of the beam must be guaranteed. In order to avoid a beam which is too slim we require the inequalities (1.2) and (1.3). Finally, the design variables should be nonnegative, which means (1.4) and (1.5). Among all feasible values for x₁ and x₂ we are interested in those which lead to a light construction.
Instead of the weight we can also take the volume of the beam, given as l·x₁·x₂, as a possible criterion (where we assume that the material is homogeneous). Consequently, we minimize l·x₁·x₂ subject to the constraints (1.1), ..., (1.5).

With the next example we present a simple optimization problem from the calculus of variations.

Example 1.2. In the calculus of variations one investigates, for instance, problems of minimizing a functional f given as

   f(x) = ∫ₐᵇ l(x(t), ẋ(t), t) dt

where −∞ < a < b < ∞ and l is argumentwise continuous and continuously differentiable with respect to x and ẋ. A simple problem of the calculus of variations is the following: Minimize f over the class of curves from

   S := {x ∈ C¹[a, b] | x(a) = x₁ and x(b) = x₂}

where x₁ and x₂ are fixed endpoints.

In control theory there are also many problems which can be formulated as optimization problems. A simple problem of this type is given in the following example.

Example 1.3. On the fixed time interval [0, 1] we investigate the linear system of differential equations

   ẋ₁(t) = x₂(t),  ẋ₂(t) = u(t),  t ∈ (0, 1),

with the initial condition

   (x₁(0), x₂(0)) = (8√2, −2√2).

With the aid of an appropriate control function u ∈ C[0, 1] this dynamical system should be steered from the given initial state to a terminal state in a given target set. In addition to this constraint a control function u minimizing the cost functional

   f(u) = ∫₀¹ (u(t))² dt

has to be determined.

Finally we discuss a simple problem from approximation theory.

Example 1.4. We consider the problem of the determination of a linear function which approximates the hyperbolic sine function on the interval [0, 2] with respect to the maximum norm in a best way (see Fig. 1.3).

[Figure 1.3: Best approximation of sinh on [0, 2]; the graphs of β = sinh α and β = xα with x ≈ 1.600233.]

So, we minimize

   max α∈[0,2] |αx − sinh α|.

This optimization problem can also be written as

   min λ subject to the constraints
   λ = max α∈[0,2] |αx − sinh α|,
   (x, λ) ∈ ℝ².

The preceding problem is equivalent to the following optimization problem which has infinitely many constraints:

   min λ subject to the constraints
   αx − sinh α ≤ λ   for all α ∈ [0, 2],
   αx − sinh α ≥ −λ  for all α ∈ [0, 2],
   (x, λ) ∈ ℝ².
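The best-approximation problem of Example 1.4 can be checked numerically. The sketch below is a plain two-level grid search, an illustration rather than a method developed in this book: for each candidate slope x it evaluates the worst-case error over [0, 2] and keeps the slope with the smallest such error. The search interval [1.5, 1.7] and the grid sizes are ad-hoc choices.

```python
# Numerical check of Example 1.4: find the slope x minimizing
#   max over alpha in [0, 2] of |x*alpha - sinh(alpha)|,
# i.e. the best maximum-norm linear approximation of sinh through
# the origin on [0, 2]. Plain grid search for illustration only;
# the search interval [1.5, 1.7] and grid sizes are ad-hoc choices.
import math

def worst_error(x, n=1001):
    """Maximum of |x*alpha - sinh(alpha)| over a uniform grid of [0, 2]."""
    return max(abs(x * (2.0 * k / (n - 1)) - math.sinh(2.0 * k / (n - 1)))
               for k in range(n))

def best_slope(lo=1.5, hi=1.7, n=801):
    """Slope in [lo, hi] with the smallest worst-case error (grid search)."""
    grid = [lo + (hi - lo) * k / (n - 1) for k in range(n)]
    return min(grid, key=worst_error)

x_star = best_slope()
print(x_star)               # close to the value 1.600233 shown in Fig. 1.3
print(worst_error(x_star))  # the minimal worst-case error, about 0.4264
```

At the optimal slope the error equioscillates: its maximum modulus is attained with opposite signs at an interior point and at the endpoint α = 2, which is the pattern behind the alternation theorem mentioned in the preface.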
