
Separable Optimization: Theory and Methods

360 Pages·2021·4.364 MB·English

Preview Separable Optimization: Theory and Methods

Springer Optimization and Its Applications, Volume 177

Stefan M. Stefanov, Separable Optimization: Theory and Methods, Second Edition

Series Editors: Panos M. Pardalos, University of Florida, Gainesville, FL, USA; My T. Thai, University of Florida, Gainesville, FL, USA

Honorary Editor: Ding-Zhu Du, University of Texas at Dallas, Dallas, TX, USA

Advisory Editors: Roman V. Belavkin, Middlesex University, London, UK; John R. Birge, University of Chicago, Chicago, IL, USA; Sergiy Butenko, Texas A&M University, College Station, TX, USA; Vipin Kumar, University of Minnesota, Minneapolis, MN, USA; Anna Nagurney, University of Massachusetts, Amherst, MA, USA; Jun Pei, Hefei University of Technology, Hefei, China; Oleg Prokopyev, University of Pittsburgh, Pittsburgh, PA, USA; Steffen Rebennack, Karlsruhe Institute of Technology, Karlsruhe, Germany; Mauricio Resende, Amazon, Seattle, WA, USA; Tamás Terlaky, Lehigh University, Bethlehem, PA, USA; Van Vu, Yale University, New Haven, CT, USA; Michael N. Vrahatis, University of Patras, Patras, Greece; Guoliang Xue, Arizona State University, Tempe, AZ, USA; Yinyu Ye, Stanford University, Stanford, CA, USA

Aims and Scope

Optimization has continued to expand in all directions at an astonishing rate. New algorithmic and theoretical techniques are continually developing and the diffusion into other disciplines is proceeding at a rapid pace, with a spotlight on machine learning, artificial intelligence, and quantum computing. Our knowledge of all aspects of the field has grown even more profound. At the same time, one of the most striking trends in optimization is the constantly increasing emphasis on the interdisciplinary nature of the field. Optimization has been a basic tool in areas not limited to applied mathematics, engineering, medicine, economics, computer science, operations research, and other sciences.
The series Springer Optimization and Its Applications (SOIA) aims to publish state-of-the-art expository works (monographs, contributed volumes, textbooks, handbooks) that focus on theory, methods, and applications of optimization. Topics covered include, but are not limited to, nonlinear optimization, combinatorial optimization, continuous optimization, stochastic optimization, Bayesian optimization, optimal control, discrete optimization, multi-objective optimization, and more. New to the series portfolio are works at the intersection of optimization and machine learning, artificial intelligence, and quantum computing. Volumes from this series are indexed by Web of Science, zbMATH, Mathematical Reviews, and SCOPUS.

More information about this series at http://www.springer.com/series/7393

Stefan M. Stefanov
Department of Mathematics
Faculty of Mathematics and Natural Sciences
South-West University Neofit Rilski
Blagoevgrad, Bulgaria

ISSN 1931-6828 / ISSN 1931-6836 (electronic)
Springer Optimization and Its Applications
ISBN 978-3-030-78400-3 / ISBN 978-3-030-78401-0 (eBook)
https://doi.org/10.1007/978-3-030-78401-0
Mathematics Subject Classification: 90-02, 90C08, 90C15, 90C25, 90C30

Originally published with the title: Separable Programming
1st edition: © Springer Science+Business Media Dordrecht 2001
2nd edition: © The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021

This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc.
in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland.

To my parents, in memoriam, and to my sister Krassimira

Preface to the Second Edition

Twenty years have passed since the first edition of the book Separable Programming: Theory and Methods. During this period of time, many works in the area of separable optimization have appeared. This is the motivation for preparing a new edition of this book. Some chapters of the original edition are revised and supplemented with new sections, and three new chapters and three new appendices are included in this edition.

New results of numerical experiments are included in Sect. 11.2 of Chap. 11 and at the end of Sect. 13.3 of Chap. 13.

In Chap. 12—Applications of Convex Separable Unconstrained Nondifferentiable Optimization to Approximation Theory—the approach, considered in Sects. 12.1–12.3, is extended to the case of numerical solution of some systems of nonlinear algebraic equations (new Sect. 12.4) and systems of nonlinear equations defined by convex functions (new Sect. 12.5). Section 12.4 of the original edition is renumbered as Sect. 12.6 and revised with some new computational results.

Chapter 14—Valid Inequalities, Cutting Planes, and Integrality of the Knapsack Polytope—is supplemented with results concerning valid inequalities generation (new Sects.
14.1 and 14.2), and the original text of this chapter concerning integrality of the knapsack polytope of the first edition constitutes now Sect. 14.3, complemented with the usage of totally unimodular matrices.

New Chap. 15—Relaxation of the Equality Constrained Convex Continuous Knapsack Problem—is devoted to further characterization of the optimal solution to the problem (C=) of Chap. 6, defined by a convex separable objective function subject to a linear equality constraint and box constraints, through the optimal solution of a relaxed version of this problem.

In Chap. 16—On the Solution of the Multidimensional Convex Separable Continuous Knapsack Problem with Bounded Variables—a necessary and sufficient optimality condition (characterization theorem) for a feasible solution to be an optimal solution to the so-called multidimensional knapsack problem with bounded variables is formulated and proved, along with a primal-dual analysis.

In Chap. 17—Characterization of the Optimal Solution of the Convex Generalized Nonlinear Transportation Problem—the convex generalized nonlinear transportation problem and the convex generalized nonlinear transportation problem with box constraints are considered. A necessary and sufficient condition for a feasible solution to be an optimal solution for each of these two problems is stated and proved.

Optimization problems and methods for solving these problems, considered in this book, are interesting not only from the viewpoint of optimization theory, optimization methods, and their applications but also from the viewpoint of other fields of science, especially computer science. Artificial intelligence, machine learning, and other areas of computer science use various optimization methods, for example, convex optimization methods, large-scale optimization methods, discrete (including integer) optimization methods, stochastic gradient descent methods (including stochastic quasigradient methods), etc.
In particular, machine learning algorithms can be formulated as optimization algorithms. These applications have revived interest in optimization methods in the past few years.

In the new Appendix E, the solvability of a quadratic optimization problem with a feasible region defined as a Minkowski sum of a compact set and a finitely generated convex closed cone is discussed. In Appendix F, the Cauchy–Schwarz inequality, combined with properties of induced matrix norms, is used for solving a quadratic optimization problem. In Appendix G, statements and proofs of some theorems of the alternative are presented.

Some minor revisions are made in the text of some chapters of the original edition.

The bibliography section is partially updated with new titles and new editions of some of the originally included reference titles. Bibliographical notes after Chap. 1, Parts I, II, and III are updated accordingly.

I am grateful to the Series Editors of Springer Optimization and Its Applications, to the three anonymous reviewers, as well as to Elizabeth Loew of Springer for their help in preparing this edition for publication.

Blagoevgrad, Bulgaria
March 2021
Stefan M. Stefanov

Preface to the First Edition

Mathematical optimization (mathematical programming) deals with the problem of optimizing (minimizing or maximizing) a function called the objective function subject to equality and/or inequality constraints that are defined by functions called the constraint functions.

In this book we consider a branch of mathematical programming – separable programming, where the objective function and the constraint functions can be expressed as the sum of single-variable functions. Such functions are said to be separable. Due to separability, separable programs have some interesting properties and can be solved by specific methods. Furthermore, many economic, industrial and other problems are described mathematically by separable programs. Thus, separable programming is significant from both the theoretical and practical points of view.
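In symbols (a generic sketch, not the book's own numbering or notation), a separable program has the form

```latex
\min_{x \in \mathbb{R}^n} \; \sum_{j=1}^{n} f_j(x_j)
\quad \text{subject to} \quad
\sum_{j=1}^{n} g_{ij}(x_j) \le b_i , \qquad i = 1, \dots, m ,
```

where each $f_j$ and $g_{ij}$ is a function of the single variable $x_j$ only.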
We also study one of the most important cases of separable programming – convex separable programming, where the objective function and the constraint functions are convex. Convex functions have many special properties; for example, any local minimum of a convex function over a convex set is also a global minimum, and optimality criteria for convex programs are both necessary and sufficient conditions.

Some general results for separable programming are presented; techniques of approximating the original separable problem by a linear program, and the simplex method with the restricted basis entry rule for solving the resulting linear program, are discussed. Due to convexity, linear programs that approximate convex separable problems can be solved by the standard simplex method, discarding the restricted basis entry rule. Because we solve the original separable problem by solving the approximate problem, methods employing this approach are approximate. Some error estimations of the approximating procedure for the convex separable problems are presented.

The dynamic programming approach to separable programs is also considered and some separable inventory and other models are reviewed.

The second part of this book is devoted to some special convex separable programs – minimization problems with a convex and separable objective function over a feasible region defined by a separable convex inequality constraint of the form "≤" / linear equality constraint / linear inequality constraint of the form "≥", and bounds on the variables. The three problems are denoted by (C), (C=) and (C≥), respectively. A problem (C=_m), which is a generalization of (C=) with m equality constraints, is also considered.

These problems have been the subject of intensive study in the last 35–40 years because they are interesting from both the theoretical and practical points of view.
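The piecewise-linear approximation underlying this linear-programming approach can be illustrated in a few lines. This is a minimal sketch, not the book's implementation: the example function f(x) = x², the grid of breakpoints, and the helper name `pwl_value` are all illustrative choices.

```python
# Piecewise-linear approximation of a convex single-variable function --
# the building block for replacing a convex separable program by a linear
# program: on each segment [a, b], f is replaced by the chord through
# (a, f(a)) and (b, f(b)), expressed with convex weights.

def pwl_value(f, grid, x):
    """Evaluate the piecewise-linear interpolant of f on `grid` at x."""
    for a, b in zip(grid, grid[1:]):
        if a <= x <= b:
            lam = (b - x) / (b - a)      # convex weight on the left endpoint
            return lam * f(a) + (1 - lam) * f(b)
    raise ValueError("x lies outside the grid")

f = lambda t: t * t                      # convex example function (my choice)
grid = [0.0, 0.5, 1.0, 1.5, 2.0]         # breakpoints of the approximation

# For convex f the chord overestimates f between breakpoints, and the
# interpolant is exact at the breakpoints; refining the grid shrinks the error.
print(pwl_value(f, grid, 0.25))          # 0.125, versus f(0.25) = 0.0625
print(pwl_value(f, grid, 1.0))           # 1.0, exact at a breakpoint
```

Refining `grid` reduces the gap between the interpolant and f, which is the source of the error estimations mentioned above.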
Problems of these and related types arise in many cases, for example, in production planning and scheduling, in allocation of financial resources, in allocation of promotional resources among competing activities, in the theory of search, in subgradient optimization, in facility location, in the implementation of projection methods when the feasible region has the same form as the feasible sets under consideration, etc. That is why we need results and effective methods for solving such problems. In this book, new iterative algorithms of polynomial complexity for problems (C), (C=) and (C≥) are proposed.

Some applications of these algorithms are presented, in particular, application to stochastic quasigradient methods. Numerical approximations with respect to ℓ1- and ℓ∞-norms, as convex separable unconstrained nondifferentiable programs, are also considered.

This book consists of fourteen chapters, combined into three parts, and four appendices.

In Chapter One – Preliminaries: Convex Analysis and Convex Programming – we give some definitions and results connected with convex analysis (convex sets, projection of a point onto a set, separation of sets, convex functions, subgradients, subdifferentials, directional derivatives, etc.), convex programming and the Lagrangian duality. In Part Two, these concepts and results are utilized in developing suitable optimality conditions and numerical methods for solving convex separable problems. Chapter One can also be used as an introductory text in convex analysis and convex programming.

Chapters Two, Three and Four constitute Part One – Separable Programming. In Chapter Two – Introduction. Approximating the Separable Problem – a separable nonlinear program is defined, techniques of approximating the original separable problem by a linear program are discussed, and the restricted basis entry rule of the simplex method for solving the linear approximating problem is considered. In Chapter Three, some results for the convex separable case are presented.
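To make the (C=)-type problems concrete, here is a sketch of one classical textbook approach for a quadratic instance: bisection on the Lagrange multiplier of the equality constraint. The instance, the function name `knapsack_eq`, and the tolerance are my own illustrative choices; the book's polynomial-complexity algorithms are not necessarily this method.

```python
# Convex separable continuous knapsack problem of the (C=) type:
#   minimize   sum_j (x_j - y_j)**2 / 2
#   subject to sum_j x_j = b,   l_j <= x_j <= u_j.
# The KKT conditions give x_j = clip(y_j - lmb, l_j, u_j) for a scalar
# multiplier lmb, and sum_j x_j(lmb) is nonincreasing in lmb, so the
# multiplier can be found by bisection.

def knapsack_eq(y, l, u, b, tol=1e-10):
    def x_of(lmb):
        return [min(max(yj - lmb, lj), uj) for yj, lj, uj in zip(y, l, u)]

    lo = min(yj - uj for yj, uj in zip(y, u))   # here all x_j hit u_j
    hi = max(yj - lj for yj, lj in zip(y, l))   # here all x_j hit l_j
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if sum(x_of(mid)) > b:
            lo = mid        # too much total mass: raise the multiplier
        else:
            hi = mid
    return x_of((lo + hi) / 2)

x = knapsack_eq(y=[1.0, 2.0, 3.0], l=[0.0, 0.0, 0.0], u=[2.0, 2.0, 2.0], b=3.0)
print(x)  # sums to b = 3 and respects the box constraints
```

The clipped form of the solution is exactly the kind of closed-form characterization (via a characterization theorem for the multiplier) that the book develops for (C), (C=) and (C≥).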
Chapter Four – Separable Programming: A Dynamic Programming Approach – is devoted to the dynamic programming approach to separable programming, the problem of dimensionality in dynamic programming (the curse of dimensionality, R. Bellman), and application of the Lagrange multiplier method for reducing the dimensionality of the problem. A dynamic programming approach to transportation
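The stage-by-stage dynamic programming recursion for a separable objective can be sketched as follows. This is a generic illustration under my own assumptions (integer allocations, a single budget constraint, a made-up instance), not an example taken from Chapter Four.

```python
# Dynamic programming for a separable allocation problem:
#   maximize   f_1(x_1) + ... + f_n(x_n)
#   subject to x_1 + ... + x_n <= B,  x_j nonnegative integers.
# One variable is treated per stage:
#   V_j(r) = max over x_j in {0..r} of  f_j(x_j) + V_{j-1}(r - x_j).

import math

def dp_allocate(fs, B):
    """fs: list of single-variable stage functions; B: integer budget."""
    V = [0.0] * (B + 1)          # V_0(r) = 0: no stages left
    choice = []                  # choice[j][r]: optimal x_j with budget r left
    for f in fs:
        newV = [0.0] * (B + 1)
        best = [0] * (B + 1)
        for r in range(B + 1):
            newV[r], best[r] = max((f(x) + V[r - x], x) for x in range(r + 1))
        V = newV
        choice.append(best)
    # Walk the stages backwards to recover the optimal allocation.
    x, r = [], B
    for best in reversed(choice):
        x.append(best[r])
        r -= best[r]
    x.reverse()
    return V[B], x

# Concave (diminishing-returns) stage functions, budget B = 4.
value, x = dp_allocate([math.sqrt, lambda t: 2 * math.sqrt(t), math.sqrt], 4)
print(value, x)  # optimal allocation [1, 2, 1] with value 2 + 2*sqrt(2)
```

The table `V` has one entry per remaining-budget value per stage; with several constraints the state becomes a vector of residual budgets, which is the curse of dimensionality that the Lagrange multiplier method mentioned above is used to mitigate.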
