Description: The book presents a thorough development of the modern theory of stochastic approximation, or recursive stochastic algorithms, for both constrained and unconstrained problems. There is a complete development of both probability one and weak convergence methods for very general noise processes. The proofs of convergence use the ODE method, the most powerful to date, with which the asymptotic behavior is characterized by the limit behavior of a mean ODE. The assumptions and proof methods are designed to cover the needs of recent applications. The development proceeds from simple to complex problems, allowing the underlying ideas to be more easily understood. Rate of convergence, iterate averaging, high-dimensional problems, stability-ODE methods, two-time-scale, asynchronous, and decentralized algorithms, general correlated and state-dependent noise, perturbed test function methods, and large deviations methods are covered. Many motivational examples from learning theory, ergodic cost problems for discrete event systems, wireless communications, adaptive control, signal processing, and elsewhere illustrate the application of the theory. This second edition is a thorough revision; although the main features and the structure remain unchanged, it contains many additional applications and results, and more detailed discussion.

Harold J. Kushner is a University Professor and Professor of Applied Mathematics at Brown University. He has written numerous books and articles on virtually all aspects of stochastic systems theory, and has received various awards including the IEEE Control Systems Field Award. G. George Yin is a Professor of Mathematics at Wayne State University. His research interests focus on applied stochastic processes and stochastic systems theory; he has been a major contributor to stochastic approximation theory.
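For readers new to the subject, the following minimal sketch (not taken from the book; the target function g, the step-size schedule, and the noise model are illustrative assumptions) shows the kind of recursion the theory addresses and how the mean ODE governs its asymptotic behavior.

```python
import numpy as np

# Illustrative Robbins-Monro style recursion (hypothetical example, not code
# from the book):  theta_{n+1} = theta_n + eps_n * Y_n, where Y_n is a noisy
# observation of g(theta_n).  The ODE method characterizes the limit points
# of the iterates via the mean ODE  d(theta)/dt = g(theta).

rng = np.random.default_rng(0)

def g(theta):
    # Mean field whose root (theta = 2.0) the recursion should locate.
    return 2.0 - theta

theta = 10.0                       # arbitrary starting point
for n in range(1, 10_001):
    eps_n = 1.0 / n                # standard decreasing step sizes
    Y_n = g(theta) + rng.normal()  # noisy measurement of g(theta)
    theta += eps_n * Y_n

print(theta)  # close to 2.0, the stable equilibrium of theta' = g(theta)
```

Under suitable step-size and noise conditions, the iterates track solutions of the mean ODE, so the stable equilibrium at 2.0 is the limit point; this is the simplest unconstrained instance of the framework the book develops in far greater generality.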