Introduction to Nonlinear Optimization: Theory, Algorithms, and Applications with MATLAB

This book provides the foundations of the theory of nonlinear optimization as well as some related algorithms, and presents a variety of applications from diverse areas of the applied sciences. The author combines three pillars of optimization: theoretical and algorithmic foundations, familiarity with various applications, and the ability to apply the theory and algorithms to actual problems. The book rigorously and gradually builds the connection between theory, algorithms, applications, and implementation. Readers will find more than 170 theoretical, algorithmic, and numerical exercises that deepen and enhance their understanding of the topics. The author offers several subjects not typically found in optimization books, such as optimality conditions in sparsity-constrained optimization, hidden convexity, and total least squares. The book also presents a large number of applications discussed both theoretically and algorithmically, including circle fitting, Chebyshev center, the Fermat–Weber problem, denoising, clustering, total least squares, and orthogonal regression. Theoretical and algorithmic topics are demonstrated with the MATLAB® toolbox CVX and a package of m-files posted on the book's website.
Contents
Chapter 1 | 1
Chapter 2 | 13
Chapter 3 | 37
Chapter 4 | 49
Chapter 5 | 83
Chapter 6 | 97
Chapter 7 | 117
Chapter 8 | 147
Chapter 9 | 169
Chapter 10 | 191
Chapter 11 | 207
Chapter 12 | 237
Back matter | 275