How to optimize without derivatives, and how to handle nonlinear constraints via penalty and augmented Lagrangian methods.
Approximate gradients using function evaluations alone. Step size selection, the error V-shape, and cost analysis.
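As a taste of this first topic, here is a minimal forward-difference sketch (the test function, evaluation point, and step sizes are illustrative, not from the course). Each gradient estimate costs n + 1 function evaluations, and sweeping the step size h exposes the V-shape: truncation error shrinks with h while round-off error grows as h approaches machine precision.

```python
import numpy as np

def fd_gradient(f, x, h):
    """Forward-difference gradient estimate: g_i = (f(x + h*e_i) - f(x)) / h.
    Costs n + 1 function evaluations per gradient."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

# Toy objective with a known gradient, so the error can be measured.
f = lambda x: np.sin(x[0]) + x[1] ** 2
grad_true = lambda x: np.array([np.cos(x[0]), 2 * x[1]])

x0 = np.array([1.0, 0.5])
# Sweep h: error first falls (truncation ~ h), then rises again as
# round-off (~ machine_eps / h) takes over -- the V-shape.
for h in [1e-1, 1e-4, 1e-8, 1e-12]:
    err = np.linalg.norm(fd_gradient(f, x0, h) - grad_true(x0))
    print(f"h = {h:.0e}   error = {err:.2e}")
```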
Fit a local quadratic model by interpolation. Parameter counting, cost analysis, and the trust-region approach.
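A one-dimensional sketch of the interpolation idea (the test function and sample points are my own): a quadratic has three coefficients, so three function values determine it, and the model's minimizer supplies a trial step. A trust-region method would additionally cap how far from the samples that step may go before the model is re-fit.

```python
import numpy as np

def quadratic_step(f, pts):
    """Interpolate a*x^2 + b*x + c through three points (three coefficients,
    so three function values suffice) and return the model minimizer -b/(2a)."""
    xs = np.asarray(pts, dtype=float)
    A = np.vstack([xs ** 2, xs, np.ones(3)]).T   # Vandermonde-style system
    a, b, c = np.linalg.solve(A, np.array([f(x) for x in xs]))
    return -b / (2 * a)

f = lambda x: (x - 2.0) ** 2 + np.sin(x)   # smooth 1-D test function
x_star = quadratic_step(f, [0.0, 1.0, 3.0])
print(x_star)   # model step lands near the true minimizer (~2.35)
```

In n dimensions the same idea needs (n + 1)(n + 2)/2 interpolation points, which is where the parameter counting and cost analysis come in.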
Coordinate search and pattern search methods. Fixed directions, stencils, sufficient decrease, and the Zoutendijk condition.
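A minimal coordinate-search sketch, assuming the common sufficient-decrease test f(trial) < f(x) − step² and a step-halving rule when the whole stencil fails (the quadratic test function is illustrative):

```python
import numpy as np

def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Coordinate (compass) search: poll the 2n fixed directions +/- e_i.
    Accept a poll point only under sufficient decrease; if every direction
    fails, halve the step. Stop once the step drops below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(x.size):
            for s in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += s * step
                # sufficient decrease with forcing function rho(t) = t^2
                if f(trial) < f(x) - step ** 2:
                    x, improved = trial, True
        if not improved:
            step /= 2.0
    return x

f = lambda x: (x[0] - 1.0) ** 2 + 10 * (x[1] + 2.0) ** 2
x = coordinate_search(f, [5.0, 5.0])
print(x)   # close to the minimizer (1, -2)
```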
The simplex "amoeba" method. Reflection, expansion, contraction, and shrink. Interactive demo on multiple test functions.
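To experiment beyond the demo, SciPy ships a Nelder-Mead implementation; here it is on the Rosenbrock function (the starting point and tolerances are arbitrary choices, not from the course):

```python
from scipy.optimize import minimize, rosen

# Nelder-Mead keeps a simplex of n+1 points and updates it via reflection,
# expansion, contraction, and shrink steps -- no gradients required.
res = minimize(rosen, x0=[-1.2, 1.0], method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 5000})
print(res.x, res.nfev)   # res.x should be near the minimizer (1, 1)
```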
Constrained optimization setup: equality, inequality, and general problems. Why constraints make things NP-hard and the Inception analogy.
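The general problem these pieces refer to is usually written with an equality set and an inequality set of constraints:

```latex
\min_{x \in \mathbb{R}^n} f(x)
\quad \text{subject to} \quad
c_i(x) = 0,\; i \in \mathcal{E},
\qquad
c_j(x) \le 0,\; j \in \mathcal{I}.
```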
Penalty methods, ill-conditioning, the hanging net problem, and augmented Lagrangian methods that fix the weaknesses.
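A sketch of the augmented Lagrangian loop on a toy problem (the objective, constraint, penalty weight, and iteration counts are illustrative): minimize x₁ + x₂ on the unit circle, whose solution is (−√2/2, −√2/2) with multiplier 1/√2. Unlike a pure quadratic penalty, the multiplier absorbs the constraint force, so the penalty weight μ can stay moderate and the ill-conditioning of μ → ∞ is avoided. The inner solve uses SciPy's Nelder-Mead, keeping with the derivative-free theme.

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: x[0] + x[1]                     # objective
c = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0     # equality constraint c(x) = 0

def augmented_lagrangian(f, c, x, lam=0.0, mu=10.0, outer=15):
    """Repeat: minimize f + lam*c + (mu/2)*c^2 over x, then update the
    multiplier lam <- lam + mu * c(x). mu is held fixed throughout."""
    for _ in range(outer):
        phi = lambda x: f(x) + lam * c(x) + 0.5 * mu * c(x) ** 2
        x = minimize(phi, x, method="Nelder-Mead",
                     options={"xatol": 1e-10, "fatol": 1e-10}).x
        lam += mu * c(x)
    return x, lam

x, lam = augmented_lagrangian(f, c, x=np.array([-1.0, -1.0]))
print(x, lam)   # near (-0.707, -0.707), multiplier near 0.707
```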