Derivative-Free Optimization & Nonlinear Programming

How to optimize without derivatives, and how to handle nonlinear constraints via penalty and augmented Lagrangian methods.

Part 1

Finite Differences

Approximate gradients using function evaluations alone. Step-size selection, the V-shaped error curve (truncation vs. rounding error), and cost analysis.

Start here →
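As a preview, a central-difference gradient can be sketched in a few lines (the test function and step size below are illustrative choices, not from the course):

```python
import numpy as np

def fd_gradient(f, x, h=1e-5):
    """Central-difference gradient: 2 evaluations per coordinate, O(h^2) truncation error."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# Illustrative check: f(x) = x0^2 + 3*x1 has gradient (2*x0, 3).
g = fd_gradient(lambda x: x[0]**2 + 3 * x[1], [1.0, 2.0])
print(np.round(g, 6))  # → [2. 3.]
```

Note the cost: 2n evaluations per gradient for central differences, versus n+1 for forward differences at only O(h) accuracy — one face of the trade-off behind the V-shaped error curve.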
Part 2

Quadratic Models

Fit a local quadratic model by interpolation. Parameter counting, cost analysis, and the trust-region approach.

Explore →
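A one-dimensional sketch of the idea, with illustrative sample points: fit a quadratic through three function values and jump to the model's minimizer. (In n dimensions a full quadratic has (n+1)(n+2)/2 coefficients, which is where the parameter counting comes in.)

```python
import numpy as np

def quad_model_step(f, x0, h=0.5):
    """Fit q(x) = a + b*x + c*x^2 through f at {x0-h, x0, x0+h}; return q's minimizer."""
    xs = np.array([x0 - h, x0, x0 + h])
    V = np.vander(xs, 3, increasing=True)   # columns: 1, x, x^2
    a, b, c = np.linalg.solve(V, f(xs))     # interpolation conditions
    assert c > 0, "model must be convex to have a minimizer"
    return -b / (2 * c)

# f is itself quadratic here, so a single model step is exact:
xmin = quad_model_step(lambda x: (x - 3)**2 + 1, 0.0)
print(round(float(xmin), 6))  # → 3.0
```

On a non-quadratic f the model is only trusted locally — hence the trust-region safeguard on how far the step may go.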
Part 3

Direct Search

Coordinate search and pattern search methods. Fixed directions, stencils, sufficient decrease, and the Zoutendijk condition.

Explore →
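A minimal coordinate-search sketch (the test function and step schedule are illustrative): poll the fixed stencil of coordinate directions {±e_i}, accept any improving point, and halve the step whenever a full poll fails.

```python
import numpy as np

def coordinate_search(f, x, step=1.0, tol=1e-8, max_iter=10_000):
    """Poll the stencil {±e_i}; halve the step after an unsuccessful poll."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(x.size):
            for s in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += s * step
                ft = f(trial)
                if ft < fx:                 # simple decrease
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5                     # unsuccessful poll: refine the mesh
    return x

x = coordinate_search(lambda v: (v[0] - 1)**2 + 4 * (v[1] + 2)**2, [0.0, 0.0])
print(np.round(x, 4))  # → [ 1. -2.]
```

This accepts simple decrease; the convergence theory in this part uses a sufficient-decrease test (e.g. requiring f to drop by at least a multiple of step²) instead.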
Part 4

Nelder-Mead

The simplex "amoeba" method. Reflection, expansion, contraction, and shrink. Interactive demo on multiple test functions.

Try it →
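For experimenting beyond the demo, SciPy ships a Nelder-Mead implementation; the Rosenbrock test function and tolerances here are illustrative choices:

```python
from scipy.optimize import minimize

# Rosenbrock function: a curved banana-shaped valley with minimum at (1, 1).
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

res = minimize(rosen, x0=[-1.2, 1.0], method='Nelder-Mead',
               options={'xatol': 1e-8, 'fatol': 1e-8})
print(res.x)  # converges near [1, 1] using function values only
```

Every iterate comes from reflecting, expanding, contracting, or shrinking the simplex — no gradients are ever evaluated.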
Part 5

Nonlinear Programming

Constrained optimization setup: equality-constrained, inequality-constrained, and general problems. Why constraints can make problems NP-hard, and the Inception analogy.

Discover →
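The general problem this part sets up can be written in standard NLP notation (the symbols below are the usual conventions, not necessarily the course's exact ones):

```latex
\min_{x \in \mathbb{R}^n} f(x)
\quad \text{s.t.} \quad
c_i(x) = 0, \; i \in \mathcal{E},
\qquad
c_i(x) \ge 0, \; i \in \mathcal{I}
```

Equality-constrained and inequality-constrained problems are the special cases where $\mathcal{I}$ or $\mathcal{E}$ is empty.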
Part 6

Penalty & Augmented Lagrangian

Penalty methods, their ill-conditioning, the hanging net problem, and the augmented Lagrangian methods that fix those weaknesses.

Explore →
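A minimal sketch of the augmented-Lagrangian loop on a toy equality-constrained problem (the problem, penalty parameter, and inner solver are illustrative assumptions). The subproblems are solved derivative-free with Nelder-Mead, and the multiplier estimate is updated from the constraint residual:

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem (illustrative): min (x1-2)^2 + (x2-1)^2  s.t.  x1 + x2 = 1.
# Solution: x* = (1, 0) with multiplier lam* = 2.
f = lambda x: (x[0] - 2)**2 + (x[1] - 1)**2
c = lambda x: x[0] + x[1] - 1                  # equality constraint c(x) = 0

x, lam, mu = np.zeros(2), 0.0, 10.0            # moderate, *fixed* penalty parameter
for _ in range(20):
    L = lambda x: f(x) + lam * c(x) + 0.5 * mu * c(x)**2   # augmented Lagrangian
    x = minimize(L, x, method='Nelder-Mead',
                 options={'xatol': 1e-10, 'fatol': 1e-10}).x
    lam += mu * c(x)                           # first-order multiplier update
print(np.round(x, 4), round(float(lam), 4))    # x approaches (1, 0), lam approaches 2
```

With a plain quadratic penalty (lam frozen at 0), driving c(x) to zero requires mu → ∞ and the subproblems become ill-conditioned; the multiplier update is what lets a fixed, moderate mu reach the exact solution.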
CS 520 · Spring 2026