Machine Learning
Gradient Descent: take steps in the direction of the negative gradient. The learning rate (η) controls step size; momentum (β) smooths updates by accumulating a running average of past gradients, i.e. v ← βv + ∇f(x), then x ← x − η·v. When a custom loss function has no analytic gradient, approximate it with numerical differentiation (finite differences); see the sketch below.
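
A minimal sketch of gradient descent with momentum using a central-difference numerical gradient. The function names, hyperparameters, and the example quadratic objective are illustrative assumptions, not taken from the notes.

    def numerical_gradient(f, x, h=1e-5):
        """Central-difference approximation of the gradient of f at x."""
        grad = [0.0] * len(x)
        for i in range(len(x)):
            x_fwd = list(x); x_fwd[i] += h
            x_bwd = list(x); x_bwd[i] -= h
            grad[i] = (f(x_fwd) - f(x_bwd)) / (2 * h)
        return grad

    def gradient_descent(f, x0, lr=0.1, beta=0.9, steps=100):
        """Minimize f from x0; lr is the learning rate, beta the momentum."""
        x = list(x0)
        velocity = [0.0] * len(x)
        for _ in range(steps):
            grad = numerical_gradient(f, x)
            for i in range(len(x)):
                # v <- beta * v + grad(x);  x <- x - lr * v
                velocity[i] = beta * velocity[i] + grad[i]
                x[i] -= lr * velocity[i]
        return x

    # Usage: minimize a custom function with no analytic gradient.
    f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2
    print(gradient_descent(f, [0.0, 0.0]))  # approaches [3, -1]

Setting beta = 0 recovers plain gradient descent; larger beta carries more of the previous update into the current one, which damps oscillation along steep directions.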