(14) OPTIMIZATION: Conjugate Gradient Descent
Making Gradient Descent Converge Faster
7 min read · Mar 22, 2024
Conjugate Gradient Descent (CGD) is an extension of gradient descent designed to optimize quadratic functions efficiently. It relies on conjugate directions: search directions chosen so that, for a quadratic function, the method converges to the minimum in a finite number of steps (at most n steps for an n-dimensional problem, in exact arithmetic).
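To make this concrete, here is a minimal sketch of the method for a quadratic f(x) = ½xᵀAx − bᵀx with symmetric positive-definite A (minimizing f is equivalent to solving Ax = b). The function name and the small 2×2 example system are illustrative choices, not taken from the article:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive-definite A,
    i.e. solve A x = b, using conjugate search directions."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x            # residual = negative gradient at x
    d = r.copy()             # first direction is steepest descent
    rs_old = r @ r
    for _ in range(n):       # at most n steps in exact arithmetic
        Ad = A @ d
        alpha = rs_old / (d @ Ad)   # exact line search along d
        x = x + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs_old      # makes the next direction A-conjugate
        d = r + beta * d            # to all previous directions
        rs_old = rs_new
    return x

# Illustrative 2x2 SPD system: CG finds the minimizer in at most 2 steps.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

Note how the only difference from plain gradient descent is the `beta` correction term: instead of stepping along the raw negative gradient, each new direction is adjusted so it stays conjugate (A-orthogonal) to the previous ones, which is what eliminates the zig-zagging of ordinary gradient descent.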
Before we continue, a review of Linear Algebra is recommended. You can follow my articles, which will guide you through the basics: