
(14) OPTIMIZATION: Conjugate Gradient Descent

Making Gradient Descent Converge Faster

Carla Martins
7 min read · Mar 22, 2024

Conjugate Gradient Descent (CGD) is an extension of gradient descent that optimizes quadratic functions efficiently. It leverages conjugate directions, a set of search directions that guarantee convergence to the minimum in a finite number of steps (at most n steps for an n-dimensional quadratic function, in exact arithmetic).
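To make the idea concrete, here is a minimal sketch in Python/NumPy of the conjugate gradient iteration applied to a quadratic f(x) = ½xᵀAx − bᵀx with a symmetric positive-definite matrix A (minimizing this f is equivalent to solving Ax = b). The function name, the example matrix, and the tolerance are illustrative choices, not taken from the article:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Minimize f(x) = 0.5 * x^T A x - b^T x for symmetric positive-definite A.
    In exact arithmetic this converges in at most n steps for an n x n system."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x              # residual = negative gradient of f at x
    d = r.copy()               # first direction is the steepest-descent direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)   # exact line search along direction d
        x += alpha * d
        r -= alpha * Ad             # update the residual
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs_old      # coefficient that makes the new direction
        d = r + beta * d            # A-conjugate to all previous directions
        rs_old = rs_new
    return x

# Tiny usage example on a 2x2 symmetric positive-definite system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # approx. [0.0909, 0.6364]
```

Because each new direction is conjugate (A-orthogonal) to the previous ones, the minimization along one direction is never undone by the next step, which is what allows the method to finish in at most n steps on a quadratic.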

Before we continue, a review of Linear Algebra is recommended. You can follow my articles that will guide you through the basics:
