OPTIMIZATION: Conjugate Gradient Descent

Making Gradient Descent Converge Faster

Carla Martins

--

Conjugate Gradient Descent (CGD) is an extension of gradient descent designed to minimize quadratic functions efficiently. Instead of repeatedly stepping along the negative gradient, it builds a sequence of conjugate directions, that is, search directions that are mutually conjugate with respect to the quadratic's Hessian. Following these directions with exact line searches guarantees convergence to the minimum of an n-dimensional quadratic in at most n steps.
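To make this concrete, here is a minimal NumPy sketch (not taken from the original article) of the linear conjugate gradient method applied to the quadratic f(x) = ½xᵀAx − bᵀx, assuming A is symmetric positive definite; the function and variable names are illustrative choices of my own.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Minimize f(x) = 0.5 * x^T A x - b^T x for symmetric positive
    definite A, which is equivalent to solving A x = b."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    max_iter = n if max_iter is None else max_iter

    r = b - A @ x          # residual = negative gradient of f at x
    d = r.copy()           # first search direction: steepest descent
    rs_old = r @ r

    for _ in range(max_iter):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)   # exact step size along direction d
        x = x + alpha * d
        r = r - alpha * Ad          # update the residual
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # stop once the residual is small enough
            break
        beta = rs_new / rs_old      # coefficient that keeps directions conjugate
        d = r + beta * d            # new direction, A-conjugate to the previous ones
        rs_old = rs_new
    return x

if __name__ == "__main__":
    # Small illustrative 2x2 symmetric positive definite problem.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x = conjugate_gradient(A, b)
    print(x)                          # approximately [0.0909, 0.6364]
    print(np.allclose(A @ x, b))      # True
```

On this 2-dimensional quadratic the loop finishes in at most two iterations, which is exactly the finite-step guarantee that conjugate directions provide.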
