Gradient Descent is the most basic algorithm for optimizing the cost function of a machine learning model, but it has some drawbacks:

- You have to choose the learning rate alpha, which controls the step size of the convergence process.
- It can require many iterations to converge.
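Both drawbacks show up in even a tiny sketch of gradient descent. The cost function and starting point below are hypothetical, chosen just to illustrate: alpha is hand-picked, and the loop runs dozens of iterations even for a simple quadratic.

```python
def gradient_descent(grad, theta0, alpha=0.1, tol=1e-8, max_iters=10_000):
    """Minimize a cost function given its gradient, using a fixed step size alpha."""
    theta = theta0
    for i in range(max_iters):
        step = alpha * grad(theta)  # step size depends on the hand-tuned alpha
        theta -= step
        if abs(step) < tol:         # stop once the updates become negligible
            return theta, i + 1
    return theta, max_iters

# Hypothetical cost J(theta) = theta^2, with gradient J'(theta) = 2*theta;
# the minimum is at theta = 0.
theta, iters = gradient_descent(lambda t: 2 * t, theta0=5.0)
print(theta, iters)  # theta near 0, after dozens of iterations
```

If alpha is too small, `iters` grows even larger; if it is too large, the updates overshoot and the loop can diverge instead of converging.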

There are some more advanced alternatives:

- Conjugate Gradient
- BFGS
- L-BFGS
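These methods pick their own step sizes via a line search, so there is no alpha to tune. As a sketch (assuming SciPy is available, and using a hypothetical quadratic cost with its minimum at (1, 2)), `scipy.optimize.minimize` exposes all three, with L-BFGS provided as the bounded variant `L-BFGS-B`:

```python
import numpy as np
from scipy.optimize import minimize

def cost(theta):
    # Hypothetical convex cost: a quadratic bowl with its minimum at (1, 2).
    return (theta[0] - 1.0) ** 2 + (theta[1] - 2.0) ** 2

def grad(theta):
    # Analytic gradient of the cost above.
    return np.array([2.0 * (theta[0] - 1.0), 2.0 * (theta[1] - 2.0)])

x0 = np.zeros(2)
for method in ("CG", "BFGS", "L-BFGS-B"):
    result = minimize(cost, x0, jac=grad, method=method)
    print(method, result.x)  # each converges to approximately [1, 2]
```

Note that you pass the cost and its gradient once, and the optimizer handles the step sizes and the stopping criterion internally.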
