This repository contains code implementations from a graduate-level course on optimization for machine learning. It focuses on first-order methods that are widely used in machine learning and data science:
- Gradient descent for linear regression
- Projected gradient descent
- Armijo line search for logistic regression (MATLAB implementation)
- Mirror descent
- Proximal gradient descent
- Accelerated gradient descent (heavy-ball method and Nesterov acceleration)
- ADMM (robust PCA for video object detection)
- SGD: sampling with and without replacement, averaged SGD, and momentum acceleration
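To give a flavor of the first item in the list, here is a minimal sketch of gradient descent for least-squares linear regression. The function name, step size, and iteration count are illustrative choices, not taken from the repository's code:

```python
import numpy as np

def gradient_descent_linreg(X, y, lr=0.1, n_iters=3000):
    """Minimize the least-squares loss (1/2n)||Xw - y||^2 by gradient descent.

    Illustrative sketch: a fixed step size `lr` is assumed; the course code
    may instead use a line search or a theoretically derived step size.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n  # gradient of the least-squares loss
        w -= lr * grad
    return w
```

On a small synthetic problem, the iterates converge to the least-squares solution at a linear rate governed by the condition number of `X.T @ X / n`.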
Faculty and students are welcome to use this code directly for reference or demonstration in class or homework.
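As a second illustration, the proximal gradient descent entry in the list above is commonly demonstrated on the lasso problem, where the proximal operator of the L1 norm is elementwise soft-thresholding. The sketch below (ISTA) uses illustrative function names and parameters that are assumptions, not the repository's own API:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam=0.1, lr=0.1, n_iters=2000):
    """Proximal gradient descent (ISTA) for the lasso:
        minimize (1/2n)||Xw - y||^2 + lam * ||w||_1

    Each iteration takes a gradient step on the smooth least-squares term,
    then applies the prox of the (scaled) L1 penalty. Step size `lr` is an
    assumed fixed value; a backtracking variant is also common.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n          # gradient of the smooth part
        w = soft_threshold(w - lr * grad, lr * lam)  # prox step
    return w
```

The soft-thresholding step is what produces exactly-zero coefficients, which plain gradient descent on a smoothed penalty would not.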