Repository for homework for the course "Optimization Methods" at HSE University, spring 2023
The first laboratory work is devoted to optimization methods such as gradient descent and Newton's method. It implements the methods themselves, the necessary oracles, and a line search procedure for adaptive selection of the algorithm's step size (Armijo and Wolfe conditions). Experiments were also carried out; their description and results can be found in the report.
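As an illustration of the kind of method the first lab covers, below is a minimal sketch of gradient descent with Armijo backtracking line search. The `QuadraticOracle` class and the function signature are hypothetical examples for this README, not the repository's actual interfaces.

```python
import numpy as np

class QuadraticOracle:
    """Hypothetical oracle for f(x) = 0.5 x^T A x - b^T x (A symmetric PD)."""
    def __init__(self, A, b):
        self.A, self.b = A, b

    def func(self, x):
        return 0.5 * x @ self.A @ x - self.b @ x

    def grad(self, x):
        return self.A @ x - self.b

def gradient_descent(oracle, x0, tol=1e-8, max_iter=10000, c1=1e-4):
    """Gradient descent with Armijo backtracking (a sketch, not the repo's code)."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = oracle.grad(x)
        if g @ g < tol ** 2:   # stop when the gradient norm is small
            break
        alpha, fx = 1.0, oracle.func(x)
        # Armijo condition: f(x - a g) <= f(x) - c1 * a * ||g||^2;
        # halve the step until it holds.
        while oracle.func(x - alpha * g) > fx - c1 * alpha * (g @ g):
            alpha *= 0.5
        x = x - alpha * g
    return x
```

For the quadratic oracle above, the minimizer is the solution of `A x = b`, which gives a simple sanity check for the method.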
In the second laboratory work, the conjugate gradient method, the Truncated Newton (Hessian-Free Newton) method, and the L-BFGS method were implemented to solve nonlinear unconstrained optimization problems.
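The core of L-BFGS is the standard two-loop recursion, which computes the search direction from a bounded history of `(s, y)` pairs without ever forming the inverse Hessian approximation. The sketch below shows that recursion; it is a generic textbook version, not the repository's implementation.

```python
import numpy as np
from collections import deque

def lbfgs_direction(grad, history):
    """Two-loop recursion: return -H_k @ grad, where H_k is the implicit
    inverse-Hessian approximation built from the stored (s, y) pairs.

    `history` is a deque of (s_i, y_i) pairs, oldest first, with y_i @ s_i > 0.
    """
    q = grad.copy()
    alphas = []
    # First loop: newest pair to oldest.
    for s, y in reversed(history):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((a, rho))
    # Scale the initial Hessian approximation by the most recent pair.
    if history:
        s, y = history[-1]
        gamma = (s @ y) / (y @ y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest.
    for (a, rho), (s, y) in zip(reversed(alphas), history):
        b = rho * (y @ r)
        r += (a - b) * s
    return -r
```

With an empty history the recursion reduces to plain steepest descent, and whenever every stored pair satisfies the curvature condition `y @ s > 0`, the result is a descent direction.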
The third laboratory work examines methods for solving the composite optimization problem: the subgradient method, the proximal gradient method, and Nesterov's fast gradient method.