HAX606X - Convex Optimization (2021-2024)

This is an undergraduate course (taught in French!) introducing standard techniques from convex optimization. Numerical elements are provided in Python. The code and exercise sheets were written with Joseph Salmon and Amélie Vernay. The main course is given by Fabien Marche.

\[f(x, y) = \frac{xy}{1+e^{x^2 - y^2}}\]
Surface plot of \(f\).
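
The surface above can be reproduced with a few lines of matplotlib; this is a minimal sketch (grid bounds and colormap are arbitrary choices), not the script used to generate the figure on this page.

```python
import numpy as np
import matplotlib.pyplot as plt

# Grid on which to evaluate f(x, y) = x * y / (1 + exp(x**2 - y**2))
x = np.linspace(-2, 2, 200)
y = np.linspace(-2, 2, 200)
X, Y = np.meshgrid(x, y)
Z = X * Y / (1 + np.exp(X**2 - Y**2))

# 3D surface plot
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(X, Y, Z, cmap="viridis")
ax.set_xlabel("x")
ax.set_ylabel("y")
plt.show()
```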

2023-2024

TP0: Installations

TP1: Introduction to Python

TP2: 1D optimization algorithms

TP3: Gradient descent and variants

TP4: Projected gradient descent

TP5: Automatic differentiation with PyTorch
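
As a tiny taste of TP5, here is a minimal sketch of reverse-mode automatic differentiation with PyTorch, applied to the function \(f\) displayed at the top of this page; the evaluation point is arbitrary, and the TP itself goes further.

```python
import torch

# Evaluate f(x, y) = x * y / (1 + exp(x**2 - y**2)) and its gradient at (1, 2).
x = torch.tensor(1.0, requires_grad=True)
y = torch.tensor(2.0, requires_grad=True)

f = x * y / (1 + torch.exp(x**2 - y**2))
f.backward()  # reverse-mode autodiff fills in x.grad and y.grad

print(x.grad, y.grad)  # values of df/dx and df/dy at (1, 2)
```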

Past exams

2022-2023


TP0: Installations

TP1: Introduction to Python

TP2: 1D optimization algorithms

TP3: Gradient descent method

TP4: Projected gradient descent

Past exams

2021-2022


TP1: Introduction to Python

  • handout: [pdf]
  • code: [py]

TP2: First 1D algorithms: bisection and golden-section search methods

  • handout: [pdf]
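
As a rough companion to TP2, the sketch below implements a golden-section search for minimizing a unimodal function on an interval; the test function, interval, and tolerance are illustrative, and the handout's own conventions (for instance the bisection variant applied to the derivative) may differ.

```python
import numpy as np


def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] via golden-section search."""
    invphi = (np.sqrt(5) - 1) / 2  # 1 / golden ratio, about 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b = d  # the minimizer lies in [a, d]
        else:
            a = c  # the minimizer lies in [c, b]
        c = b - invphi * (b - a)
        d = a + invphi * (b - a)
    return (a + b) / 2


# Example: minimize (x - 1)^2 on [-3, 5]; the minimizer is x = 1.
print(golden_section_search(lambda x: (x - 1) ** 2, -3.0, 5.0))
```
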
TP3: Gradient descent and coordinate descent

  • handout: [pdf]
  • widgets: [fonctions] [widget_level_set] [widget_convergence]
  • An up-to-date version of matplotlib is required to run the widgets; Numba and IPython are also used. This TP is the cornerstone of the course!
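
In the same spirit, here is a minimal sketch of constant-step gradient descent, the method studied in TP3; the quadratic objective, step size, and iteration count below are illustrative choices, not those of the handout.

```python
import numpy as np


def gradient_descent(grad, x0, step=0.1, n_iter=500):
    """Constant-step gradient descent; returns the array of iterates."""
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for _ in range(n_iter):
        x = x - step * grad(x)
        iterates.append(x.copy())
    return np.array(iterates)


# Example: f(x) = 0.5 * x^T A x with A symmetric positive definite,
# so grad f(x) = A x and the unique minimizer is x = 0.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
xs = gradient_descent(lambda x: A @ x, x0=[2.0, -1.5], step=0.2)
print(xs[-1])  # close to [0, 0]
```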

TP4: Projected gradient descent and application

  • handout: [pdf]
  • widgets: [fonctions] [widget_level_set] [widget_convergence] (same as TP3, but still relevant!)
  • dataset: [iowa_alcohol]
  • script with dataset: [alcohol_script]
  • The dataset available here is an already preprocessed subset of the original Iowa Liquor dataset (a link is given in the alcohol script file).
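
Finally, a minimal sketch of projected gradient descent in the spirit of TP4, using a simple box constraint as the feasible set; the objective and the projection below are illustrative only, whereas the actual TP works with the Iowa alcohol dataset linked above.

```python
import numpy as np


def projected_gradient_descent(grad, proj, x0, step=0.1, n_iter=500):
    """Projected gradient descent: gradient step followed by projection."""
    x = proj(np.asarray(x0, dtype=float))
    for _ in range(n_iter):
        x = proj(x - step * grad(x))
    return x


# Example: minimize 0.5 * ||x - c||^2 over the box [0, 1]^2.
# Here grad(x) = x - c and the projection clips each coordinate to [0, 1].
c = np.array([2.0, -0.5])
x_hat = projected_gradient_descent(
    grad=lambda x: x - c,
    proj=lambda x: np.clip(x, 0.0, 1.0),
    x0=[0.5, 0.5],
    step=0.5,
)
print(x_hat)  # expected: [1.0, 0.0], the projection of c onto the box
```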
