Tam Lê (TSE) – “Nonsmooth implicit differentiation for Machine Learning and Optimization”

November 3, 2021 @ 2:00 pm - 3:00 pm | Organizers: François-Pierre Paty, Martin Mugnier, Nicolas Schreuder, Ines Moutachaker

Statistics-Econometrics-Machine Learning Seminar.

Time: 2:00 pm – 3:00 pm
Date: November 3, 2021
Place: Room 3001 and online

Abstract: The Implicit Function Theorem (IFT) has found several applications in machine learning, such as hyperparameter optimization and the training of implicit neural networks. We will review the IFT and its extensions to nonsmooth functions. The IFT first ensures the uniqueness and regularity of the implicit function, and then provides a calculus, i.e., a formula to compute its derivative. Its extensions to the nonsmooth case still provide uniqueness and regularity, but no calculus. Recalling standard tools from nonsmooth analysis such as the Clarke Jacobian, we will explain with an example why they fail to generalize the implicit differentiation formula. We propose a solution to this issue using the recent notion of conservative Jacobian: a notion of Jacobian for nonsmooth functions that is compatible with the compositional rules of differential calculus, which justifies replacing derivatives with Clarke Jacobians in the implicit differentiation formula. We will see how this theory yields a convergence result for gradient descent implemented with implicit differentiation. We conclude the talk by showcasing pathological training trajectories that can arise when implicit differentiation is applied naively.
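To make the smooth calculus the abstract refers to concrete, below is a minimal JAX sketch of hyperparameter differentiation via the IFT. The inner objective f, the solver solve_inner, the outer loss, and the function hypergrad are illustrative choices, not taken from the talk: if x*(θ) solves F(x, θ) = ∇ₓf(x, θ) = 0, the IFT gives dx*/dθ = −(∂F/∂x)⁻¹ (∂F/∂θ).

```python
import jax
import jax.numpy as jnp

# Toy inner problem: x*(theta) = argmin_x f(x, theta), kept smooth so the
# classical IFT applies. All names here are illustrative, not from the talk.
def f(x, theta):
    return jnp.sum((x - 1.0) ** 2) + theta * jnp.sum(x ** 2)

# F(x, theta) = grad_x f(x, theta); the implicit map is defined by F = 0.
F = jax.grad(f, argnums=0)

def solve_inner(theta, steps=500, lr=0.1):
    # Plain gradient descent; any inner solver returning x*(theta) would do.
    x = jnp.zeros(2)
    for _ in range(steps):
        x = x - lr * F(x, theta)
    return x

def hypergrad(theta):
    x_star = solve_inner(theta)
    # IFT calculus: dx*/dtheta = -(dF/dx)^{-1} (dF/dtheta).
    H = jax.jacobian(F, argnums=0)(x_star, theta)  # Hessian of f in x
    B = jax.jacobian(F, argnums=1)(x_star, theta)  # cross derivative in theta
    dx_dtheta = -jnp.linalg.solve(H, B)
    # Outer loss L(x) = ||x||^2, differentiated through the implicit map.
    dL_dx = jax.grad(lambda x: jnp.sum(x ** 2))(x_star)
    return dL_dx @ dx_dtheta

print(hypergrad(0.5))  # closed form here: -4 / (1 + theta)^3 ≈ -1.185
```

In the nonsmooth setting the talk addresses, the Hessian H may fail to exist, and naively substituting a Clarke Jacobian into this formula is precisely the step that the conservative Jacobian framework is meant to justify.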

Sponsors:

CREST