
Event

Quentin Bertrand, Mila

Monday, September 19, 2022, 16:00 to 17:00

Title: Implicit Differentiation in Non-Smooth Convex Learning

Abstract: Finding the optimal hyperparameters of a model can be cast as a bilevel optimization problem, typically solved with zero-order techniques. In this work we study first-order methods when the inner optimization problem is convex but non-smooth. We show that the forward-mode differentiation of proximal gradient descent and proximal coordinate descent yields sequences of Jacobians converging toward the exact Jacobian. Using implicit differentiation, we show it is possible to leverage the non-smoothness of the inner problem to speed up the computation. Finally, we provide a bound on the error made on the hypergradient when the inner optimization problem is solved approximately. Results on regression and classification problems reveal computational benefits for hyperparameter optimization, especially when multiple hyperparameters are required.
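As a rough illustration of the first idea in the abstract (forward-mode differentiation of proximal gradient descent to obtain a hypergradient), here is a minimal Python sketch for a Lasso inner problem with a single regularization parameter lam. This is an assumed toy setup, not the speaker's implementation; the function names (ista_forward, hypergradient), the Lasso choice, and the synthetic data are all illustrative.

import numpy as np

def soft_threshold(u, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def ista_forward(X, y, lam, n_iter=500):
    # Proximal gradient descent (ISTA) on the Lasso, jointly propagating the
    # Jacobian J_k = d beta_k / d lam (forward-mode differentiation).
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the smooth part
    step = 1.0 / L
    beta = np.zeros(p)
    J = np.zeros(p)                    # d beta / d lam (lam is a scalar here)
    for _ in range(n_iter):
        u = beta - step * X.T @ (X @ beta - y)   # gradient step
        dU = J - step * (X.T @ (X @ J))          # its derivative w.r.t. lam
        thresh = step * lam
        support = np.abs(u) > thresh             # where the prox is differentiable
        # Differentiate soft-thresholding: d ST(u, t)/du = 1 on the support,
        # d ST(u, t)/dt = -sign(u) on the support, with t = step * lam.
        beta = soft_threshold(u, thresh)
        J = support * (dU - step * np.sign(u))
    return beta, J

def hypergradient(X_tr, y_tr, X_val, y_val, lam):
    # d/d lam of the outer loss 0.5 * ||y_val - X_val beta(lam)||^2, by chain rule.
    beta, J = ista_forward(X_tr, y_tr, lam)
    grad_beta = X_val.T @ (X_val @ beta - y_val)
    return J @ grad_beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_tr, X_val = rng.standard_normal((50, 20)), rng.standard_normal((30, 20))
    w = np.zeros(20); w[:3] = 1.0
    y_tr = X_tr @ w + 0.1 * rng.standard_normal(50)
    y_val = X_val @ w + 0.1 * rng.standard_normal(30)
    print(hypergradient(X_tr, y_tr, X_val, y_val, lam=0.5))

The iterates J_k converge to the exact Jacobian as beta_k converges; the talk's implicit-differentiation result instead exploits the sparsity of the limiting solution to compute that Jacobian more cheaply.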

For the Applied Mathematics Zoom Seminar

Please contact : damien.tageddine [at] mail.mcgill.ca
