King's College London

Research portal

Comparing dynamics: deep neural networks versus glassy systems

Research output: Contribution to journal › Article › peer-review

Marco Baity-Jesi, Giulio Biroli, Levent Sagun, Mario Geiger, Gerard Ben Arous, Chiara Cammarota, Yann LeCun, Stefano Spigler, Matthieu Wyart

Original language: English
Article number: 124013
Journal: Journal of Statistical Mechanics (JSTAT)
Issue number: 12
Early online date: 20 Dec 2020
Accepted/In press: 7 Jun 2019
E-pub ahead of print: 20 Dec 2020
Published: Dec 2020

Abstract


We analyze numerically the training dynamics of deep neural networks (DNNs) using methods developed in the statistical physics of glassy systems. We address two main issues: (1) the complexity of the loss landscape and of the dynamics within it, and (2) to what extent DNNs share similarities with glassy systems. Our findings, obtained for different architectures and datasets, suggest that during training the dynamics slow down because of an increasingly large number of flat directions. At large times, when the loss approaches zero, the system diffuses at the bottom of the landscape. Despite some similarities with the dynamics of mean-field glassy systems, in particular the absence of barrier crossing, we find distinctive dynamical behaviors in the two cases, showing that the statistical properties of the corresponding loss and energy landscapes differ. In contrast, when the network is under-parametrized we observe typical glassy behavior, suggesting the existence of different phases depending on whether the network is under- or over-parametrized.
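The kind of diagnostic the abstract alludes to can be illustrated with a minimal toy sketch: tracking a two-time mean-squared displacement of the weights, a standard glassy-dynamics observable, during noisy gradient descent on a quadratic loss whose Hessian has some near-zero eigenvalues (flat directions). Everything here, including the toy loss and the parameter choices, is an illustrative assumption and not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic loss 0.5 * w^T H w with a spread of curvatures;
# eigenvalues near zero play the role of the flat directions.
dim = 50
hessian = np.diag(rng.uniform(0.0, 1.0, dim))
lr, noise, steps = 0.1, 0.01, 2000

w = rng.normal(size=dim)
snapshots = []
for t in range(steps):
    grad = hessian @ w                                   # exact gradient of the toy loss
    w = w - lr * grad + noise * rng.normal(size=dim)     # SGD-like noisy update
    snapshots.append(w.copy())
snapshots = np.array(snapshots)

def msd(tw, dt):
    """Two-time mean-squared displacement Delta(tw + dt, tw)."""
    return np.mean((snapshots[tw + dt] - snapshots[tw]) ** 2)

# Along the near-flat directions the weights keep diffusing at the bottom of
# the landscape, so the displacement keeps growing with the time lag dt.
print(msd(1000, 10), msd(1000, 500))
```

In this toy setting the displacement grows with the time lag because the nearly flat modes perform an almost free random walk while the stiff modes equilibrate, loosely mirroring the "diffusion at the bottom of the landscape" picture described above.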

