Scaling Limits of Neural Networks

PACM Colloquium
Sep 23, 2024
4:30 - 5:30 pm
214 Fine Hall
Abstract: Neural networks are often studied analytically through scaling limits: regimes in which structural network parameters such as depth, width, and number of training datapoints are taken to infinity, yielding simplified models of learning. I will survey several such approaches with the goal of illustrating the rich and still not fully understood space of possible behaviors when some or all of the network's structural parameters are large.