Recent & Upcoming Talks

2020

How should we go about creating a science of deep learning? One might be tempted to focus on replicability, reproducibility, and …

The existence of adversarial examples, in which tiny changes in the input can fool well-trained neural networks, has many applications …
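
As background for this abstract: a minimal sketch of how such a tiny, loss-increasing perturbation can be constructed, here via the fast gradient sign method on a toy linear logistic model. This is a generic illustration of the phenomenon, not an attack discussed in the talk; all names and values are illustrative.

```python
# Minimal FGSM-style adversarial perturbation on a linear logistic model.
# Illustrative only; the abstract does not specify any particular attack.
import numpy as np

rng = np.random.default_rng(0)

# A "trained" linear classifier: p(y=1|x) = sigmoid(w @ x + b).
w = rng.normal(size=20)
b = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = rng.normal(size=20)   # a clean input
y = 1.0                   # its true label

# Gradient of the logistic loss with respect to the input is (p - y) * w.
p = sigmoid(w @ x + b)
grad_x = (p - y) * w

# FGSM: a small per-coordinate step that maximally increases the loss.
eps = 0.1
x_adv = x + eps * np.sign(grad_x)

# Confidence in the true label drops under the perturbation.
print("clean       p(y=1|x):", sigmoid(w @ x + b))
print("adversarial p(y=1|x):", sigmoid(w @ x_adv + b))
```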

2019

We examine gradient descent on unregularized logistic regression problems, with homogeneous linear predictors on linearly separable …
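
A minimal sketch of the setting this abstract describes, assuming the standard setup: gradient descent on unregularized logistic loss with a homogeneous linear predictor on linearly separable data. The iterate norm grows without bound while its direction stabilizes; this is a generic illustration, not code from the talk.

```python
# Gradient descent on unregularized logistic loss over separable data:
# ||w|| diverges while w / ||w|| settles toward a fixed direction.
import numpy as np

rng = np.random.default_rng(1)

# Separable 2D data: labels given by a ground-truth direction.
n = 100
X = rng.normal(size=(n, 2))
y = np.sign(X @ np.array([2.0, 1.0]))   # +/-1 labels, linearly separable

w = np.zeros(2)
lr = 0.5
for t in range(1, 20001):
    margins = y * (X @ w)
    # Gradient of the mean logistic loss log(1 + exp(-margin)).
    coef = -y / (1.0 + np.exp(margins))
    grad = (coef[:, None] * X).mean(axis=0)
    w -= lr * grad
    if t in (100, 1000, 20000):
        direction = w / np.linalg.norm(w)
        print(f"t={t:6d}  ||w||={np.linalg.norm(w):7.3f}  direction={direction}")
```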

This talk will survey the role played by margins in optimization, generalization, and representation of neural networks. A specific …
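
For readers unfamiliar with the term, a standard definition of the normalized margin for a homogeneous predictor is sketched below; this is the usual convention in this literature, not necessarily the exact quantity used in the talk.

```latex
% Normalized margin of a predictor f(x; w) on data (x_i, y_i), y_i in {-1, +1}.
% For an L-homogeneous network, dividing by ||w||^L makes the quantity
% scale-invariant. (Standard definition; the talk may use a variant.)
\[
  \bar{\gamma}(w) \;=\; \min_{1 \le i \le n} \frac{y_i \, f(x_i; w)}{\lVert w \rVert^{L}}
\]
```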

Deep Learning has had phenomenal empirical successes in many domains, including computer vision, natural language processing, and speech …

Classical theory that guides the design of nonparametric prediction methods like deep neural networks involves a tradeoff between the …
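
The tradeoff this abstract refers to is usually stated via the bias-variance decomposition of squared error; a standard statement is given below as background, not taken from the talk itself.

```latex
% Bias-variance decomposition for squared error, with y = f(x) + noise,
% E[noise] = 0, Var(noise) = sigma^2, and \hat{f} fit on a random sample S.
\[
  \mathbb{E}_{S,\varepsilon}\!\left[\bigl(y - \hat{f}(x)\bigr)^{2}\right]
  \;=\;
  \underbrace{\bigl(f(x) - \mathbb{E}_{S}[\hat{f}(x)]\bigr)^{2}}_{\text{bias}^2}
  \;+\;
  \underbrace{\mathbb{E}_{S}\!\left[\bigl(\hat{f}(x) - \mathbb{E}_{S}[\hat{f}(x)]\bigr)^{2}\right]}_{\text{variance}}
  \;+\;
  \underbrace{\sigma^{2}}_{\text{noise}}
\]
```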

Much recent theoretical work has concentrated on “solving deep learning”. Yet deep learning is not a thing in itself, and …

Inductive biases from specific training algorithms like stochastic gradient descent play a crucial role in learning overparameterized …
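
A small, self-contained illustration of such an inductive bias, using a standard example rather than anything specific to this talk: gradient descent on underdetermined least squares, initialized at zero, converges to the minimum-norm interpolating solution.

```python
# Implicit bias of gradient descent on an overparameterized problem:
# with zero initialization, the iterates stay in the row space of A, so GD
# converges to the minimum-norm interpolant (checked via the pseudoinverse).
# Standard illustration, not specific to the talk.
import numpy as np

rng = np.random.default_rng(2)

n, d = 20, 100                  # more parameters than examples
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

w = np.zeros(d)
lr = 0.5 / (np.linalg.norm(A, ord=2) ** 2)   # step size below 2/L
for _ in range(10000):
    w -= lr * A.T @ (A @ w - b)

w_min_norm = np.linalg.pinv(A) @ b
print("residual:", np.linalg.norm(A @ w - b))
print("distance to min-norm solution:", np.linalg.norm(w - w_min_norm))
```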

Machine learning has made tremendous progress over the last decade. It’s thus tempting to believe that ML techniques are a …

Algorithms in deep learning have a regularization effect: different optimizers with different hyperparameters, on the same training …
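
A toy sketch of the phenomenon this abstract names, and not the talk's actual experiment: two optimizers driving the same separable logistic regression problem to near-zero training loss end in measurably different solution directions. Adam is implemented inline with its standard update; all data and hyperparameter choices here are illustrative.

```python
# Same data, same model, two optimizers: both reach ~zero training loss,
# but the resulting solution directions differ. (Illustrative sketch.)
import numpy as np

rng = np.random.default_rng(3)
n, d = 50, 10
X = rng.normal(size=(n, d))
y = np.sign(X @ rng.normal(size=d))          # +/-1 labels, separable

def grad(w):
    m = y * (X @ w)
    # sigmoid(-m) computed stably via tanh to avoid overflow at large margins.
    coef = -y * 0.5 * (1.0 - np.tanh(m / 2.0))
    return (coef[:, None] * X).mean(axis=0)

def loss(w):
    return np.logaddexp(0.0, -y * (X @ w)).mean()

# Plain gradient descent.
w_gd = np.zeros(d)
for _ in range(20000):
    w_gd -= 0.5 * grad(w_gd)

# Adam (standard update, implemented inline).
w_ad, m, v = np.zeros(d), np.zeros(d), np.zeros(d)
b1, b2, eps, lr = 0.9, 0.999, 1e-8, 0.01
for t in range(1, 20001):
    g = grad(w_ad)
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    mh, vh = m / (1 - b1 ** t), v / (1 - b2 ** t)
    w_ad -= lr * mh / (np.sqrt(vh) + eps)

u_gd = w_gd / np.linalg.norm(w_gd)
u_ad = w_ad / np.linalg.norm(w_ad)
print("train losses:", loss(w_gd), loss(w_ad))
print("cosine similarity of solution directions:", u_gd @ u_ad)
```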