This talk gives an overview of distribution-free predictive inference via conformal prediction. Conformal prediction essentially acts as a wrapper around any prediction algorithm, whether for a regression (continuous outcome) or classification (discrete outcome) problem. It delivers prediction sets with finite-sample marginal validity under the assumption of i.i.d. data (training set and test point); beyond that, it imposes no assumptions whatsoever on the joint distribution of the features and the outcome variable. While the method itself dates back to Vovk and coauthors in the late 1990s, it has seen an explosion of interest in statistics and machine learning over the last five or so years. We will describe some recent advances in conformal prediction (including extensions beyond i.i.d. data streams) as well as ongoing challenges.
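
To make the "wrapper" idea concrete, here is a minimal sketch of split (inductive) conformal prediction for regression, one common variant of the method rather than anything specific to this talk; the simulated data, random forest base model, and miscoverage level alpha = 0.1 are illustrative assumptions. The guarantee is that, marginally over i.i.d. draws, the returned interval contains the test outcome with probability at least 1 - alpha.

```python
# A minimal sketch of split conformal prediction for regression.
# Illustrative assumptions: simulated (X, y), a random forest base model,
# and miscoverage level alpha = 0.1.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated data; the method only requires the training data and the
# test point to be i.i.d. (exchangeability suffices), nothing more.
X = rng.normal(size=(2000, 5))
y = X[:, 0] + np.sin(X[:, 1]) + rng.normal(size=2000)

# Split: fit the base model on one half, calibrate on the other.
X_fit, X_cal, y_fit, y_cal = train_test_split(
    X, y, test_size=0.5, random_state=0
)
model = RandomForestRegressor(random_state=0).fit(X_fit, y_fit)

# Nonconformity scores on the calibration set: absolute residuals.
scores = np.abs(y_cal - model.predict(X_cal))

# Conformal quantile: the ceil((n+1)(1-alpha))/n empirical quantile
# of the calibration scores.
alpha = 0.1
n = len(scores)
q_level = np.ceil((n + 1) * (1 - alpha)) / n
q_hat = np.quantile(scores, q_level, method="higher")

# Prediction interval for a new point: [f(x) - q_hat, f(x) + q_hat].
# Marginally, this covers the true outcome with probability >= 1 - alpha,
# with no assumptions on the joint distribution of (X, y) beyond i.i.d.
x_new = rng.normal(size=(1, 5))
pred = model.predict(x_new)[0]
print(f"90% prediction interval: [{pred - q_hat:.2f}, {pred + q_hat:.2f}]")
```

Note that the base model is treated as a black box: the coverage guarantee comes entirely from the calibration step, so a poor model yields wide (but still valid) intervals rather than invalid ones.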