DIMACS/TRIPODS Workshop on Optimization and Machine Learning

Risk Bounds for Classification and Regression Models that Interpolate

August 13, 2018, 12:00 PM - 12:30 PM

Location:

Iacocca Hall

Lehigh University

Bethlehem PA

Daniel Hsu, Columbia University

Recent experiments with non-linear machine learning methods demonstrate the generalization ability of classification and regression models that interpolate noisy training data. It is difficult for existing generalization theory to explain these observations. On the other hand, there are classical examples of interpolating methods with non-trivial risk guarantees, the nearest neighbor rule being one of the most well-known and best-understood. I'll describe a few other such interpolating methods (old and new) with stronger risk guarantees compared to nearest neighbor in high dimensions.
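The nearest neighbor rule mentioned in the abstract is the simplest example of an interpolating classifier: it fits every training point exactly, even mislabeled ones, yet can still generalize. A minimal sketch with a hypothetical 1-D toy dataset (the data and helper function below are illustrative, not from the talk):

```python
# Sketch: the 1-nearest-neighbor rule interpolates its training data --
# it achieves zero training error even when some labels are noisy.

def nn_predict(train, x):
    """Return the label of the training point whose feature is closest to x."""
    nearest = min(train, key=lambda pt: abs(pt[0] - x))
    return nearest[1]

# Hypothetical noisy 1-D training set of (feature, label) pairs;
# the point at 0.5 can be thought of as mislabeled.
train = [(0.0, 0), (0.25, 0), (0.5, 1), (0.75, 1), (1.0, 1)]

# Every training point is predicted with its own (possibly noisy) label.
train_errors = sum(nn_predict(train, x) != y for x, y in train)
print(train_errors)  # → 0
```

The point of the talk is that such interpolation of noise need not ruin test-time risk, which classical capacity-based generalization theory does not readily explain.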

This is based on joint work with Misha Belkin (The Ohio State University) and Partha Mitra (Cold Spring Harbor Laboratory).