September 21
James Freitag, University of Illinois-Chicago
Model Theory And Machine Learning

Around 25 years ago, Laskowski noticed that the same combinatorial condition, the non-independence property (NIP), provides an important dividing line in both probably approximately correct (PAC) learning and model theory. In recent work, Hunter Chase and I use stability-theoretic dividing lines to characterize learnability in various other learning settings; for instance, a class is online learnable if and only if it is stable. In this talk, we will focus on query learning, where model-theoretic techniques yield both a characterization of learnability and new learning algorithms drawing on stability theory. We will assume basic familiarity with first-order logic, but will introduce all the notions we use from machine learning.