Lorentz Center - Theoretical Foundations for Learning from Easy Data from 7 Nov 2016 through 11 Nov 2016

Description and aim

A plethora of conditions exist (such as margin conditions in classification, exp-concavity of the loss in sequence prediction, and perturbation robustness in clustering) under which learning becomes easier than in the worst case. This workshop investigates how reasonable such conditions really are, and aims to further develop algorithms that exploit easy situations while remaining close to worst-case optimal.
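To make the theme concrete, the sketch below illustrates (in Python, as an assumption of this note, not material from the workshop itself) one classical instance of this trade-off: the Hedge / exponential-weights forecaster with a time-varying learning rate. Its regret is O(sqrt(T ln K)) against any loss sequence, yet on "easy" data where one expert clearly dominates, its actual regret is far smaller than that worst-case bound.

```python
import math

def hedge(losses, n_experts):
    """Hedge (exponential weights) with learning rate
    eta_t = sqrt(ln(K) / t): a standard anytime schedule that keeps
    regret O(sqrt(T ln K)) in the worst case, while incurring much
    less loss when one expert is consistently best ("easy" data).

    losses: list of per-round loss vectors, one entry per expert,
            with losses in [0, 1].
    Returns (learner_loss, regret_to_best_expert)."""
    cum = [0.0] * n_experts          # cumulative loss of each expert
    total = 0.0                      # learner's cumulative loss
    for t, loss_vec in enumerate(losses, start=1):
        eta = math.sqrt(math.log(n_experts) / t)
        # exponential weights on past cumulative losses
        w = [math.exp(-eta * c) for c in cum]
        z = sum(w)
        p = [wi / z for wi in w]
        # expected loss of the randomized learner this round
        total += sum(pi * li for pi, li in zip(p, loss_vec))
        cum = [c + l for c, l in zip(cum, loss_vec)]
    return total, total - min(cum)

# "Easy" sequence: expert 0 always suffers loss 0, expert 1 loss 1.
easy = [[0.0, 1.0] for _ in range(100)]
total, regret = hedge(easy, 2)
```

On this easy sequence the regret stays well below the worst-case bound sqrt(T ln K) ≈ 8.3, because the weight on the bad expert decays exponentially; a genuinely adaptive algorithm in the workshop's sense would shrink this gap further while preserving the same worst-case guarantee.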


