BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CREST - ECPv5.1.3//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:CREST
X-ORIGINAL-URL:https://crest.science
X-WR-CALDESC:Events for CREST
BEGIN:VTIMEZONE
TZID:Europe/Helsinki
BEGIN:DAYLIGHT
TZOFFSETFROM:+0200
TZOFFSETTO:+0300
TZNAME:EEST
DTSTART:20220327T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0300
TZOFFSETTO:+0200
TZNAME:EET
DTSTART:20221030T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Helsinki:20221121T140000
DTEND;TZID=Europe/Helsinki:20221121T151500
DTSTAMP:20240329T001306
CREATED:20221116T074829Z
LAST-MODIFIED:20221116T074829Z
UID:14276-1669039200-1669043700@crest.science
SUMMARY:Fanny YANG (ETH Zurich) - "How the strength of the inductive bias affects the generalization performance of interpolators"
DESCRIPTION:Statistical Seminar: Every Monday at 2:00 pm.\nTime: 2:00 pm – 3:15 pm\nDate: 21st of November 2022\nPlace: salle 3001\nFanny YANG (ETH Zurich) - "How the strength of the inductive bias affects the generalization performance of interpolators"\nAbstract: Interpolating models have recently gained popularity in the statistical learning community due to common practices in modern machine learning: complex models achieve good generalization performance despite interpolating high-dimensional training data. In this talk\, we prove generalization bounds for high-dimensional linear models that interpolate noisy data generated by a sparse ground truth. In particular\, we first show that minimum-l1-norm interpolators achieve high-dimensional asymptotic consistency at a logarithmic rate. Further\, as opposed to the regularized or noiseless case\, for min-lp-norm interpolators with 1