Near-Optimal Bounds for Learning Gaussian Halfspaces with Random Classification Noise
Authors: Ilias Diakonikolas, Jelena Diakonikolas, Daniel M. Kane, Puqian Wang, Nikos Zarifis
Summary: We study the problem of learning general (i.e., not necessarily homogeneous) halfspaces with Random Classification Noise under the Gaussian distribution. We establish nearly-matching algorithmic and Statistical Query (SQ) lower bound results, revealing a surprising information-computation gap for this basic problem. Specifically, the sample complexity of this learning problem is Θ̃(d/ε), where d is the dimension and ε is the excess error. Our positive result is a computationally efficient learning algorithm with sample complexity Õ(d/ε + d/(max{p, ε})²), where p quantifies the bias of the target halfspace. On the lower bound side, we show that any efficient SQ algorithm (or low-degree test) for the problem requires sample complexity at least Ω(d^{1/2}/(max{p, ε})²). Our lower bound suggests that this quadratic dependence on 1/ε is inherent for efficient algorithms.
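To make the data model concrete, here is a minimal sketch of sampling from the setting the abstract describes: examples x drawn from a standard Gaussian, labeled by a general (biased, non-homogeneous) halfspace sign(w·x − t), with each label flipped independently with probability η (Random Classification Noise). The specific values of d, t, and η below are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 5000        # dimension and sample size (illustrative)
eta = 0.1              # RCN flip probability (illustrative)

# Target halfspace: sign(w.x - t); a nonzero threshold t makes it
# non-homogeneous, so Pr[label = +1] is biased away from 1/2.
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
t = 0.5

# Gaussian marginal over examples.
X = rng.standard_normal((n, d))
clean = np.sign(X @ w - t)

# Random Classification Noise: each label flipped independently w.p. eta.
flips = rng.random(n) < eta
y = np.where(flips, -clean, clean)

# The empirical flip rate and label bias reflect eta and the threshold t.
print(f"flip rate ~ {np.mean(y != clean):.3f}, Pr[clean = +1] ~ {np.mean(clean == 1):.3f}")
```

The bias parameter p in the sample-complexity bounds corresponds to how far Pr[sign(w·x − t) = +1] sits from 1/2; with t = 0 the halfspace is homogeneous and unbiased.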