- Replicable Learning of Large-Margin Halfspaces (arXiv)
Authors : Alkis Kalavasis, Amin Karbasi, Kasper Green Larsen, Grigoris Velegkas, Felix Zhou
Abstract : We provide efficient replicable algorithms for the problem of learning large-margin halfspaces. Our results improve upon the algorithms provided by Impagliazzo, Lei, Pitassi, and Sorrell [STOC, 2022]. We design the first dimension-independent replicable algorithms for this task that run in polynomial time, are proper, and have strictly improved sample complexity compared to the one achieved by Impagliazzo et al. [2022] with respect to all the relevant parameters. Moreover, our first algorithm has sample complexity that is optimal with respect to the accuracy parameter ε. We also design an SGD-based replicable algorithm that, in some parameter regimes, achieves better sample and time complexity than our first algorithm. Departing from the requirement of polynomial-time algorithms, using the DP-to-Replicability reduction of Bun, Gaboardi, Hopkins, Impagliazzo, Lei, Pitassi, Sorrell, and Sivakumar [STOC, 2023], we show how to obtain a replicable algorithm for large-margin halfspaces with improved sample complexity with respect to the margin parameter τ, but with running time doubly exponential in 1/τ² and worse sample complexity dependence on ε than one of our previous algorithms. We then design an improved algorithm with better sample complexity than all three of our previous algorithms and running time exponential in 1/τ².
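
The abstract mentions an SGD-based replicable algorithm. As a loose, illustrative sketch only (this is not the paper's algorithm, and the function name, grid size, learning rate, and rounding scheme below are all invented for illustration), the general replicability idea can be pictured as: run SGD on the hinge loss, then round the learned weights to a random grid whose offset comes from the algorithm's shared internal randomness, so that two runs on independent samples with the same seed tend to output the exact same halfspace.

```python
import numpy as np

def replicable_halfspace_sgd(X, y, seed, lr=0.1, epochs=20, grid=0.25):
    """Illustrative sketch (NOT the paper's algorithm): averaged SGD on the
    hinge loss, followed by randomized rounding of the averaged weights to a
    grid whose offset is drawn from the *shared* internal randomness `seed`.
    Reusing the same seed across runs on independent samples makes both runs
    likely to land in the same grid cell, which is the intuition behind
    replicability; all constants here are arbitrary choices for the demo."""
    rng = np.random.default_rng(seed)           # shared internal randomness
    n, d = X.shape
    w, w_avg = np.zeros(d), np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            if y[i] * (X[i] @ w) < 1.0:         # hinge-loss subgradient step
                w += lr * y[i] * X[i]
            w_avg += w
    w_avg /= (epochs * n)
    offset = rng.uniform(0.0, grid, size=d)     # shared random grid offset
    w_round = np.floor((w_avg - offset) / grid) * grid + offset
    norm = np.linalg.norm(w_round)
    return w_round / norm if norm > 0 else w_round

# Two runs on independent samples with the same internal seed should, with
# high probability, return identical hypotheses.
data_rng = np.random.default_rng(0)
w_star = np.array([1.0, -1.0]) / np.sqrt(2)

def sample(m):
    X = data_rng.normal(size=(m, 2))
    return X, np.sign(X @ w_star)

X1, y1 = sample(500)
X2, y2 = sample(500)
print(np.allclose(replicable_halfspace_sgd(X1, y1, seed=7),
                  replicable_halfspace_sgd(X2, y2, seed=7)))
```

This toy version is dimension-dependent and carries no sample-complexity guarantees; it is only meant to convey the shared-randomness rounding pattern that replicable algorithms in this line of work build on.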