- An Information-Theoretic Framework for Out-of-Distribution Generalization (arXiv)
Author : Wenliang Liu, Guanding Yu, Lele Wang, Renjie Liao
Abstract : We study Out-of-Distribution (OOD) generalization in machine learning and propose a general framework that provides information-theoretic generalization bounds. Our framework interpolates freely between Integral Probability Metric (IPM) and f-divergence, which naturally recovers some known results (including Wasserstein- and KL-bounds), as well as yields new generalization bounds. Moreover, we show that our framework admits an optimal transport interpretation. When evaluated on two concrete examples, the proposed bounds either strictly improve upon existing bounds in some cases or recover the best among existing OOD generalization bounds.
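
For intuition, here is a hedged sketch of one standard way an IPM and an f-divergence can be interpolated via a variational formula; the function class Γ, the convex conjugate f*, and the shift variable ν are illustrative assumptions and may not match the paper's exact construction:

$$
D^{f}_{\Gamma}(P \,\|\, Q) \;=\; \sup_{g \in \Gamma} \Big\{ \mathbb{E}_{P}[g] \;-\; \inf_{\nu \in \mathbb{R}} \big( \nu + \mathbb{E}_{Q}\!\left[f^{*}(g - \nu)\right] \big) \Big\}.
$$

Taking Γ to be all bounded measurable functions recovers the f-divergence $D_f(P\|Q)$ (hence KL-type bounds), while choosing $f$ so that $f^{*}(y)=y$ collapses the inner infimum and yields the IPM $\sup_{g\in\Gamma}\{\mathbb{E}_{P}[g]-\mathbb{E}_{Q}[g]\}$ (hence Wasserstein-type bounds). An OOD generalization bound of the kind described above would then control the gap between source and target risks by such a quantity.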