
Optimal Rates for Agnostic Distributed Learning



Abstract:

The existing optimal rates for distributed kernel ridge regression (DKRR) often rely on the strict assumption that the true concept belongs to the hypothesis space. However, agnostic distributed learning, where the target regression function may lie outside the kernel space, is more common in practice. In this paper, we refine the excess risk bounds for DKRR and demonstrate that DKRR still achieves capacity-dependent optimal rates in the agnostic setting. Our theoretical findings indicate that the condition on the number of partitions not only influences computational efficiency but also impacts the range of situations where optimal rates are applicable. To relax the strict condition on the number of partitions, we first derive a sharper estimate for the difference between the empirical and expected covariance operators. We then leverage additional unlabeled examples to reduce the label-independent error terms, further extending the optimal rates to more situations in the agnostic setting. In addition to the generalization error bounds in expectation, we also present refined excess risk bounds in high probability, where the optimal rates likewise extend to the agnostic setting. Finally, through both theoretical and empirical comparisons with related work, we demonstrate that our findings provide broader statistical applicability and computational advantages.
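The paper itself provides no code here, but the divide-and-conquer scheme behind DKRR is simple to sketch: split the training sample into m disjoint partitions, solve a kernel ridge regression on each partition locally, and average the m local predictors. The following is a minimal illustrative sketch under assumed choices (a Gaussian kernel, hypothetical helper names krr_fit/krr_predict/dkrr_predict, and an illustrative regularization parameter lam); it is not the paper's exact construction.

import numpy as np

def krr_fit(X, y, lam, gamma):
    """Kernel ridge regression on one partition with a Gaussian kernel.
    Solves (K + n*lam*I) alpha = y and returns the dual coefficients."""
    n = len(X)
    K = np.exp(-gamma * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2))
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return alpha

def krr_predict(alpha, X_train, X_test, gamma):
    """Evaluate a local KRR predictor on test inputs."""
    K = np.exp(-gamma * np.sum((X_test[:, None, :] - X_train[None, :, :]) ** 2, axis=2))
    return K @ alpha

def dkrr_predict(X, y, X_test, m, lam=1e-3, gamma=1.0):
    """Divide-and-conquer KRR: fit KRR on m disjoint partitions of the
    data and average the m local predictions on the test inputs."""
    rng = np.random.default_rng(0)
    parts = np.array_split(rng.permutation(len(X)), m)
    preds = []
    for idx in parts:
        alpha = krr_fit(X[idx], y[idx], lam, gamma)
        preds.append(krr_predict(alpha, X[idx], X_test, gamma))
    return np.mean(preds, axis=0)

Averaging keeps the communication cost at one vector of predictions per partition, while each local solve costs O((n/m)^3) instead of O(n^3); the paper's analysis concerns how large m may grow before this averaging degrades the excess risk, especially when the target lies outside the kernel space.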
Published in: IEEE Transactions on Information Theory ( Volume: 70, Issue: 4, April 2024)
Page(s): 2759 - 2778
Date of Publication: 18 December 2023

