
Multiple-Source Adaptation Using Variational Rényi Bound Optimization

Conference paper in: Machine Learning and Knowledge Discovery in Databases: Research Track (ECML PKDD 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14173)


Abstract

Multiple Source Adaptation (MSA) is the problem of identifying a predictor that minimizes the error on a target domain while utilizing the predictors from the source domains. In practice, the source domains typically exhibit varying probability distributions across the input space, and these distributions are unknown to the learner. Consequently, accurate probability estimates are essential for effectively addressing the MSA problem. To this end, variational inference is an attractive approach for approximating probability densities. Traditionally, this is done by maximizing a lower bound on the likelihood of the observed data (the evidence), i.e., maximizing the Evidence Lower BOund (ELBO). Recently, researchers have proposed optimizing the Variational Rényi (VR) bound instead of the ELBO; however, Monte Carlo estimates of the VR bound can be biased or difficult to approximate due to high variance. To address these issues, we propose a new upper bound called the Variational Rényi Log Upper bound (VRLU). Unlike existing VR bounds, the VRLU bound maintains the upper bound property under Monte Carlo (MC) approximation. Additionally, we introduce the Variational Rényi Sandwich (VRS) method, which jointly optimizes an upper and a lower bound, resulting in a more accurate density estimate. We then apply the VRS density estimate to the MSA problem. We show, both theoretically and empirically, that VRS estimators provide tighter error bounds and improved performance compared to leading MSA methods.
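To make the bounds discussed in the abstract concrete, here is a minimal, self-contained sketch (not the paper's implementation) of Monte Carlo estimation of the VR bound of Li and Turner for a toy conjugate Gaussian model, where the exact evidence is known in closed form. The model, the variational family q, and all parameter values are illustrative assumptions chosen for this sketch: α = 1 recovers the ELBO (a lower bound), α = 0 recovers the exact evidence in expectation, and negative α gives a bound that is an upper bound in expectation but whose naive MC estimate can fall below the true evidence — the failure mode VRLU is designed to avoid.

```python
import numpy as np

# Toy conjugate model (an illustrative assumption, not the paper's setup):
# prior z ~ N(0, 1), likelihood x | z ~ N(z, 1), observed x = 1.
# The exact evidence is p(x) = N(x; 0, 2), so every bound can be checked.
rng = np.random.default_rng(0)
x_obs = 1.0
log_px = -0.5 * np.log(2 * np.pi * 2.0) - x_obs**2 / (2 * 2.0)  # exact log evidence

def log_norm(v, mu, var):
    """Log-density of a univariate Gaussian N(mu, var) at v."""
    return -0.5 * np.log(2 * np.pi * var) - (v - mu) ** 2 / (2 * var)

# Variational family q(z) = N(0.4, 0.8), deliberately mismatched
# (the true posterior is N(0.5, 0.5)).
q_mu, q_var = 0.4, 0.8
K = 200_000
z = q_mu + np.sqrt(q_var) * rng.standard_normal(K)

# Importance log-weights: log w_k = log p(x, z_k) - log q(z_k)
log_w = log_norm(z, 0.0, 1.0) + log_norm(x_obs, z, 1.0) - log_norm(z, q_mu, q_var)

def vr_bound(alpha):
    """Monte Carlo estimate of the VR bound L_alpha (Li & Turner, 2016)."""
    if np.isclose(alpha, 1.0):            # alpha -> 1 recovers the ELBO
        return log_w.mean()
    a = (1.0 - alpha) * log_w
    m = a.max()                           # numerically stable log-mean-exp
    return (m + np.log(np.mean(np.exp(a - m)))) / (1.0 - alpha)

elbo = vr_bound(1.0)
print(f"log p(x)    = {log_px:.4f}")
print(f"ELBO (a=1)  = {elbo:.4f}  (lower bound)")
print(f"VR   (a=0)  = {vr_bound(0.0):.4f}  (exact evidence in expectation)")
print(f"VR   (a=-1) = {vr_bound(-1.0):.4f}  (upper bound in expectation)")
```

On a fixed sample set the estimates satisfy L(-1) ≥ L(0) ≥ ELBO by monotonicity of power means, illustrating how the VR family interpolates between lower and upper bounds; the abstract's point is that at negative α the MC estimate is not guaranteed to stay above log p(x), which is the property VRLU restores.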


Notes

  1. A good predictor: a predictor that provides a small error with respect to the target domain.

  2. https://github.com/DanaOshri/Multiple-Source-Adaptation-using-Variational-R-nyi-Bound-Optimization/blob/main/Appendix.pdf.

  3. Proofs for Corollaries 1, 2, and 3 are detailed in Appendix D.

  4. All experiments were conducted using PyTorch;
     https://github.com/DanaOshri/Multiple-Source-Adaptation-using-Variational-R-nyi-Bound-Optimization.


Author information

Corresponding author

Correspondence to Dana Zalman (Oshri).


Ethics declarations

Ethical Statement:

There are no ethical issues.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zalman (Oshri), D., Fine, S. (2023). Multiple-Source Adaptation Using Variational Rényi Bound Optimization. In: Koutra, D., Plant, C., Gomez Rodriguez, M., Baralis, E., Bonchi, F. (eds) Machine Learning and Knowledge Discovery in Databases: Research Track. ECML PKDD 2023. Lecture Notes in Computer Science(), vol 14173. Springer, Cham. https://doi.org/10.1007/978-3-031-43424-2_20


  • DOI: https://doi.org/10.1007/978-3-031-43424-2_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43423-5

  • Online ISBN: 978-3-031-43424-2

  • eBook Packages: Computer Science (R0)
