Abstract
The Multiple-Source Adaptation (MSA) problem is that of finding a predictor with small error on a target domain while utilizing predictors trained on several source domains. In practice, the source domains typically exhibit different probability distributions over the input space, and these distributions are unknown to the learner; accurate density estimates are therefore essential for addressing the MSA problem effectively. To this end, variational inference is an attractive approach for approximating probability densities. Traditionally, it proceeds by maximizing a lower bound on the likelihood of the observed data (the evidence), i.e., the Evidence Lower BOund (ELBO). Recently, researchers have proposed optimizing the Variational Rényi (VR) bound instead of the ELBO; however, Monte Carlo (MC) estimates of the VR bound can be biased or difficult to compute due to high variance. To address these issues, we propose a new upper bound, the Variational Rényi Log Upper bound (VRLU), which, unlike existing VR upper bounds, retains the upper-bound property under MC approximation. Building on it, we introduce the Variational Rényi Sandwich (VRS) method, which jointly optimizes an upper and a lower bound, yielding a more accurate density estimate. We then apply the VRS density estimates to the MSA problem and show, both theoretically and empirically, that VRS estimators provide tighter error bounds and improved performance compared with leading MSA methods.
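To make the quantities mentioned above concrete, the following minimal PyTorch sketch shows the standard Monte Carlo estimate of the Variational Rényi (VR) bound, together with one simple way to obtain an upper-bound surrogate that survives MC estimation, via the inequality log u <= u - 1. The function names are illustrative, and the surrogate is only a reading of the idea described in this abstract; the exact VRLU and VRS objectives are those given in the paper and its accompanying repository (see Notes).

```python
import math
import torch

def vr_bound_mc(log_p_joint, log_q, alpha, eps=1e-6):
    # Monte Carlo estimate of the variational Renyi bound (Li & Turner, 2016):
    #   L_alpha = 1/(1-alpha) * log E_q[(p(x,z)/q(z|x))^(1-alpha)]
    # log_p_joint: (K,) tensor of log p(x, z_k) for samples z_k ~ q(z|x)
    # log_q:       (K,) tensor of log q(z_k | x) for the same samples
    # alpha -> 1 recovers the ELBO; for alpha < 0, L_alpha upper-bounds log p(x),
    # but this log-of-average estimator is biased and can lose that property.
    log_w = log_p_joint - log_q                        # log importance weights
    if abs(1.0 - alpha) < eps:                         # limit alpha -> 1: plain ELBO
        return log_w.mean()
    k = log_w.numel()
    log_mean = torch.logsumexp((1.0 - alpha) * log_w, dim=0) - math.log(k)
    return log_mean / (1.0 - alpha)

def vr_log_upper_mc(log_p_joint, log_q, alpha):
    # Illustrative "log upper bound" surrogate: since log(u) <= u - 1 and 1-alpha > 0,
    #   1/(1-alpha) * (E_q[w^(1-alpha)] - 1) >= L_alpha >= log p(x)   for alpha < 0,
    # and the plain MC average of w^(1-alpha) is unbiased, so the estimate remains an
    # upper bound in expectation (the property the abstract highlights for VRLU).
    assert alpha < 0.0, "the upper-bound regime assumes alpha < 0"
    log_w = log_p_joint - log_q
    w_pow = torch.exp((1.0 - alpha) * log_w)           # (p/q)^(1-alpha); may be numerically large
    return (w_pow.mean() - 1.0) / (1.0 - alpha)
```

In a VAE-style model, log_p_joint and log_q would come from the decoder and encoder, respectively; a VRS-style sandwich objective would then combine a lower-bound term (alpha in (0, 1], recovering the ELBO as alpha approaches 1) with such an upper-bound term (alpha < 0) to bracket log p(x).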
Notes
- 1.
A good predictor: a predictor that provides a small error with respect to the target domain.
- 2.
- 3.
- 4.
All experiments were conducted using PyTorch; the code is available at https://github.com/DanaOshri/Multiple-Source-Adaptation-using-Variational-R-nyi-Bound-Optimization.
Ethics declarations
Ethical Statement:
There are no ethical issues.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Zalman (Oshri), D., Fine, S. (2023). Multiple-Source Adaptation Using Variational Rényi Bound Optimization. In: Koutra, D., Plant, C., Gomez Rodriguez, M., Baralis, E., Bonchi, F. (eds.) Machine Learning and Knowledge Discovery in Databases: Research Track. ECML PKDD 2023. Lecture Notes in Computer Science, vol. 14173. Springer, Cham. https://doi.org/10.1007/978-3-031-43424-2_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-43423-5
Online ISBN: 978-3-031-43424-2