
Overcoming Posterior Collapse in Variational Autoencoders Via EM-Type Training



Abstract:

Variational autoencoders (VAEs) are among the most prominent deep generative models for learning the underlying statistical distribution of high-dimensional data. However, VAE training suffers from a severe issue called posterior collapse: the learned posterior distribution collapses to the assumed/pre-selected prior distribution. This issue limits the capacity of the learned posterior distribution to convey data information. Previous work proposed a heuristic training scheme to mitigate this issue, whose core idea is to train the encoder and the decoder in an alternating fashion. However, that scheme has so far lacked a theoretical interpretation, and this paper, for the first time, fills this gap by inspecting the scheme through the lens of the expectation maximization (EM) framework. Under this framework, we propose a novel EM-type training algorithm that yields a controllable optimization process and allows for further extensions, e.g., employing implicit distribution models. Experimental results corroborate the superior performance of the proposed EM-type VAE training algorithm across various metrics.
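The alternating encoder/decoder training described above can be caricatured as coordinate ascent on a shared objective, in the spirit of EM. The toy sketch below is purely illustrative and is not the paper's algorithm: `phi` stands in for the encoder parameters, `theta` for the decoder parameters, and the quadratic surrogate objective replaces the ELBO; the coupling term `-(phi - theta)**2` loosely plays the role of the fit/KL term tying the two updates together.

```python
# Toy EM-style alternating optimization (illustrative only, not the
# paper's actual EM-type VAE training algorithm). We maximize
#   L(phi, theta) = -(phi - theta)**2 - (theta - 1.0)**2
# by alternating an "E-step" over phi and an "M-step" over theta.

def e_step(phi, theta):
    """Maximize L over phi with theta held fixed (closed form here)."""
    return theta  # argmax_phi of -(phi - theta)^2

def m_step(phi, theta, lr=0.25):
    """One gradient-ascent step on L over theta with phi held fixed."""
    grad = 2.0 * (phi - theta) - 2.0 * (theta - 1.0)
    return theta + lr * grad

phi, theta = 5.0, -3.0
for _ in range(50):
    phi = e_step(phi, theta)    # encoder-analogue update
    theta = m_step(phi, theta)  # decoder-analogue update

print(round(phi, 6), round(theta, 6))  # -> 1.0 1.0 (the joint optimum)
```

Because each step maximizes (or ascends) the same objective with the other block of parameters frozen, the objective is monotonically non-decreasing, which is the controllability property the EM viewpoint provides; in the real algorithm the E-step fits the approximate posterior (encoder) and the M-step fits the generative model (decoder).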
Date of Conference: 04-10 June 2023
Date Added to IEEE Xplore: 05 May 2023
Conference Location: Rhodes Island, Greece
