Abstract:
|
Deep sequential generative models have been applied in various fields such as geoscience, materials informatics, and computer vision. This talk focuses on sequential variational auto-encoders (SVAEs) and variational inference (VI) methods for improving parameter learning. SVAEs extend variational auto-encoders to sequential structures. VI is a learning technique that maximizes the evidence lower bound (ELBO), a lower bound of the log marginal likelihood, in place of the intractable maximization of the likelihood itself. Some previous works combine VI with sequential Monte Carlo (SMC) to obtain a tighter lower bound and enhance parameter learning. These works have two drawbacks: low particle diversity and biased gradient estimates. Particle diversity refers to how well an ensemble of particles can represent the latent distribution. Biased gradient estimates steer learning in directions that differ from the correct ones. We therefore propose a new VI method combined with the ensemble Kalman filter to overcome these drawbacks. The proposed method outperforms the previous methods in predictive ability and particle diversity. Detailed experimental results will be shown on the day.