A Log-likelihood Regularized KL Divergence for Video Prediction With a 3D Convolutional Variational Recurrent Network.
Haziq Razali, Basura Fernando. Published in: WACV (Workshops) (2021)
Keyphrases
- log likelihood
- kl divergence
- exponential family
- information theoretic
- bregman divergences
- recurrent networks
- maximum likelihood
- density estimation
- information theory
- kullback leibler divergence
- video sequences
- recurrent neural networks
- feed forward
- mutual information
- scoring function
- variational methods
- mahalanobis distance
- biologically inspired
- closed form
- computer vision
- neural network
- em algorithm
- image segmentation
- statistical models
- least squares
- optical flow
- prior knowledge
- information retrieval
- machine learning