cs.LG

KL Divergence Between Gaussians: A Step-by-Step Derivation for the Variational Autoencoder Objective

arXiv:2604.11744v1 Announce Type: new
Abstract: Kullback-Leibler (KL) divergence is a fundamental concept in information theory that quantifies the discrepancy between two probability distributions. In the context of Variational Autoencoders (VAEs), it appears as a regularization term in the training objective, and for Gaussian distributions it admits a closed-form expression, which this paper derives step by step.
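For reference, the textbook result the title points to is sketched below; this is the standard closed-form KL divergence between two multivariate Gaussians, together with its specialization to the diagonal-Gaussian posterior versus standard-normal prior case used in the VAE objective. It is a well-known identity, not quoted from the paper itself, and the symbols (mu_1, Sigma_1, mu_2, Sigma_2, d, mu_j, sigma_j) are introduced here for illustration.

% KL divergence between two d-dimensional Gaussians (standard result):
\[
  D_{\mathrm{KL}}\bigl(\mathcal{N}(\mu_1, \Sigma_1) \,\|\, \mathcal{N}(\mu_2, \Sigma_2)\bigr)
  = \frac{1}{2}\left[ \operatorname{tr}\!\left(\Sigma_2^{-1}\Sigma_1\right)
  + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1)
  - d + \ln\frac{\det\Sigma_2}{\det\Sigma_1} \right]
\]
% Specialized to the VAE case q(z|x) = N(mu, diag(sigma^2)) and p(z) = N(0, I),
% where the trace becomes the sum of sigma_j^2, the quadratic form becomes the
% sum of mu_j^2, and the log-determinant ratio becomes -sum of ln sigma_j^2:
\[
  D_{\mathrm{KL}}\bigl(q(z \mid x)\,\|\,p(z)\bigr)
  = \frac{1}{2}\sum_{j=1}^{d}\bigl(\mu_j^2 + \sigma_j^2 - \ln\sigma_j^2 - 1\bigr)
\]

The second expression is the KL term that enters, with a minus sign, the evidence lower bound that VAE training maximizes.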