Article
Title: "An f-divergence Analysis of Generative Adversarial Network"
Authors: Mahmud Hasan, Hailin Sang
Pages: 451-472
DOI: 10.2478/fcds-2025-0018
Abstract:

We establish estimation bounds for several f-divergences, including the total variation, Kullback-Leibler (KL), Hellinger, and Pearson χ² divergences, for the GAN estimator. We first derive an inequality relating the empirical and population objective functions of the GAN model and obtain almost sure convergence rates. This inequality is then used to derive estimation bounds for the total variation, KL, Hellinger, and Pearson χ² divergences, expressed in terms of almost sure convergence rates and the difference between the expected outputs of the discriminator on real and generated data. Our results improve on some existing ones, which correspond to a specific case of the general objective function.
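For reference, the divergences named in the abstract are standard members of the f-divergence family; the following is a minimal sketch of the usual definitions and generator functions (the paper's exact normalizations may differ):

\[
D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{p(x)}{q(x)}\right) q(x)\, dx,
\]
with, for example,
\[
f(t) = \tfrac{1}{2}\lvert t-1\rvert \ (\text{total variation}), \quad
f(t) = t\log t \ (\text{KL}), \quad
f(t) = (\sqrt{t}-1)^2 \ (\text{squared Hellinger}), \quad
f(t) = (t-1)^2 \ (\text{Pearson } \chi^2).
\]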

Open access to full text at De Gruyter Online