李宏毅 (Hung-yi Lee) Deep Learning Notes - Improved GAN

1. What is f-divergence?

《f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization》, NIPS, 2016

You can use any f-divergence!

P and Q are two distributions; p(x) and q(x) are the densities of sampling x under P and Q respectively.

$$D_f(P\|Q)=\int_x q(x)\,f\!\left(\frac{p(x)}{q(x)}\right)dx$$

where $f$ is convex and $f(1)=0$.

① If $f(x)=x\log x$, $D_f$ is the KL divergence:

$$D_f(P\|Q)=\int_x q(x)\,\frac{p(x)}{q(x)}\log\frac{p(x)}{q(x)}\,dx=\int_x p(x)\log\frac{p(x)}{q(x)}\,dx$$

② If $f(x)=-\log x$, $D_f$ is the reverse KL divergence:

$$D_f(P\|Q)=\int_x q(x)\left(-\log\frac{p(x)}{q(x)}\right)dx=\int_x q(x)\log\frac{q(x)}{p(x)}\,dx$$

③ If $f(x)=(x-1)^2$, $D_f$ is the chi-square divergence:

$$D_f(P\|Q)=\int_x q(x)\left(\frac{p(x)}{q(x)}-1\right)^2dx=\int_x \frac{(p(x)-q(x))^2}{q(x)}\,dx$$
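The three divergences above can be checked numerically by replacing the integral with a sum over a discrete domain. A minimal sketch (the example distributions `p`, `q` are illustrative, not from the lecture):

```python
import math

# Hypothetical discrete distributions P and Q over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x))  (discrete version of the integral)."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q))

kl     = f_divergence(p, q, lambda x: x * math.log(x))  # f(x) = x log x
rev_kl = f_divergence(p, q, lambda x: -math.log(x))     # f(x) = -log x
chi_sq = f_divergence(p, q, lambda x: (x - 1) ** 2)     # f(x) = (x-1)^2

# Cross-check KL against its direct form: sum_x p(x) log(p(x)/q(x))
kl_direct = sum(px * math.log(px / qx) for px, qx in zip(p, q))
assert abs(kl - kl_direct) < 1e-12
```

All three values are non-negative and vanish only when P = Q, as guaranteed by convexity of $f$ and $f(1)=0$.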

2. Fenchel Conjugate

Every convex $f$ has a Fenchel conjugate $f^*(t)=\max_{x\in\text{dom}(f)}\{xt-f(x)\}$: for a given $t$, take the $x$ that maximizes $xt-f(x)$.

Conversely, $f(x)=\max_t\{xt-f^*(t)\}$; replacing the max with a learned function $t(x)$ gives a lower bound, and the two are approximately equal when $t(x)$ is expressive enough.
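The conjugate can be checked numerically. A minimal sketch for $f(x)=x\log x$, whose conjugate has the known closed form $f^*(t)=e^{t-1}$ (the grid and helper names are illustrative):

```python
import math

def f(x):
    # f(x) = x log x, the KL generator; convex with f(1) = 0
    return x * math.log(x)

def fenchel_conjugate(t, xs):
    """f*(t) = max_x { x*t - f(x) }: grid search over x for a given t."""
    return max(x * t - f(x) for x in xs)

xs = [i / 1000 for i in range(1, 20000)]  # grid over x > 0

for t in [-1.0, 0.0, 0.5, 1.0]:
    approx = fenchel_conjugate(t, xs)
    exact = math.exp(t - 1)  # known closed form for f(x) = x log x
    assert abs(approx - exact) < 1e-3
```

Setting the derivative $t-\log x-1=0$ gives the maximizer $x=e^{t-1}$, and plugging it back in yields $f^*(t)=e^{t-1}$, which the grid search recovers.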

3. Connection with GAN

Substituting $f^*$ into the definition of $D_f$ gives the variational lower bound

$$D_f(P\|Q)\ge\max_D\left(E_{x\sim P}[D(x)]-E_{x\sim Q}[f^*(D(x))]\right),$$

which is exactly a GAN objective with discriminator $D$: different choices of $f$ yield different GAN variants.
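This substitution can be checked numerically: for the KL generator $f(x)=x\log x$ with $f^*(t)=e^{t-1}$, the bound is tight at the "discriminator" $T(x)=1+\log\frac{p(x)}{q(x)}$. A minimal discrete sketch (distributions and names are illustrative):

```python
import math

# Hypothetical discrete distributions; true KL divergence for reference.
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
true_kl = sum(px * math.log(px / qx) for px, qx in zip(p, q))

def f_star(t):
    # Fenchel conjugate of f(x) = x log x
    return math.exp(t - 1)

def lower_bound(T):
    """E_{x~P}[T(x)] - E_{x~Q}[f*(T(x))]: the bound f-GAN maximizes over T."""
    return (sum(px * T[i] for i, px in enumerate(p))
            - sum(qx * f_star(T[i]) for i, qx in enumerate(q)))

# The optimal discriminator T(x) = 1 + log(p(x)/q(x)) makes the bound tight...
T_opt = [1 + math.log(px / qx) for px, qx in zip(p, q)]
# ...while any other T (here a constant) undershoots the true divergence.
T_bad = [0.5, 0.5, 0.5]

assert abs(lower_bound(T_opt) - true_kl) < 1e-12
assert lower_bound(T_bad) < true_kl
```

In an actual f-GAN, $T$ is a neural network and the expectations are estimated from samples, so maximizing the bound over $T$ estimates the divergence the generator then minimizes.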



Reposted from blog.csdn.net/qq_26271435/article/details/78739057