Introduction to Glow

Let $X$ be a high-dimensional real random vector with an unknown distribution, denoted $X \sim p^*(X)$. We collect an independent and identically distributed dataset $D$ and choose a model $p_\theta(X)$ with parameters $\theta$. Assuming the data $X$ is discrete, maximizing the log-likelihood is equivalent to minimizing the following objective:
$$L(D) = \frac{1}{N}\sum_{i=1}^{N}\bigl(-\log p_\theta(X^{(i)})\bigr)$$
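As a minimal sketch of this objective, the snippet below computes the average negative log-likelihood over a dataset. The "model" here is a hypothetical placeholder (a standard-normal density standing in for $p_\theta$); any function returning $\log p_\theta(x)$ could be plugged in.

```python
import numpy as np

def avg_nll(log_prob, data):
    """L(D) = (1/N) * sum_i -log p_theta(x_i), averaged over the dataset."""
    return float(np.mean([-log_prob(x) for x in data]))

# Stand-in density (illustrative, not the Glow model): standard normal log-pdf.
def standard_normal_logpdf(x):
    x = np.asarray(x, dtype=float)
    return float(-0.5 * np.sum(x**2 + np.log(2 * np.pi)))

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 3))  # N = 100 i.i.d. samples of dimension 3
loss = avg_nll(standard_normal_logpdf, data)
```

Minimizing this quantity over $\theta$ is exactly maximum-likelihood training.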

For most flow-based generative models, the process can be defined as:
$$z \sim p_\theta(z)$$
$$x = g_\theta(z)$$
Here $z$ is the latent variable and $p_\theta(z)$ is a simple probability density, such as a spherical Gaussian: $p_\theta(z) = \mathcal{N}(z; 0, I)$.

The function $g_\theta$ is invertible (also called a bijection); given a data point $x$, the latent variable can be obtained via $z = f_\theta(x) = g_\theta^{-1}(x)$.
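A toy sketch of this pair of directions, assuming the simplest possible bijection (an elementwise affine map; real flow models use coupling layers and the like):

```python
import numpy as np

# Illustrative invertible transform: g_theta(z) = a*z + b with a != 0,
# whose inverse is f_theta(x) = g_theta^{-1}(x) = (x - b) / a.
a, b = 2.0, -1.0   # hypothetical parameters, stand-ins for theta

def g(z):          # generative direction: latent z -> data x
    return a * z + b

def f(x):          # inference direction: data x -> latent z
    return (x - b) / a

rng = np.random.default_rng(0)
z = rng.normal(size=5)        # z ~ N(0, I), the simple prior p_theta(z)
x = g(z)                      # sample "data" x = g_theta(z)
assert np.allclose(f(x), z)   # bijectivity: f(g(z)) == z
```

The key property is that the same parameters serve both sampling ($z \to x$) and inference ($x \to z$).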

For simplicity, we omit the subscript $\theta$ below.

We assume the function $f$ consists of a sequence of transformations: $f = f_1 \circ f_2 \circ \cdots \circ f_n$, so the relationship between $x$ and $z$ can be written as:
$$x \;\overset{f_1}{\longleftrightarrow}\; h_1 \;\overset{f_2}{\longleftrightarrow}\; h_2 \;\cdots\; \overset{f_n}{\longleftrightarrow}\; z$$

Such a sequence of invertible transformations is also called a (normalizing) flow.
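Composition preserves invertibility: to invert the whole flow, invert each step in reverse order. A minimal sketch with illustrative affine layers (real flows use more expressive invertible layers):

```python
import numpy as np

# Hypothetical flow: a chain of (scale, shift) affine layers, scale != 0.
layers = [(2.0, 0.5), (0.5, -1.0), (3.0, 2.0)]

def forward(x):
    """Map x -> z by applying the layers in sequence."""
    for a, b in layers:
        x = a * x + b
    return x

def inverse(z):
    """Map z -> x by inverting each layer, in reverse order."""
    for a, b in reversed(layers):
        z = (z - b) / a
    return z

x = np.array([0.3, -1.2, 4.0])
assert np.allclose(inverse(forward(x)), x)  # the composite is still a bijection
```

This is why flows are built from many simple invertible blocks: each block is easy to invert, and the composition remains exactly invertible.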

Source: blog.csdn.net/qq_40243750/article/details/129451178