Deep learning - Generative Adversarial Network (GAN) notes

A Generative Adversarial Network (GAN) consists of two important parts:

Generator (G): generates data (images in most cases) by machine; its purpose is to "fool" the discriminator.

Discriminator (D): judges whether an image is real or machine-generated; its purpose is to identify the "fake data" produced by the generator.
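Under illustrative assumptions, the two components can be sketched as tiny NumPy functions. Real GANs use deep networks for both; here the generator is a single linear layer and the discriminator a logistic score, just to make the roles concrete:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy generator: a single linear layer mapping 2-D noise to 2-D "samples".
W_g = rng.normal(size=(2, 2))

def generator(z):
    # z: (n, 2) noise vectors -> (n, 2) generated samples
    return z @ W_g

# Toy discriminator: a logistic score in (0, 1); 1 means "real", 0 "fake".
w_d = rng.normal(size=2)

def discriminator(x):
    return 1.0 / (1.0 + np.exp(-(x @ w_d)))

z = rng.normal(size=(4, 2))
fake = generator(z)
scores = discriminator(fake)
print(fake.shape, scores.shape)  # (4, 2) (4,)
```

The discriminator's output is a probability per sample, which is what lets both players be trained with the log-likelihood losses discussed below.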

How a GAN is trained:

Stage one: fix discriminator D, train generator G

Initialize discriminator D, then let generator G keep producing "fake data" and pass it to D for judgment.

At the start, generator G is still very weak, so its output is easily judged to be fake.

But with continuous training, generator G improves its skills and eventually fools discriminator D.

At that point, D is essentially in a guessing state: the probability that it judges a sample to be fake is 50%.

Stage two: fix generator G, train discriminator D

Once stage one is done, continuing to train generator G makes no sense. At that point we fix generator G and start training discriminator D.

Through continuous training, discriminator D improves its ability to discriminate and eventually can accurately identify all the fake images.

At this point, generator G can no longer fool discriminator D.

Cycle between stage one and stage two

Through this continuous cycle, the capabilities of both generator G and discriminator D keep growing.

Eventually we obtain a very good generator G, and we can use it to generate the images we want.
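The alternating two-stage loop above can be sketched on a 1-D toy problem. Everything here is an illustrative assumption, not the paper's setup: real data is N(4, 1), the generator G(z) = z + theta just shifts standard-normal noise, and the discriminator is a logistic function. Each iteration takes one gradient step for D and one for G:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D GAN (all constants are illustrative assumptions):
# real data ~ N(4, 1); generator G(z) = z + theta shifts N(0, 1) noise;
# discriminator D(x) = sigmoid(a*x + b).
theta = 0.0        # generator parameter
a, b = 1.0, 0.0    # discriminator parameters
lr = 0.05

def D(x, a, b):
    t = np.clip(a * x + b, -30.0, 30.0)  # clip logits for numerical safety
    return 1.0 / (1.0 + np.exp(-t))

for step in range(2000):
    real = rng.normal(4.0, 1.0, size=64)
    z = rng.normal(size=64)
    fake = z + theta

    # Stage-two flavour: one gradient step for D on
    # -[log D(real) + log(1 - D(fake))].
    dr, df = D(real, a, b), D(fake, a, b)
    a -= lr * (np.mean((dr - 1) * real) + np.mean(df * fake))
    b -= lr * (np.mean(dr - 1) + np.mean(df))

    # Stage-one flavour: one gradient step for G on -log D(G(z)).
    df = D(z + theta, a, b)
    theta -= lr * np.mean((df - 1) * a)

print(f"generator shift theta = {theta:.2f}")  # drifts toward the real mean, 4
```

In practice the two updates are interleaved every batch like this, rather than training one player to convergence before switching, but the two-stage description above is the right mental model for what each update is trying to achieve.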

How a GAN works, explained with the paper's figure and formulas

In the figure, the black dotted line is the real data distribution, the green dotted line is the generated data distribution (the output of generator G), and the blue dotted line is the discriminator's output D(x):

[Figure: the three curves described above, shown over several training stages]

Here P_data(x) (the real data distribution) can be regarded as fixed, while P_G(x) (the green dotted line) is the generated data distribution; over the training cycles it keeps approaching P_data(x), so when the discriminator can no longer tell them apart, its output D(x) tends to 0.5.

The relationship between x and z is the mapping from noise z to generated data, x = G(z).
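The claim that D settles at 0.5 can be checked numerically. For a fixed generator, the original paper shows the optimal discriminator is D*(x) = P_data(x) / (P_data(x) + P_G(x)); the two Gaussians below are illustrative stand-ins for the real and generated distributions:

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2) at x.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def d_star(x, mu_g):
    # Optimal discriminator for real data N(0, 1) vs generated data N(mu_g, 1).
    p_data = gauss_pdf(x, 0.0, 1.0)
    p_g = gauss_pdf(x, mu_g, 1.0)
    return p_data / (p_data + p_g)

print(round(d_star(0.0, 3.0), 3))  # 0.989 - generator far off, D is confident
print(round(d_star(0.0, 0.0), 3))  # 0.5   - P_G == P_data, D can only guess
```

As the generated distribution slides onto the real one, D*(x) collapses to 0.5 everywhere, which is exactly the "guessing state" described above.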

In the original Generative Adversarial Nets paper, the authors give the loss function:

    min_G max_D V(D, G) = E_{x ~ p_data(x)}[log D(x)] + E_{z ~ p_z(z)}[log(1 - D(G(z)))]

We can understand this formula as follows:

  The discriminator tries to distinguish generated data from real data as reliably as possible, while the generator tries to make the gap between generated data and real data as small as possible. Over the training cycle, the discriminator's ability to discriminate and the generator's ability to approach the real data distribution are both enhanced.
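One consequence worth checking with arithmetic: at the global optimum, P_G = P_data and D(x) = 0.5 everywhere, so both expectations in the value function become log(1/2), and V collapses to -log 4, its minimum over G:

```python
import math

# At the optimum D(x) = 0.5 on both terms of V(D, G):
v_opt = math.log(0.5) + math.log(0.5)
print(round(v_opt, 4))         # -1.3863
print(round(-math.log(4), 4))  # -1.3863, i.e. v_opt == -log 4
```

This -log 4 value is the paper's criterion for "training has reached the equilibrium where the generator matches the real distribution".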

Discriminator model D:

    maximize log D1(x) + log(1 - D2(G(z)))

Generator model G:

    minimize log(1 - D2(G(z)))

  D1(x) is the discriminator's score on real data; we want it to be 1, so that after taking the log the term becomes 0. D2(G(z)) is the score on generated data; the discriminator wants to recognize it as generated, so it expects D2(G(z)) to be 0. The generator, on the other hand, wants the discriminator not to detect that the data is generated, so the generator wants D2(G(z)) to be 1.
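Using the same notation (D1 for the score on real data, D2 for the score on generated data), the two losses can be written as small NumPy helpers. Note the generator loss below uses the non-saturating form -log D2(G(z)), which the paper recommends in practice instead of minimizing log(1 - D2(G(z))) directly:

```python
import numpy as np

def d_loss(d1_real, d2_fake):
    # Discriminator wants D1(x) -> 1 and D2(G(z)) -> 0:
    # minimize -[log D1(x) + log(1 - D2(G(z)))].
    return float(-(np.log(d1_real) + np.log(1.0 - d2_fake)).mean())

def g_loss(d2_fake):
    # Generator wants D2(G(z)) -> 1: minimize -log D2(G(z)).
    return float(-np.log(d2_fake).mean())

sharp_d = d_loss(np.array([0.9]), np.array([0.1]))  # D doing its job
guess_d = d_loss(np.array([0.5]), np.array([0.5]))  # D reduced to guessing
print(sharp_d < guess_d)  # True: a discriminating D has lower loss

fooled = g_loss(np.array([0.9]))  # D fooled -> small generator loss
caught = g_loss(np.array([0.1]))  # D not fooled -> large generator loss
print(fooled < caught)  # True
```

Both losses are just binary cross-entropy with opposite target labels on the generated samples, which is why deep-learning frameworks implement GAN training with their standard BCE loss.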

Advantages and disadvantages of generative adversarial networks

Advantages:

  1. Does not require large amounts of labeled data; the training loss comes from discriminator D's judgments
  2. Can generate large amounts of data for training, making it close to unsupervised learning
  3. Can be combined with deep neural networks

Disadvantages:

  1. Data is generated directly, without an explicit expression for the learned data distribution
  2. The generator and discriminator must be trained in tandem, and keeping them balanced is difficult
  3. Training is prone to failure (e.g. mode collapse or divergence)

The following is excerpted from the blog post http://xiaoqiang.me/?p=4592 , kept here for convenient reference in future work.

If you are interested in GAN algorithms, you can look at the "GAN zoo", which collects almost all of them. Here we picked 10 representative algorithms:

Algorithm    Paper            Code
GAN          paper address    code address
DCGAN        paper address    code address
CGAN         paper address    code address
CycleGAN     paper address    code address
CoGAN        paper address    code address
ProGAN       paper address    code address
WGAN         paper address    code address
SAGAN        paper address    code address
BigGAN       paper address    code address


Origin www.cnblogs.com/yang901112/p/11926178.html