Teach you how to build a GAN (generative adversarial network) model

This tutorial implements a simple GAN (generative adversarial network) model with Keras, based on the MnistGAN project; we will use the Keras Python library.

Python packages used:

  • keras: builds the neural network models
  • matplotlib: plotting
  • tensorflow: the backend engine that Keras runs on
  • tqdm: progress bars

Talk is cheap, show me the code.

1. Import packages:

import os
import numpy as np
import matplotlib.pyplot as plt
from tqdm import tqdm

from keras.layers import Input  # input layer  
from keras.models import Model, Sequential
from keras.layers.core import Dense, Dropout
from keras.layers.advanced_activations import LeakyReLU
from keras.datasets import mnist
from keras.optimizers import Adam
from keras import initializers
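
Note: the import paths above follow the old standalone Keras 2.x layout (keras.layers.core, keras.layers.advanced_activations). If your Keras ships inside TensorFlow instead, the equivalent imports look roughly like this (a sketch, assuming TensorFlow 2.x; note that newer versions spell the Adam argument learning_rate rather than lr):

# Equivalent imports for TensorFlow-bundled Keras (tf.keras).
# In tf.keras, Dense, Dropout and LeakyReLU all live directly under keras.layers.
from tensorflow.keras.layers import Input, Dense, Dropout, LeakyReLU
from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.datasets import mnist
from tensorflow.keras.optimizers import Adam
from tensorflow.keras import initializers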

2. Define several variables:

# Let Keras know that we are using tensorflow as our backend engine.
# Note: to take effect this must be set before Keras is imported, so in
# practice put it at the very top of the script (or set it in the shell).
os.environ["KERAS_BACKEND"] = "tensorflow"

# Fix the random seed so that we can reproduce the experiment and get the same results
np.random.seed(10)

# The dimension of our random noise vector
random_dim = 100
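
To make the role of random_dim concrete, here is a minimal sketch of how a batch of noise vectors of this dimension would be sampled for the generator later in training (batch_size here is an illustrative value, not defined by the tutorial at this point):

# Illustrative only: sample a batch of noise vectors for the generator.
batch_size = 128  # hypothetical batch size for this sketch
noise = np.random.normal(0, 1, size=[batch_size, random_dim])  # shape (128, 100)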

3. Prepare the data:

Here we use the handwritten digits (MNIST) dataset. MNIST has been thoroughly "chewed over": countless tutorials "take a stab" at it, and it has practically become the "model" dataset. The usual introduction goes:

The MNIST dataset comes from the National Institute of Standards and Technology (NIST). The training set consists of digits handwritten by 250 different people, of whom 50% were high school students and 50% were staff of the Census Bureau. The test set contains handwritten digit data in the same proportions.

The MNIST dataset is available at http://yann.lecun.com/exdb/mnist/ and comprises four parts:

  • Training set images (60,000 samples)
  • Training set labels (60,000 labels)
  • Test set images (10,000 samples)
  • Test set labels (10,000 labels)

Each image in MNIST consists of 28 × 28 pixels, and each pixel is represented by a grayscale value. The labels contain the target variable, i.e. the class label of the handwritten digit (an integer from 0 to 9).

[Figure: sample digits from the MNIST dataset]
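
If you want to peek at the data yourself, a minimal matplotlib sketch (assuming the imports from step 1) is:

# Quick look at the first few MNIST digits and their labels.
(x_vis, y_vis), _ = mnist.load_data()
for i in range(5):
    plt.subplot(1, 5, i + 1)
    plt.imshow(x_vis[i], cmap='gray')
    plt.title(y_vis[i])
    plt.axis('off')
plt.show()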

Code:

def load_minst_data():
    # mnist.load_data() ships with Keras and returns (images, labels)
    # tuples for the training and test sets
    (x_train, y_train), (x_test, y_test) = mnist.load_data()
    # normalize our inputs to be in the range [-1, 1]
    x_train = (x_train.astype(np.float32) - 127.5)/127.5
    # convert x_train with a shape of (60000, 28, 28) to (60000, 784)
    # so we have 784 columns per row
    x_train = x_train.reshape(60000, 784)
    return (x_train, y_train, x_test, y_test)
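
A quick sanity check of the function (a sketch; the shapes follow from the reshaping above, and note that only the training images are normalized and flattened):

# Sanity check: load the data and confirm the shapes.
x_train, y_train, x_test, y_test = load_minst_data()
print(x_train.shape)  # (60000, 784), pixel values scaled to [-1, 1]
print(x_test.shape)   # (10000, 28, 28), test images left untouched here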

4. Build the generator and the discriminator:

Now we build the generator and the discriminator, using the Adam optimizer to train both networks. Both the generator and the discriminator are neural networks with three hidden layers, using LeakyReLU as the activation function. There is one more trick: to improve the discriminator's robustness on data it has not seen, a Dropout layer is added after each of its hidden layers.

# You will use the Adam optimizer
def get_optimizer():
    return Adam(lr=0.0002, beta_1=0.5)

def get_generator(optimizer):
    generator = Sequential()
    generator.add(Dense(256, input_dim=random_dim, kernel_initializer=initializers.RandomNormal(stddev=0.02)))
    generator.add(LeakyReLU(0.2))

    generator.add(Dense(512))
    generator.add(LeakyReLU(0.2))

    generator.add(Dense(1024))
    generator.add(LeakyReLU(0.2))

    generator.add(Dense(784, activation='tanh'))
    generator.compile(loss='binary_crossentropy', optimizer=optimizer)
    return generator

def get_discriminator(optimizer):
    discriminator = Sequential()
    discriminator.add(Dense(1024, input_dim=784, kernel_initializer=initializers.RandomNormal(stddev=0.02)))
    discriminator.add(LeakyReLU(0.2))
    discriminator.add(Dropout(0.3))

    discriminator.add(Dense(512))
    discriminator.add(LeakyReLU(0.2))
    discriminator.add(Dropout(0.3))

    discriminator.add(Dense(256))
    discriminator.add(LeakyReLU(0.2))
    discriminator.add(Dropout(0.3))

    discriminator.add(Dense(1, activation='sigmoid'))
    discriminator.compile(loss='binary_crossentropy', optimizer=optimizer)
    return discriminator
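
The next step, covered in the full tutorial linked below, is to stack the two networks so that the generator can be trained through a frozen discriminator. As a preview, a minimal sketch of that combined model follows (the function name get_gan_network is illustrative, not necessarily the tutorial's exact code):

def get_gan_network(discriminator, random_dim, generator, optimizer):
    # Freeze the discriminator's weights while the combined model trains the generator
    discriminator.trainable = False
    gan_input = Input(shape=(random_dim,))  # noise vector goes in...
    x = generator(gan_input)                # ...the generator turns it into a fake image...
    gan_output = discriminator(x)           # ...and the discriminator scores it real/fake
    gan = Model(inputs=gan_input, outputs=gan_output)
    gan.compile(loss='binary_crossentropy', optimizer=optimizer)
    return gan

Because the discriminator was compiled before being frozen, it still updates when trained on its own batches; only the combined model treats it as fixed.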

To be continued. You can read more content at the link below:
Read more: https://www.datacamp.com/community/tutorials/generative-adversarial-networks

Origin: www.cnblogs.com/sonictl/p/12323843.html