Theano Basics

How to install Theano

Install via Anaconda:

conda install theano

 

What should you pay attention to during installation?

A Python environment below version 3.6 is recommended; after installation on newer versions, Theano may not work properly.

If you installed into an environment created with conda create and ran into problems, see this page I wrote earlier: https://www.cnblogs.com/hardykay/p/12611460.html

What are symbolic variables?

1. Built-in variable types

Theano currently supports seven built-in variable types: scalar, vector, row (row vector), col (column vector), matrix, tensor3, and tensor4. Collectively, these variables are all called tensors: a 0-order tensor is a scalar, a first-order tensor is a vector, a second-order tensor is a matrix, and so on.
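As a quick illustration, each of these built-in types has its own constructor in theano.tensor:

import theano.tensor as T

s = T.scalar('s')     # 0-order tensor (scalar)
v = T.vector('v')     # 1st-order tensor (vector)
r = T.row('r')        # matrix with a single broadcastable row
c = T.col('c')        # matrix with a single broadcastable column
m = T.matrix('m')     # 2nd-order tensor (matrix)
t3 = T.tensor3('t3')  # 3rd-order tensor
t4 = T.tensor4('t4')  # 4th-order tensor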

Let's look at a simple equation: y = 2x + b

In the equation above, y, 2, x, and b are all symbolic variables. The equation can be written as the general expression y = w * x + b, where w is the weight, b is the bias, x is the independent variable, and y is the dependent variable. The following code implements this expression.

import theano
from theano import tensor as T

# initialize the symbolic scalar variables
x = T.scalar(name='input', dtype='float32')
w = T.scalar(name='weight', dtype='float32')
b = T.scalar(name='bias', dtype='float32')
z = w*x + b
# compile the expression into a callable function
net_input = theano.function(inputs=[w, x, b], outputs=z)
# execute the function
print('net_input: %.2f' % net_input(2.0, 3.0, 0.5))
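Running this prints net_input: 6.50, since 2.0 * 3.0 + 0.5 = 6.5.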

 

2. Custom variable types

Custom variable types are created in Theano with TensorType:

import theano
from theano import tensor as T
mytype = T.TensorType('float64', broadcastable=(), name=None, sparse_grad=False)
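A TensorType object is callable: calling it creates a symbolic variable of that type. A minimal usage sketch (the name myvar is just for illustration):

# instantiate a 0-dimensional float64 variable of the custom type
# (broadcastable=() means the type has no dimensions)
myvar = mytype('myvar')
print(type(myvar), myvar.dtype)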

How to design a symbolic computation graph

Symbolic computation builds a mathematical model of the data processing; the resulting structure is also known as a computation graph.

 

import theano
import numpy as np
import theano.tensor as T


x = T.dmatrix('x')
y = T.dmatrix('y')
z = x + y
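The graph above only defines the computation; to actually run it, compile it with theano.function and pass in concrete arrays (a minimal sketch reusing x, y, and z from above):

f = theano.function([x, y], z)
print(f(np.ones((2, 2)), 2 * np.ones((2, 2))))  # prints a 2x2 matrix of 3.0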

The function() function

The function definition format:

def function(inputs, outputs=None, mode=None, updates=None, givens=None,
             no_default_updates=False, accept_inplace=False, name=None,
             rebuild_strict=True, allow_input_downcast=None, profile=None,
             on_unused_input=None)

 

Common parameters: inputs (the independent variables), outputs (the dependent variables), updates (updates to shared variables, typically neural-network parameters), and givens (substitutions of values for symbolic variables).
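The givens parameter is easy to overlook; here is a minimal sketch of how it substitutes a constant for a symbolic variable at compile time (the names x, y, and z are illustrative, and the constant assumes the default floatX of float64):

import theano
import theano.tensor as T

x = T.scalar('x')
y = T.scalar('y')
z = x * y
# givens replaces y with a constant when the function is compiled,
# so y does not need to be passed as an input
f = theano.function([x], z, givens={y: T.constant(10.0)})
print(f(2))  # 20.0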

Multiple independent variables, multiple dependent variables

import theano
import theano.tensor as T

x, y = T.fscalars('x', 'y')
z1 = x + y
z2 = x * y

# x and y are the independent variables; z1 and z2 are the dependent variables
f = theano.function([x, y], [z1, z2])

print(f(2, 3))
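The call f(2, 3) returns both dependent variables at once: 5.0 (the sum) and 6.0 (the product).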

 

Automatic differentiation

import theano
import theano.tensor as T

# define a float scalar variable x
x = T.fscalar('x')
# define the variable y as the sigmoid of x
y = 1 / (1 + T.exp(-x))
# take the derivative of y with respect to x
dx = theano.grad(y, x)
# define the function f: input x, output dx
f = theano.function([x], dx)

print(f(3))
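Since the derivative of the sigmoid y = 1 / (1 + exp(-x)) is y(1 - y), f(3) prints approximately 0.0452.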

 

Updating shared variables (using the updates parameter)

import theano
import theano.tensor as T

# define a shared variable w with initial value 1
w = theano.shared(1)
x = T.iscalar('x')
# define a function with input x and output w; after each call,
# w is updated to w + x
f = theano.function([x], w, updates=[[w, w + x]])
print(f(3))
print(w.get_value())
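Note that the function returns the value of w from before the update, because updates are applied after the outputs are computed: the first print shows 1, and w.get_value() afterwards shows 4.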

 

Logistic regression: using updates

import theano
import numpy as np
import theano.tensor as T

rng = np.random

# for testing, generate 10 samples; each sample is a 3-dimensional
# feature vector with a binary label, used for training
N = 10
feats = 3
D = (rng.randn(N, feats).astype(np.float32),
     rng.randint(size=N, low=0, high=2).astype(np.float32))

# declare the independent variable x and the label y of each sample
x = T.matrix('x')
y = T.vector('y')

# randomly initialize the parameter w and set b = 0; both are shared variables
w = theano.shared(rng.randn(feats), name='w')
b = theano.shared(0., name='b')

# construct the cost function
p_1 = 1 / (1 + T.exp(-T.dot(x, w) - b))            # sigmoid activation
xent = -y * T.log(p_1) - (1 - y) * T.log(1 - p_1)  # cross-entropy cost
cost = xent.mean() + 0.01 * (w ** 2).sum()
# mean cost + L2 regularization term to prevent over-fitting,
# with a weight-decay coefficient of 0.01
gw, gb = T.grad(cost, [w, b])
# partial derivatives of the cost function with respect to the parameters
prediction = p_1 > 0.5  # predict 1 when p_1 is greater than 0.5, otherwise 0
train = theano.function(inputs=[x, y], outputs=[prediction, xent],
                        updates=((w, w - 0.1 * gw), (b, b - 0.1 * gb)))
predict = theano.function(inputs=[x], outputs=prediction)

# training loop
training_steps = 1000
for i in range(training_steps):
    pred, err = train(D[0], D[1])
    print(err.mean())
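After training, the compiled predict function can be used to check how well the model fits the training data (a small usage sketch):

# compare the learned predictions with the training labels
print('target:    ', D[1])
print('prediction:', predict(D[0]))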

Conditional statements

In Theano, ifelse and switch can be used to express conditional judgments.

switch

Format: switch(cond, ift, iff)

switch evaluates both ift and iff, regardless of the condition.

ifelse

Format: if cond then ift else iff

ifelse evaluates only one of ift and iff.
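A minimal sketch contrasting the two, using the standard theano.ifelse module (the variable names are illustrative):

import theano
import theano.tensor as T
from theano.ifelse import ifelse

a, b = T.scalars('a', 'b')

# switch builds an element-wise expression; both branches are computed
z_switch = T.switch(T.lt(a, b), a, b)
# ifelse is lazy: only the branch selected by the condition is computed
z_lazy = ifelse(T.lt(a, b), a, b)

f_switch = theano.function([a, b], z_switch)
f_lazy = theano.function([a, b], z_lazy)

print(f_switch(1.0, 2.0))  # 1.0
print(f_lazy(1.0, 2.0))    # 1.0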

 

Magical shared variables

... to be continued
