Several Python automatic differentiation libraries

A brief introduction to several Python automatic differentiation tools: tangent, autograd, and sympy.
  Various machine learning and deep learning frameworks all include automatic differentiation. Broadly, there are four kinds of differentiation: manual differentiation, numerical differentiation, symbolic differentiation, and automatic differentiation. Below, each of these differentiation frameworks is introduced with a simple ("hello world" style) example.
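
Of the four methods, numerical differentiation is the only one not demonstrated later in this post; for comparison, here is a minimal finite-difference sketch (the helper numeric_diff is made up purely for illustration):

 def numeric_diff(f, x, h=1e-6):
     # central-difference approximation of f'(x); accuracy depends on the step h
     return (f(x + h) - f(x - h)) / (2 * h)

 print(numeric_diff(lambda x: x ** 2, 3.0))  # ~6.0; the exact derivative 2*x at x=3 is 6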

sympy is a powerful scientific computing library that uses symbolic differentiation: it obtains a derivative by generating a symbolic expression. The derivative it produces is not necessarily in its simplest form, and when the function is more complex the generated expression tree can become very complex.
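
As a small illustration of that point (a sketch; the expression below is chosen only for demonstration), the derivative sympy returns can be fairly bulky, and simplify() does not always reduce it to the nicest possible form:

 from sympy import symbols, diff, exp, sin, simplify

 x = symbols('x')
 expr = exp(sin(x)) / (1 + x**2)
 d = diff(expr, x)
 print(d)            # a bulky quotient-rule expression
 print(simplify(d))  # shorter, but still not necessarily the form you would write by hand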

autograd first applies symbolic differentiation to the basic operators, substitutes numerical values for the intermediate results, and in this way applies it to the entire function. It essentially performs automatic differentiation on a computation graph, which makes many optimizations easy, and this approach is widely used in various machine learning and deep learning frameworks.
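
Because autograd differentiates the operations that were actually executed for a given input rather than a fixed formula, ordinary Python control flow is handled naturally; a minimal sketch (the function f below is made up for illustration):

 from autograd import grad

 def f(x):
     y = x
     for _ in range(3):
         y = y * x  # after the loop, y = x**4
     return y

 df = grad(f)
 print(df(2.0))  # 4*x**3 at x=2 -> 32.0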

tangent is a source-to-source automatic differentiation framework: to compute the derivative of a function f it generates a new function f_grad that computes the derivative, which is different from all current automatic differentiation frameworks. Because the derivative is computed by a newly generated function, it is very readable and debuggable, which the authors say is the essential difference from existing automatic differentiation frameworks.

sympy differentiation

 from sympy import symbols, diff

 def grad():
     # declare the symbolic variables used in the expression
     x, y = symbols('x y')
     # define the expression
     z = x**2 + y**2
     # return the partial derivative of z with respect to y
     return diff(z, y)

 func = grad()

Printing func outputs the derivative expression of z: z' = 2*y

 print(func) 

Substituting y = 3 evaluates the derivative to 6

 print(func.evalf(subs={'y': 3}))
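
If the derivative needs to be evaluated many times, the symbolic expression can also be turned into a plain numeric function with lambdify (a small sketch; the symbol y is declared again here because the one above was local to grad()):

 from sympy import symbols, lambdify

 y = symbols('y')
 grad_fn = lambdify(y, func)  # compile the symbolic derivative 2*y into a numeric function
 print(grad_fn(3))            # 6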

autograd partial derivatives

 import autograd.numpy as np
 from autograd import grad

 # expression: f(x, y) = x^2 + 3xy + y^2
 # df/dx = 2x + 3y
 # df/dy = 3x + 2y
 # at x=2, y=1: df/dx = 7, df/dy = 8
 def fun(x, y):
     z = x**2 + 3*x*y + y**2
     return z

 # grad differentiates with respect to the first argument (x) by default
 fun_grad = grad(fun)
 fun_grad(2., 1.)

Output: 7.0
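
grad can also differentiate with respect to the second argument by passing the argument index via its argnum parameter; a brief sketch using the same fun as above:

 # partial derivative with respect to y (argument index 1)
 dfun_dy = grad(fun, argnum=1)
 print(dfun_dy(2., 1.))  # df/dy = 3x + 2y = 8.0 at x=2, y=1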

tangent differentiation

 import tangent
 def fun(x, y):
     z = x**2 + y**2
     return z

By default, the partial derivative of z is taken with respect to x

 df = tangent.grad(fun)

The following call outputs 8: z' = 2*x, so the result is the same no matter what value is passed for y

 df(4, y=1)

The wrt parameter specifies which argument to differentiate with respect to; below we take the partial derivative of z with respect to y

 df = tangent.grad(fun, wrt=(1,))

The following call outputs 10: z' = 2*y, so the result is the same no matter what value is passed for x

 df(x=0, y=5)

None of the above shows tangent's core feature: source-to-source transformation.

If we add the parameter verbose=1 when generating the derivative function, we can see the derivative function tangent generates for us. The default is 0, which is why so far tangent's differentiation has not felt any different from other automatic differentiation frameworks.

 def df(x):
     z = x**2
     return z

 df = tangent.grad(df, verbose=1)
 df(x=2)

After the code runs, we can see the derivative function that tangent generated for us:

 def ddfdx(x, bz=1.0):
     z = x ** 2
     assert tangent.shapes_match(z, bz), 'Shape mismatch between return value (%s) and seed derivative (%s)' % (numpy.shape(z), numpy.shape(bz))

     # Grad of: z = x ** 2
     _bx = 2 * x * bz
     bx = _bx
     return bx

ddfdx is the generated derivative function; from it we can read off the derivative expression z' = 2*x directly, and tangent computes the derivative by executing this generated function.

For sympy, automatic differentiation is just one of its many powerful features; as its name suggests, autograd was born for automatic differentiation; tangent is still fledgling, released by Google at the end of 2017 and relatively new. Since v0.1.8 came out in 2017 there have been no further releases and the source code has not been actively updated. sympy and autograd are more mature; tangent remains to be seen.


Source: www.cnblogs.com/softlin/p/11427390.html