ND4J automatic differentiation

1. Introduction

    ND4J has supported automatic differentiation since beta2. As of beta4, automatic differentiation runs only on the CPU backend; a GPU version is planned for subsequent releases.

    In this post, we will use ND4J's SameDiff to build a function, evaluate it at a point, and compute the partial derivative of the function with respect to each of its variables.

2. Build the function

    First, build the function and find its partial derivatives by hand. The function is:

    f(x, y) = x + y*x^2 + y

    Given the point (2, 3), the function value and partial derivatives are computed manually as follows:

    f = 2 + 3*4 + 3 = 17; the partial derivative of f with respect to x: 1 + 2*2*3 = 13; the partial derivative of f with respect to y: 4 + 1 = 5.
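    Written out symbolically (the same derivation as above, just in LaTeX for clarity):

\[
f(x, y) = x + y x^2 + y, \qquad
\frac{\partial f}{\partial x} = 1 + 2xy, \qquad
\frac{\partial f}{\partial y} = x^2 + 1
\]

\[
f(2, 3) = 2 + 3 \cdot 2^2 + 3 = 17, \qquad
1 + 2 \cdot 2 \cdot 3 = 13, \qquad
2^2 + 1 = 5
\]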

3. Computing the same values with ND4J automatic differentiation

    The full code:

package org.nd4j.samediff;

import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.factory.Nd4j;

/**
 * 
 * f(x, y) = x + y*x^2 + y
 *
 */
public class Function {

	public static void main(String[] args) {
		// Build a SameDiff instance
		SameDiff sd = SameDiff.create();
		// Create the variables x and y
		SDVariable x = sd.var("x");
		SDVariable y = sd.var("y");

		// Define the function: f = x + y*x^2
		SDVariable f = x.add(y.mul(sd.math().pow(x, 2)));
		// Add y to complete the function; the graph output is named "addY"
		f.add("addY", y);

		// Bind concrete values to the variables x and y
		x.setArray(Nd4j.create(new double[]{2}));
		y.setArray(Nd4j.create(new double[]{3}));
		// Forward pass: compute the value of the function
		System.out.println(sd.exec(null, "addY").get("addY"));
		// Backward pass: compute the gradients
		sd.execBackwards(null);
		// Print the derivative with respect to x at (2, 3)
		System.out.println(sd.getGradForVariable("x").getArr());
		// x.getGradient().getArr() is equivalent to sd.getGradForVariable("x").getArr()
		System.out.println(x.getGradient().getArr());
		// Print the derivative with respect to y at (2, 3)
		System.out.println(sd.getGradForVariable("y").getArr());
	}
}
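
    Before looking at the output, it is worth sanity-checking the gradients numerically. The snippet below is a minimal plain-Java cross-check of my own (the class name GradientCheck and the step size h are illustrative, not part of the original code); it approximates both partial derivatives at (2, 3) with central differences:

public class GradientCheck {

	// The same function the SameDiff graph defines: f(x, y) = x + y*x^2 + y
	static double f(double x, double y) {
		return x + y * x * x + y;
	}

	public static void main(String[] args) {
		double x = 2.0, y = 3.0, h = 1e-6;
		// Central difference: (f(v + h) - f(v - h)) / (2h)
		double dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h);
		double dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h);
		System.out.println("f(2, 3) = " + f(x, y)); // 17.0
		System.out.println("df/dx   = " + dfdx);    // ~13.0
		System.out.println("df/dy   = " + dfdy);    // ~5.0
	}
}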

4. Results

o.n.l.f.Nd4jBackend - Loaded [CpuBackend] backend
o.n.n.NativeOpsHolder - Number of threads used for NativeOps: 4
o.n.n.Nd4jBlas - Number of threads used for BLAS: 4
o.n.l.a.o.e.DefaultOpExecutioner - Backend used: [CPU]; OS: [Windows 10]
o.n.l.a.o.e.DefaultOpExecutioner - Cores: [8]; Memory: [3.2GB];
o.n.l.a.o.e.DefaultOpExecutioner - Blas vendor: [MKL]
17.0000
o.n.a.s.SameDiff - Inferring output "addY" as loss variable as none were previously set. Use SameDiff.setLossVariables() to override
13.0000
13.0000
5.0000

    The results 17, 13, and 5 exactly match the values computed by hand. (The derivative with respect to x appears twice because the code prints it in two equivalent ways.)

    Automatic differentiation hides many of the messy details of differentiation in deep learning. Deriving gradients by hand, especially for matrix expressions and matrix norms, is tedious and error-prone; with automatic differentiation, all kinds of network structures can be implemented easily.
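
    As a small illustration of the kind of matrix calculus this saves (my example, not from the original post): for the squared Frobenius norm,

\[
f(W) = \lVert W \rVert_F^2 = \sum_{i,j} w_{ij}^2,
\qquad
\frac{\partial f}{\partial W} = 2W,
\]

    a result autodiff produces without any hand derivation.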

 

Happiness comes from sharing.

    This post is original content by the author; please credit the source when reposting.
