ND4J automatic differentiation

I. Preface

ND4J has supported automatic differentiation since beta2, but as of beta4 it is only available on the CPU backend; a GPU version is planned for later releases.

In this post, we will use ND4J SameDiff to build a function, evaluate it at a point, and compute the partial derivative of the function with respect to each of its variables.

II. Build function

Constructing the function and computing the partial derivatives by hand

The function is f(x, y) = x + y*x^2 + y.

Take the point (2, 3) and compute the function value and partial derivatives by hand:

f = 2 + 3*4 + 3 = 17; df/dx: 1 + 2*2*3 = 13; df/dy: 4 + 1 = 5
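Written out symbolically, the differentiation step behind those numbers is:

\begin{aligned}
f(x, y) &= x + y x^{2} + y \\
\frac{\partial f}{\partial x} &= 1 + 2xy = 1 + 2 \cdot 2 \cdot 3 = 13 \\
\frac{\partial f}{\partial y} &= x^{2} + 1 = 2^{2} + 1 = 5
\end{aligned}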

III. Finding it with ND4J automatic differentiation

Complete code

package org.nd4j.samediff;

import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.factory.Nd4j;

/**
 * f(x, y) = x + y*x^2 + y
 */
public class Function {

	public static void main(String[] args) {
		//Build a SameDiff instance
		SameDiff sd = SameDiff.create();
		//Create variables x and y
		SDVariable x = sd.var("x");
		SDVariable y = sd.var("y");

		//Define the function: first x + y*x^2 ...
		SDVariable f = x.add(y.mul(sd.math().pow(x, 2)));
		//... then add y; naming the result "addY" lets us fetch it by name below
		f.add("addY", y);

		//Bind concrete values to the variables x and y
		x.setArray(Nd4j.create(new double[]{2}));
		y.setArray(Nd4j.create(new double[]{3}));
		//Forward pass: compute the function value
		System.out.println(sd.exec(null, "addY").get("addY"));
		//Backward pass: compute the gradients
		sd.execBackwards(null);
		//Print the derivative with respect to x at (2,3)
		System.out.println(sd.getGradForVariable("x").getArr());
		//x.getGradient().getArr() and sd.getGradForVariable("x").getArr() are equivalent
		System.out.println(x.getGradient().getArr());
		//Print the derivative with respect to y at (2,3)
		System.out.println(sd.getGradForVariable("y").getArr());
	}
}

IV. Operation results

o.n.l.f.Nd4jBackend - Loaded [CpuBackend] backend
o.n.n.NativeOpsHolder - Number of threads used for NativeOps: 4
o.n.n.Nd4jBlas - Number of threads used for BLAS: 4
o.n.l.a.o.e.DefaultOpExecutioner - Backend used: [CPU]; OS: [Windows 10]
o.n.l.a.o.e.DefaultOpExecutioner - Cores: [8]; Memory: [3.2GB];
o.n.l.a.o.e.DefaultOpExecutioner - Blas vendor: [MKL]
17.0000
o.n.a.s.SameDiff - Inferring output "addY" as loss variable as none were previously set. Use SameDiff.setLossVariables() to override
13.0000
13.0000
5.0000

The results 17, 13, and 5 match the manual calculation exactly.
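Since the computation graph is defined independently of the concrete input values, the same SameDiff instance can be reused at another point. The following is a minimal sketch that would continue inside the main method above; it assumes the beta-era API shown here allows simply re-binding the arrays and re-running the forward and backward passes (the point (1, 2) is just an arbitrary example):

		//Re-bind the variables at a new point, e.g. (1, 2): f = 1 + 2*1 + 2 = 5
		x.setArray(Nd4j.create(new double[]{1}));
		y.setArray(Nd4j.create(new double[]{2}));
		//Forward pass again: expect 5
		System.out.println(sd.exec(null, "addY").get("addY"));
		//Backward pass again: df/dx = 1 + 2xy = 5, df/dy = x^2 + 1 = 2
		sd.execBackwards(null);
		System.out.println(sd.getGradForVariable("x").getArr());
		System.out.println(sd.getGradForVariable("y").getArr());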

Automatic differentiation hides many of the messy details of differentiation in deep learning, especially matrix derivatives, derivatives of matrix norms, and so on, which are tedious to work out by hand. With automatic differentiation, all kinds of network structures can be implemented easily.
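To make the matrix case concrete, here is a small sketch of the same pattern applied to a vector-matrix product, where the gradient ND4J returns is exactly the transpose we would otherwise derive by hand. This is my own illustrative extension rather than part of the original example; it assumes the same beta-era SameDiff API used above (sd.var with an initial INDArray, mmul, execBackwards), and the class and variable names are arbitrary:

package org.nd4j.samediff;

import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.factory.Nd4j;

/**
 * f(x, w) = x * w, with x a 1x2 row vector and w a 2x1 column vector
 */
public class MatrixGradient {

	public static void main(String[] args) {
		SameDiff sd = SameDiff.create();
		//x is a 1x2 row vector, w is a 2x1 column vector
		SDVariable x = sd.var("x", Nd4j.create(new double[][]{{1, 2}}));
		SDVariable w = sd.var("w", Nd4j.create(new double[][]{{3}, {4}}));
		//f = x * w is a 1x1 result: 1*3 + 2*4 = 11
		SDVariable f = x.mmul(w);
		//Backward pass: df/dw = x^T (2x1 holding [1, 2]), df/dx = w^T (1x2 holding [3, 4])
		sd.execBackwards(null);
		System.out.println(sd.getGradForVariable("w").getArr());
		System.out.println(sd.getGradForVariable("x").getArr());
	}
}

If this prints [[1.0], [2.0]] and [[3.0, 4.0]], it matches the hand-derived matrix gradients without us ever writing a transpose ourselves.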

 

Happiness comes from sharing.

This blog post was written by the author. Please credit the source when reprinting.
