TensorFlow Basics (4): tf.nn.relu()

The tf.nn.relu() function keeps values greater than 0 unchanged and maps values less than 0 to 0; that is, f(x) = max(0, x).

ReLU is one of the most commonly used activation functions in neural networks.

Below is an example of ReLU:

import tensorflow as tf  # TensorFlow 1.x API (uses tf.Session)

v = tf.constant([-3, 5, 6, -6, 9])

sess = tf.Session()
print('Original value of v: ', end='')
print(sess.run(v))

print('Value of v after ReLU: ', end='')
print(sess.run(tf.nn.relu(v)))

sess.close()


The output is:

Original value of v: [-3  5  6 -6  9]
Value of v after ReLU: [0 5 6 0 9]
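Since ReLU is just f(x) = max(0, x) applied element-wise, the same result can be reproduced in plain Python without TensorFlow. This is only an illustrative sketch of the math, not how TensorFlow implements it internally:

```python
def relu(values):
    # Element-wise ReLU: keep positives, replace negatives with 0
    return [max(0, v) for v in values]

v = [-3, 5, 6, -6, 9]
print('Original value of v:', v)        # [-3, 5, 6, -6, 9]
print('Value of v after ReLU:', relu(v))  # [0, 5, 6, 0, 9]
```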

 

Origin blog.csdn.net/Aidam_Bo/article/details/103189899