Tanh function usage tutorial and code implementation

The Tanh function (hyperbolic tangent function) is a commonly used activation function that maps an input value to a continuous output in the range -1 to 1. The formula of the Tanh function is as follows:

```
f(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
```
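As a sanity check, the formula can be implemented directly with exponentials and compared against NumPy's built-in `np.tanh`. This is a minimal sketch; the helper name `tanh_manual` is ours, not part of any library:

```python
import numpy as np

def tanh_manual(x):
    # Direct translation of the formula above
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
# The manual version matches NumPy's implementation to floating-point tolerance
print(np.allclose(tanh_manual(x), np.tanh(x)))  # True
```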

The following is a Python sample code using the Tanh function:

```python
import numpy as np

def tanh(x):
    return np.tanh(x)

# Example with a single value
x = 2
result = tanh(x)
print(result)  # Output: 0.9640275800758169

# Example using NumPy arrays
x_array = np.array([-2, -1, 0, 1, 2])
result_array = tanh(x_array)
print(result_array)  # Output: [-0.96402758 -0.76159416  0.          0.76159416  0.96402758]
```

In the example above, we define a `tanh` function that takes an input `x` and returns `np.tanh(x)`. We then call it with a single value and with a NumPy array, compute the corresponding Tanh values, and print the results.

The Tanh function is widely used as an activation function in machine learning and deep learning. It is similar to the Sigmoid function, but its output range of -1 to 1 is wider and centered on zero, and its slope around zero is steeper, making it more sensitive to changes in the input.
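The relationship to the Sigmoid function can be made precise: Tanh is a scaled and shifted Sigmoid, tanh(x) = 2·sigmoid(2x) − 1. A short sketch verifying this identity numerically (the `sigmoid` helper is our own definition):

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-3, 3, 7)
# Tanh as a rescaled sigmoid: tanh(x) = 2*sigmoid(2x) - 1
print(np.allclose(np.tanh(x), 2 * sigmoid(2 * x) - 1))  # True
```

This identity is why Tanh's output range (-1, 1) is exactly twice the Sigmoid's (0, 1), shifted to be zero-centered.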

Note that the Tanh function also suffers from the vanishing gradient problem: when the input is very large or very small, its gradient approaches zero. In deep learning, other activation functions such as ReLU (Rectified Linear Unit) are often used to mitigate this.
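The vanishing gradient is easy to see from Tanh's derivative, which is 1 − tanh(x)². A minimal sketch (the helper name `tanh_grad` is ours) showing how quickly the gradient shrinks as the input moves away from zero:

```python
import numpy as np

def tanh_grad(x):
    # Derivative of tanh: d/dx tanh(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

for x in [0.0, 2.0, 5.0, 10.0]:
    print(x, tanh_grad(x))
# The gradient peaks at 1 (at x = 0) and decays rapidly as |x| grows,
# which is the vanishing-gradient behavior described above.
```

By contrast, ReLU's gradient is a constant 1 for all positive inputs, which is one reason it trains deep networks more reliably.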

Origin: blog.csdn.net/m0_73291751/article/details/131792704