Visualizing a TensorFlow neural network's model structure and training process with TensorBoard

  This article introduces how to use the TensorBoard tool to visualize a neural network model built with the tensorflow library, and to visualize the changes in the loss function (Loss) and accuracy metrics (Metric) during the training process.

  In the previous two articles, deep learning regression and classification based on Python TensorFlow Estimator - DNNRegressor (https://blog.csdn.net/zhebushibiaoshifu/article/details/114001720) and deep learning regression based on Python TensorFlow Keras - Keras.Sequential deep neural networks (https://blog.csdn.net/zhebushibiaoshifu/article/details/114016531), we introduced the specific ideas and code implementations for machine learning and deep learning with the tensorflow library in Python. In addition, we have also introduced methods for visualizing neural network models, such as a simple method for drawing the structural framework of a neural network model with Python (https://blog.csdn.net/zhebushibiaoshifu/article/details/116212113) and online and software-based drawing methods for visualizing the structural framework of a neural network model (https://blog.csdn.net/zhebushibiaoshifu/article/details/116723916). However, we have not yet introduced TensorBoard, the tool based on the tensorflow library for visualizing the neural network model and the changes in its various parameters during training; this article introduces it in detail.

  TensorBoard is a visualization tool provided by TensorFlow that can help users better understand and debug TensorFlow models. It provides various charts and panels that can display information such as the training process of the model, performance indicators, network structure, and data distribution.

  First, to use TensorBoard for visualization, you need to add the TensorBoard callback to the training code. During model training, the model's performance and other relevant information will be recorded at the end of each epoch and written to the TensorBoard log directory. The following is sample code for adding the TensorBoard callback.

from tensorflow.keras.callbacks import TensorBoard

# Create the TensorBoard callback and specify the log directory
tensorboard_callback = TensorBoard(log_dir="E:/01_Reflectivity/03_Code")

# Add the TensorBoard callback to the callback list in the fit() function
model.fit(train_data, train_targets, epochs=50, batch_size=64, validation_data=(test_data, test_targets), callbacks=[tensorboard_callback])

  Of course, the above is only sample code for adding the TensorBoard callback, not the complete code for implementing the neural network model with the tensorflow library; if you need the complete code, you can refer to the article on deep learning regression based on Python TensorFlow Keras - keras.Sequential deep neural networks (https://blog.csdn.net/zhebushibiaoshifu/article/details/114016531), which will not be repeated here.

  In the above code, log_dir is the directory where the logs are stored; you can modify it to suit your own setup.
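  In practice, it is also common (though not required) to give each training run its own timestamped subdirectory under the log path, so that several runs can be compared side by side in TensorBoard. The following is only a sketch of this pattern; the timestamped subfolder and the extra callback options (histogram_freq, write_graph) are illustrative values, not part of the original code.

import datetime
from tensorflow.keras.callbacks import TensorBoard

# Example only: put each run in its own timestamped folder under the log directory
log_dir = "E:/01_Reflectivity/03_Code/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")

tensorboard_callback = TensorBoard(
    log_dir=log_dir,       # where the event files are written
    histogram_freq=1,      # record weight/activation histograms every epoch (feeds the Histograms panel)
    write_graph=True       # export the model graph (feeds the Graphs panel)
)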

  Then, start the TensorBoard server by entering the following command in the terminal.

tensorboard --logdir=E:\01_Reflectivity\03_Code

  Here, the path following --logdir is the log-storage path mentioned earlier. Running the above command produces output like that shown in the figure below.

(Figure: terminal output after starting the TensorBoard server)

  Next, view TensorBoard in your browser. Open http://localhost:6006/ in the browser to see TensorBoard's main interface, where you can view the model's architecture, performance metrics, activation histograms, and more, as shown below.

(Figure: TensorBoard main interface in the browser)

  The main panels in TensorBoard include the following:

Scalars: Displays scalar metrics recorded during training, such as training error, validation error, and learning rate.

Graphs: Displays the computation graph; you can see the inputs and outputs of each layer, as well as the dimensions and values of the parameters.

Distributions: Displays data distributions; you can view the distributions of weights, gradients, activation values, and so on, which helps diagnose problems such as overfitting or underfitting.

Histograms: Shows the data distributions as histograms, similar to Distributions but more detailed.

Images: Displays image data; you can view input images, the outputs of convolutional layers, and so on (see the sketch after this list).

Projector: Displays embeddings of high-dimensional data and can visualize the data after dimensionality reduction.

Text: Displays text data; you can view the results of tasks such as text classification and text generation.
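  As a small illustration of how data reaches a panel such as Images, the sketch below writes a batch of image tensors to a TensorBoard log directory with tf.summary.image. The subfolder name and the random images are purely illustrative and not part of the original article.

import tensorflow as tf

# Example only: write a few images to a TensorBoard log directory so that
# they appear in the Images panel. The log path is illustrative.
writer = tf.summary.create_file_writer("E:/01_Reflectivity/03_Code/images_demo")

# tf.summary.image expects a 4-D tensor: (batch, height, width, channels)
images = tf.random.uniform([4, 28, 28, 1])

with writer.as_default():
    tf.summary.image("sample_images", images, step=0, max_outputs=4)
    writer.flush()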

  In short, TensorBoard provides rich visualization features that help users better understand and optimize their models.

  Of these, the Scalars and Graphs panels are usually of most interest, so we will focus on these two here.

  When training a model, we often want to monitor training in real time, for example the changes in the loss function and in accuracy. TensorBoard's Scalars panel makes it easy to visualize these metrics, as shown in the figure below.

(Figure: Scalars panel showing loss and metric curves)

  In code, we can use the tf.summary.scalar function to write metrics to TensorBoard log files.
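  A minimal sketch of this is shown below, assuming we write to a subfolder of the same log directory used earlier; the folder name and the logged quantity are illustrative only. Note that when training with the Keras TensorBoard callback shown earlier, the per-epoch loss and metric values are written to Scalars automatically; manual tf.summary.scalar calls are only needed for additional custom quantities.

import tensorflow as tf

# Example only: write a custom scalar series to a TensorBoard log directory.
writer = tf.summary.create_file_writer("E:/01_Reflectivity/03_Code/custom_scalars")

with writer.as_default():
    for step in range(100):
        # Log a decaying "learning_rate" value; in real code this could be any
        # quantity you want to track (a loss, a metric, a learning rate, ...).
        tf.summary.scalar("learning_rate", 0.001 * (0.95 ** step), step=step)
    writer.flush()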

  Next, let's look at the Graphs panel. In TensorFlow, graphs are the computation graphs that represent the neural network, including the connections between layers, each layer's parameters, activation functions, and so on. The Graphs panel visualizes the structure of the TensorFlow computation graph, helping you better understand how the neural network performs its computations, as shown below.

(Figure: Graphs panel showing the model's computation graph)

  In the Graphs panel, you can see the name and shape of each layer in the neural network, as well as the connections between layers. Clicking on a layer shows its details, including its parameters, activation function, and more. The Graphs panel also displays the name of each variable and operation and their position in the computation graph.

  Through the Graphs panel, you can better understand the network's computation process, discover potential problems, and optimize its structure. The Graphs panel can also be used together with other TensorBoard panels (such as Scalars and Histograms) to further improve visualization and debugging of the neural network.
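  When a model is trained with the Keras TensorBoard callback shown earlier, the model graph is exported for this panel automatically. For custom code built around tf.function, the graph can also be traced and exported manually; the sketch below only illustrates this, with a toy function and an example log path that are not part of the original article.

import tensorflow as tf

# Example only: trace a tf.function and export its graph to a log directory
# so that it shows up in TensorBoard's Graphs panel.
logdir = "E:/01_Reflectivity/03_Code/graph_demo"
writer = tf.summary.create_file_writer(logdir)

@tf.function
def forward(x):
    # A toy computation standing in for a model's forward pass
    return tf.nn.relu(tf.matmul(x, tf.ones([4, 2])))

tf.summary.trace_on(graph=True)              # start recording graph information
forward(tf.random.normal([3, 4]))            # run the function once so the graph is built
with writer.as_default():
    tf.summary.trace_export(name="forward_graph", step=0)  # write the traced graph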

  In addition, TensorBoard has many more features; refer to its official documentation for details.

Welcome to follow: Crazy Learning GIS
