Deep Residual Shrinkage Network: (5) Experimental Verification

  In the experiments, the two proposed deep residual shrinkage networks, namely the Deep Residual Shrinkage Network with Channel-shared Thresholds (DRSN-CS) and the Deep Residual Shrinkage Network with Channel-wise Thresholds (DRSN-CW), are compared with a conventional convolutional neural network (ConvNet) and a deep residual network (ResNet). The experimental data are gearbox vibration signals under eight kinds of health states, to which different levels of Gaussian noise, Laplacian noise, and pink noise are added.
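  As an illustration of the noise-injection step, here is a minimal sketch (not the authors' code) of how a 1-D vibration signal could be corrupted with Gaussian, Laplacian, or approximately pink noise at a target SNR in dB. The function name add_noise and the spectral-shaping approximation of pink noise are assumptions; the paper does not specify these details.

```python
import numpy as np

def add_noise(signal, snr_db, kind="gaussian", rng=None):
    """Corrupt a 1-D signal with noise of the given kind at a target SNR (dB)."""
    rng = np.random.default_rng() if rng is None else rng
    signal = np.asarray(signal, dtype=float)
    n = signal.size

    if kind == "gaussian":
        noise = rng.standard_normal(n)
    elif kind == "laplacian":
        noise = rng.laplace(0.0, 1.0, n)
    elif kind == "pink":
        # Approximate pink (1/f) noise by shaping the spectrum of white noise.
        spectrum = np.fft.rfft(rng.standard_normal(n))
        freqs = np.fft.rfftfreq(n)
        freqs[0] = freqs[1]          # avoid dividing by zero at DC
        noise = np.fft.irfft(spectrum / np.sqrt(freqs), n)
    else:
        raise ValueError(f"unknown noise kind: {kind}")

    # Scale the noise so that 10*log10(P_signal / P_noise) equals snr_db.
    p_signal = np.mean(signal ** 2)
    p_noise = np.mean(noise ** 2)
    scale = np.sqrt(p_signal / (p_noise * 10.0 ** (snr_db / 10.0)))
    return signal + scale * noise
```

  For example, add_noise(x, -5, "pink") would correspond to the strongest noise level (-5 dB SNR) discussed below.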

  Experimental results under different levels of Gaussian noise (left: training accuracy; right: test accuracy):

  Experimental results under different levels of Laplacian noise (left: training accuracy; right: test accuracy):

  Experimental results under different levels of pink noise (left: training accuracy; right: test accuracy):

  It can be seen that when the noise is stronger, i.e., at a signal-to-noise ratio (SNR) of -5 dB, the improvement over ConvNet and ResNet is most pronounced. When the noise is weaker, DRSN-CS and DRSN-CW also achieve high accuracy, because they can set the thresholds adaptively.
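  For reference, here is a minimal sketch, assuming a TensorFlow/Keras implementation, of the channel-wise adaptive soft thresholding described in part (4) of this series: a small fully connected sub-network estimates a per-channel scaling factor in (0, 1) from the average absolute feature value, the threshold is that factor times the average magnitude, and the features are shrunk by sign(x) * max(|x| - tau, 0). The function name and layer sizes below are illustrative, not taken from the authors' code.

```python
import tensorflow as tf
from tensorflow.keras import layers

def channel_wise_soft_threshold(x):
    """Adaptive soft thresholding for features of shape (batch, steps, channels),
    intended to be called while building a Keras functional model."""
    channels = x.shape[-1]
    abs_x = tf.abs(x)

    # Summarize each channel by the global average of its absolute values.
    abs_mean = layers.GlobalAveragePooling1D()(abs_x)           # (batch, channels)

    # Small fully connected sub-network outputs a factor alpha in (0, 1).
    alpha = layers.Dense(channels, activation="relu")(abs_mean)
    alpha = layers.Dense(channels, activation="sigmoid")(alpha)

    # Per-channel threshold tau = alpha * mean(|x|): always positive and
    # never larger than the average feature magnitude of that channel.
    tau = layers.Reshape((1, channels))(alpha * abs_mean)

    # Soft thresholding: shrink small-magnitude (noise-related) features to zero.
    return tf.sign(x) * tf.maximum(abs_x - tau, 0.0)
```

  In a residual shrinkage building unit, this shrinkage is applied to the output of the convolutional path before it is added to the identity shortcut; DRSN-CS differs only in that a single threshold is shared across all channels.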

 

  Links to the first four parts:

  Deep Residual Shrinkage Network: (1) Background Knowledge https://www.cnblogs.com/yc-9527/p/11598844.html

  Deep Residual Shrinkage Network: (2) Overall Idea https://www.cnblogs.com/yc-9527/p/11601322.html

  Deep Residual Shrinkage Network: (3) Network Architecture https://www.cnblogs.com/yc-9527/p/11603320.html

  Deep Residual Shrinkage Network: (4) Threshold Setting under the Attention Mechanism https://www.cnblogs.com/yc-9527/p/11604082.html

  Link to the original paper:

  M. Zhao, S. Zhong, X. Fu, B. Tang, and M. Pecht, “Deep Residual Shrinkage Networks for Fault Diagnosis,” IEEE Transactions on Industrial Informatics, 2019, DOI: 10.1109/TII.2019.2943898

  https://ieeexplore.ieee.org/document/8850096
