Copyright notice: this is an original post by the author and may not be reproduced without permission. https://blog.csdn.net/u013608336/article/details/82792871
Preparation
1. Load the model and use get_layer to access a given layer's parameters (to obtain the attention map). Treat the attention map as a binary image (mask?), resize it to the input size, then multiply it with the original image.
2. Feed in one image at a time and use model.predict to get that layer's feature-map output.
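The two steps above can be sketched with tf.keras. This is a minimal sketch under assumptions: the tiny two-layer CNN and the layer name "conv1" are hypothetical stand-ins for the loaded model and the layer of interest.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical small CNN standing in for the loaded model
inp = layers.Input(shape=(8, 8, 3))
x = layers.Conv2D(4, 3, padding="same", activation="relu", name="conv1")(inp)
out = layers.Conv2D(2, 3, padding="same", name="conv2")(x)
model = Model(inp, out)

# Step 1: get_layer gives access to a layer and its parameters
w, b = model.get_layer("conv1").get_weights()
print(w.shape)  # (3, 3, 3, 4): kernel h, kernel w, in channels, filters

# Step 2: build a sub-model whose output is that layer's activation;
# model.predict on a single image then yields its feature map
feat_model = Model(model.input, model.get_layer("conv1").output)
img = np.random.rand(1, 8, 8, 3).astype("float32")
fmap = feat_model.predict(img, verbose=0)
print(fmap.shape)  # (1, 8, 8, 4)
```

The sub-model trick reuses the original graph, so no retraining or weight copying is needed.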
image attention visualization
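The resize-and-multiply step can be sketched in plain numpy. Assumptions: the 4x4 attention map and 16x16 image are toy placeholders, and np.kron stands in for a proper resize (a real pipeline would use cv2.resize or PIL).

```python
import numpy as np

# Hypothetical 4x4 attention map (e.g. taken from a layer's output)
att = np.random.rand(4, 4)
att = (att - att.min()) / (att.max() - att.min() + 1e-8)  # normalize to [0, 1]

# Upsample to the 16x16 image size; np.kron with a block of ones
# performs nearest-neighbour upsampling
scale = 16 // 4
att_big = np.kron(att, np.ones((scale, scale)))

# Multiply with the original image channel-wise so that
# low-attention regions are darkened and attended regions kept
img = np.random.rand(16, 16, 3)
highlighted = img * att_big[..., None]
print(highlighted.shape)  # (16, 16, 3)
```

Since the map is normalized to [0, 1], the product never brightens a pixel; it only suppresses regions the model did not attend to.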
References
Blog 1
keras attention
How to Visualize Your Recurrent Neural Network with Attention in Keras
Zhihu: Grad-CAM
grad-CAM-pycaffe
grad cam tensorflow