Paper notes: Revisiting RCNN: On Awakening the Classification Power of Faster RCNN

An ECCV 2018 paper that rethinks the classification ability of Faster RCNN:

most hard false positives result from classification instead of localization.

The paper argues that most false positives are caused by classification errors rather than localization (regression) errors. I have reservations about this claim.

We conjecture that:

(1) Shared feature representation is not optimal due to the mismatched goals of feature learning for classification and localization;

(2) multi-task learning helps, yet optimization of the multi-task loss may result in sub-optimal solutions for individual tasks;

(3) large receptive field for different scales leads to redundant context information for small objects.
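
For reference, the multi-task loss mentioned in (2) is the standard Fast R-CNN per-RoI objective (notation from the Fast R-CNN paper; λ balances the two terms, and the indicator [u ≥ 1] disables regression for background RoIs):

L(p, u, t^u, v) = L_cls(p, u) + λ [u ≥ 1] L_loc(t^u, v)

where p is the predicted class distribution, u the ground-truth class, t^u the predicted box offsets for class u, and v the regression target.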

These can be summarized as three observations:

1. Classification and regression are inherently different tasks, yet the current architecture makes them share the same preceding features; this sharing is not optimal (a minimal sketch of this shared-head design follows the list).

2. The jointly trained multi-task loss may drive the individual tasks into local optima.

3. Using a large receptive field for all scales can introduce redundant context features when detecting small objects.
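
To make points 1 and 2 concrete, here is a minimal PyTorch-style sketch of the baseline Faster RCNN second-stage head being critiqued, not code from the paper; names such as SharedRCNNHead, num_classes, and lam are illustrative. One shared trunk feeds both the classification and the box-regression branches, and a single joint loss optimizes them together.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedRCNNHead(nn.Module):
    """Sketch of a Faster RCNN second-stage head: the classifier and the
    box regressor both sit on top of the SAME shared fc features, which is
    the design that observation 1 questions."""

    def __init__(self, in_dim=256 * 7 * 7, hidden=1024, num_classes=81):
        super().__init__()
        # shared representation used by both tasks
        self.shared = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(inplace=True),
            nn.Linear(hidden, hidden), nn.ReLU(inplace=True),
        )
        self.cls_score = nn.Linear(hidden, num_classes)      # classification branch
        self.bbox_pred = nn.Linear(hidden, num_classes * 4)  # localization branch

    def forward(self, roi_feats):                 # roi_feats: (N, 256, 7, 7)
        x = self.shared(roi_feats.flatten(start_dim=1))
        return self.cls_score(x), self.bbox_pred(x)


def multi_task_loss(cls_logits, bbox_deltas, labels, bbox_targets, lam=1.0):
    """Joint objective L = L_cls + lam * L_reg from observation 2: both
    tasks are optimized together through the one shared trunk above."""
    loss_cls = F.cross_entropy(cls_logits, labels)
    fg = labels > 0                               # regression only on foreground RoIs
    if fg.any():
        n_fg = int(fg.sum())
        # select the 4 regression outputs belonging to each RoI's ground-truth class
        idx = torch.arange(n_fg, device=bbox_deltas.device)
        deltas = bbox_deltas[fg].view(n_fg, -1, 4)[idx, labels[fg]]
        loss_reg = F.smooth_l1_loss(deltas, bbox_targets[fg])
    else:
        loss_reg = bbox_deltas.sum() * 0.0
    return loss_cls + lam * loss_reg
```

Because both loss_cls and loss_reg back-propagate through self.shared, gradients from the two tasks compete inside the same features; that competition is exactly what observations 1 and 2 point at.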


Reposted from blog.csdn.net/qq_33547191/article/details/96025442