[Computer Vision] Note on Class-Balanced Loss

Copyright notice: this is the blogger's original post; reproduction without permission is prohibited. https://blog.csdn.net/dgyuanshaofeng/article/details/86546643


2 Related Work

Strategies for dealing with long-tailed, imbalanced data include re-sampling and cost-sensitive learning.
Cost-sensitive learning is essentially importance sampling from statistics: put simply, it weights the samples of under-represented classes in the loss function. The weights can be set by inverse class frequency, or by a smoothed version based on the inverse square root of class frequency.
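The two weighting schemes above can be sketched in a few lines of NumPy. The class counts below are hypothetical, chosen only to illustrate a long-tailed distribution; normalizing the weights to sum to the number of classes is one common convention, not something the post prescribes.

```python
import numpy as np

# Hypothetical per-class sample counts for a 4-class long-tailed dataset.
counts = np.array([5000, 1000, 100, 10], dtype=np.float64)

# Inverse class frequency: rare classes get proportionally larger weights.
w_inv = 1.0 / counts

# Smoothed variant: inverse square root of class frequency.
w_sqrt = 1.0 / np.sqrt(counts)

# Normalize so each weight vector sums to the number of classes.
w_inv = w_inv / w_inv.sum() * len(counts)
w_sqrt = w_sqrt / w_sqrt.sum() * len(counts)
```

The square-root variant spreads weights less aggressively: the ratio between the rarest and most frequent class shrinks from 500x to about 22x here.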
A side effect of assigning higher weights to hard examples is that training tends to focus on harmful samples (e.g., noisy or mislabeled data).
[1] adds a class-balanced term on top of the focal loss.
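A minimal NumPy sketch of that combination follows. The function name, the softmax formulation, and the batch layout are my own illustrative choices (the paper applies the class-balanced term to sigmoid, softmax, and focal losses alike); the weight per class is (1-β)/(1-β^n_y), normalized to sum to the number of classes.

```python
import numpy as np

def class_balanced_focal_loss(logits, labels, counts, beta=0.999, gamma=2.0):
    """Focal loss scaled by the class-balanced term (1 - beta) / (1 - beta^n_y).

    logits: (batch, num_classes) array; labels: (batch,) int class ids;
    counts: per-class sample counts of the training set.
    """
    counts = np.asarray(counts, dtype=np.float64)
    # Class-balanced weights, normalized to sum to the number of classes.
    cb = (1.0 - beta) / (1.0 - beta ** counts)
    cb = cb / cb.sum() * len(counts)

    # Numerically stabilized softmax probabilities.
    z = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    pt = p[np.arange(len(labels)), labels]       # probability of the true class

    focal = -((1.0 - pt) ** gamma) * np.log(pt)  # focal loss per sample
    return np.mean(cb[labels] * focal)
```

With identical logits, a sample from a rare class incurs a larger weighted loss than one from a frequent class, which is exactly the re-balancing effect the post describes.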
Other approaches to the imbalanced-data problem include transferring the knowledge learned from major classes to minor classes, and designing a better training objective via metric learning.

3 Effective Number of Samples

The effective number of samples is a concept introduced in [1].
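For reference, [1] defines the effective number of n samples as E_n = (1 - β^n) / (1 - β), with a hyperparameter β ∈ [0, 1). A one-line sketch (the default β below is just a commonly used value, not mandated by the post):

```python
def effective_number(n, beta=0.999):
    """E_n = (1 - beta^n) / (1 - beta); grows with n, saturating at 1/(1-beta)."""
    return (1.0 - beta ** n) / (1.0 - beta)
```

E_1 = 1 for any β, and as n grows the value saturates toward 1/(1-β), which is why weighting by 1/E_n interpolates between no re-weighting (β = 0) and inverse class frequency (β → 1).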

3.1 The Data Sampling Process as Simple Random Covering
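In the random-covering view, each new sample covers one of N unit "prototypes" of the class's feature volume uniformly at random, and the effective number is the expected count of distinct prototypes covered, which matches the closed form with β = (N-1)/N. A small Monte-Carlo check (N, n, and the trial count are hypothetical illustration values):

```python
import random

random.seed(0)

N, n, trials = 10, 5, 20000
beta = (N - 1) / N

# Closed form: E_n = (1 - beta^n) / (1 - beta) = N * (1 - ((N-1)/N)^n).
expected = (1.0 - beta ** n) / (1.0 - beta)

# Simulate: draw n samples uniformly from N prototypes, count distinct ones.
mean_distinct = sum(
    len({random.randrange(N) for _ in range(n)}) for _ in range(trials)
) / trials

print(expected, mean_distinct)  # the two agree closely
```

With N = 10 and n = 5 the closed form gives about 4.10, i.e. five random draws cover roughly four distinct prototypes on average.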

[1] Class-Balanced Loss Based on Effective Number of Samples, CVPR 2019.
