Supervised pre-training

If we have a classification task but only a small dataset, we still need pre-training to keep a deep model from over-fitting. The pre-training is supervised: the model is first trained on a large labelled database (such as ImageNet). This scheme of supervised pre-training on a large database plus fine-tuning on a small one is called Transfer Learning.

R-CNN uses supervised pre-training on a large sample set, followed by fine-tuning on a small sample set, to address the difficulty of training on few samples and the over-fitting that results.
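The recipe above can be illustrated with a deliberately tiny toy, not R-CNN itself: a one-variable linear model is "pre-trained" on a large synthetic dataset, and its weights are then used as the starting point for fine-tuning on a small, closely related dataset. All data, learning rates, and epoch counts here are made-up illustration values.

```python
import random

random.seed(0)

def train(weights, data, lr, epochs):
    """Plain SGD on squared error for the model y = w*x + b."""
    w, b = weights
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * 2 * err * x
            b -= lr * 2 * err
    return w, b

# "Large" pre-training set: many samples of the trend y = 2x + 1.
large_set = [(i / 100, 2 * (i / 100) + 1) for i in range(100)]
# "Small" target set: only three samples of a closely related task, y = 2x + 1.5.
small_set = [(0.0, 1.5), (0.5, 2.5), (1.0, 3.5)]

# Pre-train on the large set, then fine-tune those weights on the small set.
pretrained = train((0.0, 0.0), large_set, lr=0.1, epochs=200)
fine_tuned = train(pretrained, small_set, lr=0.05, epochs=10)

# Same small training budget, but starting from scratch.
scratch = train((0.0, 0.0), small_set, lr=0.05, epochs=10)
```

Because the pre-trained weights already encode the shared structure of the task, the fine-tuned model ends up closer to the true parameters (2, 1.5) than the from-scratch model given the same small budget, which is the point of the large-sample + small-sample scheme.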

Speed: classical object detection algorithms use a sliding window to examine every possible region in turn. The R-CNN series instead extracts object candidate regions in advance, and only computes features on those candidate regions before classifying them.
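A quick back-of-the-envelope comparison makes the speed argument concrete. The image size, window shapes, and stride below are hypothetical illustration values; the figure of roughly 2,000 candidate regions per image comes from R-CNN's use of selective search.

```python
def sliding_window_count(img_w, img_h, window_sizes, stride):
    """Number of regions an exhaustive sliding-window search must classify."""
    total = 0
    for win_w, win_h in window_sizes:
        nx = (img_w - win_w) // stride + 1
        ny = (img_h - win_h) // stride + 1
        if nx > 0 and ny > 0:
            total += nx * ny
    return total

# Hypothetical 500x375 image, a few window shapes, 4-pixel stride.
windows = sliding_window_count(500, 375,
                               [(64, 64), (128, 128), (96, 192), (192, 96)],
                               stride=4)
proposals = 2000  # R-CNN instead evaluates ~2000 selective-search proposals

print(windows, proposals)  # far more windows than proposals
```

Even with this modest stride and only four window shapes, the sliding window must score tens of thousands of regions, an order of magnitude more than the fixed set of candidate regions; real sliding-window detectors with denser scale pyramids evaluate far more still.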

Training set: classical object detection algorithms extract hand-crafted features (Haar, HOG) in each region. R-CNN uses two databases:

A larger recognition library (ImageNet ILSVRC 2012): each image is labelled with the category of the object it contains. Ten million images, 1,000 classes.
A smaller detection library (PASCAL VOC 2007): each image is labelled with the category and location of its objects. Ten thousand images, 20 classes.
The network is first pre-trained on the recognition library, then fine-tuned on the detection library, and finally evaluated on the detection library.
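The two-stage pipeline can be sketched as a head swap, shown here with plain Python data structures rather than a real network. The feature dimensionality of 256 is a placeholder (R-CNN's AlexNet fc7 features are 4096-dimensional), and the "features" entry stands in for the convolutional and fully connected weights.

```python
import random

random.seed(0)
FEATURE_DIM = 256  # placeholder; R-CNN's actual fc7 features are 4096-d

def make_head(num_classes):
    """Freshly (randomly) initialised classification layer: one row per class."""
    return [[random.gauss(0, 0.01) for _ in range(FEATURE_DIM)]
            for _ in range(num_classes)]

# Stage 1: supervised pre-training on the recognition library yields a
# 1000-way ImageNet classifier.
pretrained = {"features": ["weights learned on ImageNet"],  # placeholder
              "head": make_head(1000)}

# Stage 2: for fine-tuning on the detection library, the 1000-way head is
# discarded and replaced by a (20 + 1)-way layer: 20 PASCAL VOC classes
# plus one background class. The feature layers are carried over as-is and
# only updated further, with a small learning rate, during fine-tuning.
detector = {"features": pretrained["features"],  # reused, not reinitialised
            "head": make_head(20 + 1)}
```

The design choice this sketch captures is that only the task-specific output layer changes between the two databases; everything learned from the ten-million-image recognition library is retained as the initialisation for the small detection library.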

 

Origin www.cnblogs.com/pacino12134/p/11404793.html