Notes on NSFW picture/video detection with Python opennsfw/opennsfw2

NSFW (Not Suitable for Work) literally means content that is not suitable to view while at work; quite an elegant term.

[Screenshot: NSFW detection results; note the score shown at the bottom of each image]


The general workflow: input a picture or video, and the model outputs a score between 0 and 1. As a rule of thumb, a score below 0.2 is considered very safe, while a score above 0.8 very likely indicates an NSFW image.
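As a minimal sketch, assuming the opennsfw2 package is installed (`pip install opennsfw2`) and using the `predict_image` helper from its README (the image path below is a placeholder):

```python
import opennsfw2 as n2

# Score a single image: the model returns the probability (0-1)
# that the image contains NSFW content. The path is a placeholder.
image_path = "path/to/image.jpg"
nsfw_probability = n2.predict_image(image_path)

# Apply the rule-of-thumb thresholds described above.
if nsfw_probability < 0.2:
    print(f"{nsfw_probability:.3f}: very likely safe")
elif nsfw_probability > 0.8:
    print(f"{nsfw_probability:.3f}: very likely NSFW")
else:
    print(f"{nsfw_probability:.3f}: uncertain, may warrant manual review")
```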

The roop open-source project, for example, uses a threshold of 0.85; a sketch of applying such a threshold to video follows.
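For video, a sketch of a roop-style gate using opennsfw2's `predict_video_frames` helper; the return values follow its README, but verify them against the version you have installed, and the video path is again a placeholder:

```python
import opennsfw2 as n2

# predict_video_frames scores sampled frames and returns their
# timestamps plus per-frame NSFW probabilities.
video_path = "path/to/video.mp4"
elapsed_seconds, nsfw_probabilities = n2.predict_video_frames(video_path)

# Flag the whole video if any single frame crosses the 0.85
# cutoff, similar to how roop gates its output.
THRESHOLD = 0.85
is_nsfw = any(p > THRESHOLD for p in nsfw_probabilities)
print(f"NSFW: {is_nsfw} (max frame score: {max(nsfw_probabilities):.3f})")
```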

Compared with the original open_nsfw, which is built on Caffe, opennsfw2 implements the same model in TensorFlow (Keras).

References

yahoo/open_nsfw: Not Suitable for Work (NSFW) classification using deep neural network Caffe models. https://github.com/yahoo/open_nsfw

bhky/opennsfw2: Keras Core implementation of the Yahoo Open-NSFW model. https://github.com/bhky/opennsfw2

Original post: blog.csdn.net/linzhiji/article/details/132295467