Model mislabels black-and-white images as SFW.

#7
by justbanana - opened

Among ~400 pictures with an equal split of SFW and NSFW, around 10 images were mislabeled: 6 were not-blatant NSFW, and 4 were blatant NSFW but in black and white. Has anyone else found more weaknesses?

That seems well within the expected accuracy (0.965). I'd recommend adjusting the NSFW cutoff according to your use case, since a different threshold can perform better than the 0.5 default.
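A minimal sketch of what adjusting the cutoff means in practice: instead of taking the model's default decision, compare its NSFW score against your own threshold. The scores and filenames below are illustrative, not real model output; lowering the cutoff catches borderline cases (like B&W images scoring just under 0.5) at the cost of more false NSFW labels.

```python
def label_image(nsfw_score: float, cutoff: float = 0.5) -> str:
    """Label an image NSFW only if its score clears the cutoff.

    nsfw_score: the model's NSFW probability for the image (0..1).
    cutoff: decision threshold; lower it to catch borderline
    cases, raise it to reduce false NSFW labels.
    """
    return "nsfw" if nsfw_score >= cutoff else "sfw"


# Hypothetical scores: a B&W image the default 0.5 cutoff mislabels.
scores = {"color_explicit.jpg": 0.91, "bw_explicit.jpg": 0.42}

default_labels = {n: label_image(s) for n, s in scores.items()}
lowered_labels = {n: label_image(s, cutoff=0.35) for n, s in scores.items()}
```

With the default cutoff, `bw_explicit.jpg` comes back `sfw`; at 0.35 it is flagged `nsfw`. Pick the cutoff by validating against your own labeled set.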
