False positives

#6
by Dolfik - opened

Hello. Thanks for this great model. I use it to validate user-generated images, and I found that it is prone to false positives when the image contains red tones.

Here are some examples:

[8 example images attached]

It looks like the training makes the model oversensitive to skin-tone (flesh-colored) hues, which produces these false positives.
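For reference, a minimal sketch of how such a validation call might look, and how checking the raw scores against a stricter threshold (rather than trusting the top label) could reduce these false positives. The model id, label names, and threshold value below are assumptions for illustration, not confirmed by this thread:

```python
# Minimal sketch, assuming the model is served via the transformers
# image-classification pipeline and exposes "normal"/"nsfw" labels.
# The model id and the 0.9 threshold are illustrative assumptions.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="Falconsai/nsfw_image_detection",  # assumed model id
)

def is_unsafe(image_path: str, nsfw_threshold: float = 0.9) -> bool:
    """Flag an image only when the nsfw score clears a high threshold,
    instead of trusting the argmax label directly."""
    scores = {r["label"]: r["score"] for r in classifier(image_path)}
    return scores.get("nsfw", 0.0) >= nsfw_threshold

print(is_unsafe("user_upload.png"))
```

Raising the threshold trades some recall for fewer false positives on red- or flesh-toned images, though it does not fix the underlying bias in the training data.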

Falcons.ai org
edited May 13

Yes, flesh tones trigger false positives for obvious reasons. Lol

It’s obvious that this data and model are very brittle. As soon as an image looks like a woman it’s flagged as unsafe, but images of men are barely ever flagged.
