"a woman" generates "unsafe content" 50%, so it's blocked

#158
by brambor - opened

prompt: a woman, note the unsafe content

But the following prompts don't generate unsafe content:

a girl

grandma

I was looking at the list of sites crawled for some datasets. Lots of online shopping sites, magazines, etc. Places where the images came with detailed text descriptions. But also Pornhub. I guess if you want to train an AI to generate human bodies...

I'm not saying that the dataset used here includes images from Pornhub, but it might well include some sites that are not known for their safe-for-work content, if you catch my drift. You probably won't find too many kids (unless it's the dark web, and I assume these sorts of web crawlers would be blocked there) and grandmas on such sites.

I get Unsafe Content messages for words like woman, figure, and even person. Or sometimes, stupidly, if I want to render in the style of an artist who painted a lot of nudes. Paul Gauguin is a no-no, for instance, even though I like the way he painted landscapes and use him for that only.

I guess the take-home message here is if you don't want your AI to generate naked people, you gotta curate your own dataset, or your users will have to cope with a fair amount of blocked content. It's free for us, so I just deal, usually by briefly describing what the woman is wearing. Doesn't always work, but it helps.
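The "describe what she's wearing" workaround can be sketched as a simple retry loop. This is a toy illustration, not the real safety checker: the actual checker inspects the generated image, not the prompt, and the trigger words and clothing hints below are hypothetical stand-ins.

```python
# Toy sketch of the retry-with-clothing-description workaround.
# NOTE: the trigger/hint word lists and the prompt-level check are
# hypothetical; the real safety checker classifies the output image.

TRIGGERS = {"woman", "figure", "person"}          # assumed trigger words
CLOTHING_HINTS = {"wearing", "dressed", "coat"}   # assumed "safe" hints

def is_flagged(prompt: str) -> bool:
    """Stand-in for a safety check: flag trigger words unless the
    prompt already describes clothing."""
    words = set(prompt.lower().replace(",", " ").split())
    return bool(words & TRIGGERS) and not (words & CLOTHING_HINTS)

def rewrite_prompt(prompt: str) -> str:
    """Apply the workaround: briefly describe what the subject wears."""
    return prompt + ", wearing a long winter coat"

def generate(prompt: str, max_retries: int = 1) -> str:
    """Return a (pretend) image, rewriting the prompt on each block."""
    for _ in range(max_retries + 1):
        if not is_flagged(prompt):
            return f"image for: {prompt}"
        prompt = rewrite_prompt(prompt)
    return "blocked"
```

With these assumed word lists, "a woman" gets flagged once, rewritten, and then passes, while "grandma" goes through untouched. As noted above, the real workaround doesn't always succeed either, since the actual check is on the image.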
