No one's gonna talk about this?

#11301
by hgfhdhghdhfg - opened

When I put in any prompt and set it to "Photo", it sometimes shows unrelated pictures of women.
Here's an example, using the prompt "a electric train":

craiyon_154421__a_electric_train.png

Not just women, but also cameras.
This example uses the prompt "computer virus":

craiyon_155017_computer_virus.png

Has this happened to anyone?

No one's gonna comment?

Likely someone tagged training-data photos on the internet with such hashtags. The rock band stuff may come from a band or song of that name.

AI currently is all about garbage in, garbage out. This reminds me of the role of editing in early filmmaking. When films were still a fairground entertainment, the cut was considered an annoying minor job of shortening the filmed material (shot on a theater stage or the like) to a viewable length, so it was handed to underpaid workers. It took a long time until directors understood that editing and post-production are the main act of setting the atmosphere and giving the film's pictures their intended meaning, which is why we now have those proud "director's cut" versions. The same will happen with creative AI (at least once the technology progresses to need fewer examples). As long as it is fed through drudge work by underpaid labelers, the results will be questionable, or in serious applications even harmful or dangerous.

I found a free Stable Diffusion website; you can try it: https://electrosion.web.app
