Missing mention of gender bias in Model Card

#67
by trafagar - opened

When I test the model, prompts mentioning professions such as engineer, data scientist, data engineer, or programmer produce mostly male-looking images, while professions like nurse, teacher, assistant, secretary, hairdresser, or cosmetologist produce mostly female-looking images. The model card already has a bias section that mentions racial and cultural bias; this gender bias should be mentioned there too.
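For anyone who wants to reproduce this kind of probe, here is a minimal sketch using the `diffusers` library. The model id and the prompt list are assumptions for illustration, not something specified in this thread.

```python
# Hypothetical probe: generate a few images per profession prompt and
# inspect them manually for gendered presentation.
# The model id and profession list below are illustrative assumptions.
import torch
from diffusers import StableDiffusionPipeline

model_id = "CompVis/stable-diffusion-v1-4"  # assumed model id
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda")

professions = ["engineer", "data scientist", "programmer",
               "nurse", "teacher", "secretary", "hairdresser"]

for profession in professions:
    prompt = f"a photo of a {profession}"
    for i in range(4):
        image = pipe(prompt).images[0]
        image.save(f"{profession.replace(' ', '_')}_{i}.png")
```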

That is what the network learned from the media in its training set; you can't blame this bias on the model. Blame it on humanity and the media content it creates.
