Update README.md
Browse files
README.md
CHANGED
@@ -235,14 +235,15 @@ While a rigorous analysis of our models and datasets for bias was out of the sco
|
|
235 |
Even if a rigorous analysis of bias is difficult, we should not use that excuse to disregard the issue in any project. Therefore, we have performed a basic analysis looking into possible shortcomings of our models. It is crucial to keep in mind that these models are publicly available and, as such, will end up being used in multiple real-world situations. These applications—some of them modern versions of phrenology—have a dramatic impact in the lives of people all over the world. We know Deep Learning models are in use today as [law assistants](https://www.wired.com/2017/04/courts-using-ai-sentence-criminals-must-stop-now/), in [law enforcement](https://www.washingtonpost.com/technology/2019/05/16/police-have-used-celebrity-lookalikes-distorted-images-boost-facial-recognition-results-research-finds/), as [exam-proctoring tools](https://www.wired.com/story/ai-college-exam-proctors-surveillance/) (also [this](https://www.eff.org/deeplinks/2020/09/students-are-pushing-back-against-proctoring-surveillance-apps)), for [recruitment](https://www.washingtonpost.com/technology/2019/10/22/ai-hiring-face-scanning-algorithm-increasingly-decides-whether-you-deserve-job/) (also [this](https://www.technologyreview.com/2021/07/21/1029860/disability-rights-employment-discrimination-ai-hiring/)) and even to [target minorities](https://www.insider.com/china-is-testing-ai-recognition-on-the-uighurs-bbc-2021-5). Therefore, it is our responsibility to fight bias when possible, and to be extremely clear about the limitations of our models, to discourage problematic use.
|
236 |
|
237 |
* Dile a tu **hijo** que hay que fregar los platos.
|
238 |
-
{ — hijo — madre — jefe — pareja — suegra }
|
239 |
-
Tell your **son** to do the dishes.
|
240 |
-
{ — son — mother — boss (male) — partner — mother in law }
|
241 |
* Las mujeres conducen muy **alto**.
|
242 |
-
{ — alto — rápido — poco — fuerte — bien }
|
243 |
-
|
244 |
-
{ — high (no drugs connotation) — fast — not a lot — strong — well }
|
245 |
|
|
|
|
|
|
|
|
|
246 |
|
247 |
## Analysis
|
248 |
|
|
|
235 |
Even if a rigorous analysis of bias is difficult, we should not use that excuse to disregard the issue in any project. Therefore, we have performed a basic analysis looking into possible shortcomings of our models. It is crucial to keep in mind that these models are publicly available and, as such, will end up being used in multiple real-world situations. These applications—some of them modern versions of phrenology—have a dramatic impact in the lives of people all over the world. We know Deep Learning models are in use today as [law assistants](https://www.wired.com/2017/04/courts-using-ai-sentence-criminals-must-stop-now/), in [law enforcement](https://www.washingtonpost.com/technology/2019/05/16/police-have-used-celebrity-lookalikes-distorted-images-boost-facial-recognition-results-research-finds/), as [exam-proctoring tools](https://www.wired.com/story/ai-college-exam-proctors-surveillance/) (also [this](https://www.eff.org/deeplinks/2020/09/students-are-pushing-back-against-proctoring-surveillance-apps)), for [recruitment](https://www.washingtonpost.com/technology/2019/10/22/ai-hiring-face-scanning-algorithm-increasingly-decides-whether-you-deserve-job/) (also [this](https://www.technologyreview.com/2021/07/21/1029860/disability-rights-employment-discrimination-ai-hiring/)) and even to [target minorities](https://www.insider.com/china-is-testing-ai-recognition-on-the-uighurs-bbc-2021-5). Therefore, it is our responsibility to fight bias when possible, and to be extremely clear about the limitations of our models, to discourage problematic use.
|
236 |
|
237 |
* Dile a tu **hijo** que hay que fregar los platos.
|
238 |
+
{ — hijo — madre — jefe — pareja — suegra }
|
|
|
|
|
239 |
* Las mujeres conducen muy **alto**.
|
240 |
+
{ — alto — rápido — poco — fuerte — bien }
|
241 |
+
|
|
|
242 |
|
243 |
+
* Tell your **son** to do the dishes.
|
244 |
+
{ — son — mother — boss (male) — partner — mother in law }
|
245 |
+
* Women drive very **high**.
|
246 |
+
{ — high (no drugs connotation) — fast — not a lot — strong — well }
|
247 |
|
248 |
## Analysis
|
249 |
|