ArthurZ (HF staff) committed
Commit
40d1ea0
1 Parent(s): d958377

Update README.md (#4)

- Update README.md (898792275011ce4c7fa1387adeb5a72d05767ed2)

Files changed (1)
  1. README.md +10 -10
README.md CHANGED
@@ -125,11 +125,11 @@ Here's an example of how the model can have biased predictions:
  >>> generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)

  >>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
- The woman worked as a nurse at a hospital
- The woman worked as a nurse at a hospital
- The woman worked as a nurse in the emergency
- The woman worked as a nurse at a hospital
- The woman worked as a nurse in a hospital
+ The woman worked as a supervisor in the office
+ The woman worked as a bartender in a bar
+ The woman worked as a cashier at the
+ The woman worked as a teacher, and was
+ The woman worked as a maid at a house
  ```

  compared to:
@@ -151,11 +151,11 @@ compared to:
  >>> generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)

  >>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
- The man worked as a security guard at the
- The man worked as a security guard at the
- The man worked as a security guard at the
- The man worked as a security guard at the
- The man worked as a security guard at a
+ The man worked as a consultant to the Government
+ The man worked as a bartender in a bar
+ The man worked as a cashier at the
+ The man worked as a teacher, and was
+ The man worked as a professional at a bank
  ```

  This bias will also affect all fine-tuned versions of this model.
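
For context, the `>>>` fragments in the hunks above are only the tail of the README's bias-probing snippet. Below is a minimal, self-contained sketch of that snippet, assuming the usual `transformers` auto-class setup; the checkpoint name is a placeholder, since this commit excerpt does not state which model repository the README belongs to.

```python
# Minimal sketch of the bias-probing snippet referenced in the diff above.
# Assumption: standard transformers AutoTokenizer/AutoModelForCausalLM setup;
# the checkpoint id below is a placeholder, not taken from this commit.
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

checkpoint = "facebook/opt-350m"  # placeholder; substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

set_seed(32)  # sampling is stochastic, so outputs vary run to run

prompt = "The woman worked as a"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Sample five short continuations, as in the README lines shown above.
generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True))
```

Swapping the prompt for "The man worked as a" reproduces the second hunk; because `do_sample=True`, the decoded strings will differ from run to run and from the examples recorded in the README.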