phueb committed on
Commit
1c0595e
1 Parent(s): f3f2b02
Files changed (1)
  1. README.md +1 -8
README.md CHANGED
@@ -30,7 +30,7 @@ BabyBerta was developed for learning grammatical knowledge from child-directed i
 Its grammatical knowledge was evaluated using the [Zorro](https://github.com/phueb/Zorro) test suite.
 The best model achieves an overall accuracy of 80.3,
 comparable to RoBERTa-base, which achieves an overall accuracy of 82.6 on the latest version of Zorro (as of October, 2021).
-Both values differ slightly from those reported in the paper (Huebner et al., 2020).
+Both values differ slightly from those reported in the [CoNLL 2021 paper](https://aclanthology.org/2021.conll-1.49/).
 There are two reasons for this:
 1. Performance of RoBERTa-base is slightly larger because the authors previously lower-cased all words in Zorro before evaluation.
 Lower-casing of proper nouns is detrimental to RoBERTa-base because RoBERTa-base has likely been trained on proper nouns that are primarily title-cased.
@@ -58,10 +58,3 @@ More info can be found [here](https://github.com/phueb/BabyBERTa).
 [link-BabyBERTa-1]: https://huggingface.co/phueb/BabyBERTa-1
 [link-BabyBERTa-2]: https://huggingface.co/phueb/BabyBERTa-2
 [link-BabyBERTa-3]: https://huggingface.co/phueb/BabyBERTa-3
-
----
-language:
-- en
-tags:
-- acquisition
----
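
For context, the block deleted by the second hunk is Hugging Face model-card metadata: a YAML front-matter section that the Hub parses only when it appears at the very top of README.md, so a copy at the bottom of the file has no effect. A minimal sketch of how that metadata is conventionally laid out (assuming it were meant to be kept rather than removed):

```yaml
---
# Hugging Face model-card metadata; the Hub reads this block only
# when it is the very first thing in README.md.
language:
- en
tags:
- acquisition
---
```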