## BabyBERTa
BabyBERTa is a slightly modified and much smaller RoBERTa model trained on 5M words of American-English child-directed input.
It is intended for language acquisition research on a single desktop with a single GPU; no high-performance computing infrastructure is needed.
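As a sketch of how a RoBERTa-style model like this can be used, the snippet below runs masked-word prediction via the Hugging Face `transformers` library. The model ID `phueb/BabyBERTa-1` is an assumption (check the Hugging Face Hub for the actual checkpoint name), and `add_prefix_space=True` reflects the usual requirement of RoBERTa-style byte-level tokenizers, not a confirmed detail of this repo.

```python
# Sketch: masked-word prediction with a BabyBERTa checkpoint.
# Assumptions: the model ID below and the tokenizer flag may differ
# from the actual released checkpoint; verify on the Hugging Face Hub.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "phueb/BabyBERTa-1"  # hypothetical ID; confirm before use
tokenizer = AutoTokenizer.from_pretrained(model_id, add_prefix_space=True)
model = AutoModelForMaskedLM.from_pretrained(model_id)

fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Child-directed-style input with one masked token.
for pred in fill("the boy kicked the <mask> ."):
    print(pred["token_str"], round(pred["score"], 3))
```

Because the model is small, this runs comfortably on a single consumer GPU or even CPU-only.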