## BabyBERTa

BabyBERTa is a slightly modified, much smaller RoBERTa model trained on 5M words of American-English child-directed input. It is intended for language acquisition research and runs on a single desktop with a single GPU; no high-performance computing infrastructure is needed.
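As a quick illustration, a BabyBERTa checkpoint can be loaded like any RoBERTa model through the Hugging Face `transformers` library. This is a minimal sketch, not part of this README: the checkpoint id `phueb/BabyBERTa-1` and the `add_prefix_space=True` setting are assumptions drawn from the public Hugging Face hub, so substitute whichever checkpoint and settings apply to your setup.

```python
# Minimal sketch of loading and querying a BabyBERTa checkpoint.
# Assumption: the checkpoint id "phueb/BabyBERTa-1" and the
# add_prefix_space requirement come from the Hugging Face model card,
# not from this README; adjust to the checkpoint you actually use.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained("phueb/BabyBERTa-1", add_prefix_space=True)
model = AutoModelForMaskedLM.from_pretrained("phueb/BabyBERTa-1")

# Fill in a masked token in a child-directed-style sentence.
# RoBERTa-family tokenizers use "<mask>" as the mask token.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill("look at the <mask> ."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Because the model is small, this runs comfortably on CPU or a single consumer GPU, which matches the intended single-desktop use case.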