phueb committed
Commit f5e1df5
1 parent: 6fa1048

add more info

Files changed (1): README.md +5 -1
README.md CHANGED
@@ -1,4 +1,8 @@
 ## BabyBERTA
 
 BabyBERTA is a slightly-modified and much smaller RoBERTa model trained on 5M words of American-English child-directed input.
-It is intended for language acquisition research, on a single desktop with a single GPU - no high-performance computing infrastructure needed.
+It is intended for language acquisition research, on a single desktop with a single GPU - no high-performance computing infrastructure needed.
+
+This model was trained by [Philip Huebner](https://philhuebner.com), currently at the [UIUC Language and Learning Lab](http://www.learninglanguagelab.org).
+
+More info can be found [here](https://github.com/phueb/BabyBERTa).