Update README.md #2
by Loration - opened

README.md CHANGED

@@ -61,9 +61,9 @@ When using AlephBertGimmel, please reference:
 
 ```bibtex
 
-@misc{
+@misc{gueta2022large,
 title={Large Pre-Trained Models with Extra-Large Vocabularies: A Contrastive Analysis of Hebrew BERT Models and a New One to Outperform Them All},
-author={Eylon
+author={Eylon Gueta and Avi Shmidman and Shaltiel Shmidman and Cheyn Shmuel Shmidman and Joshua Guedalia and Moshe Koppel and Dan Bareket and Amit Seker and Reut Tsarfaty},
 year={2022},
 eprint={2211.15199},
 archivePrefix={arXiv},