Commit 49c9969 (parent: dada2f5): Update README.md

README.md CHANGED
average | ar | bg | de | el | en | es | fr | hi | ru | sw | th | tr | ur | vi | zh
---------|----------|---------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------|----------
0.808 | 0.802 | 0.829 | 0.825 | 0.826 | 0.883 | 0.845 | 0.834 | 0.771 | 0.813 | 0.748 | 0.793 | 0.807 | 0.740 | 0.795 | 0.8116

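As a sanity check, the reported average can be recomputed from the per-language accuracies in the table above. This snippet is a sketch, not part of the model card; the language-to-score mapping simply follows the column order of the table:

```python
# Recompute the reported XNLI average from the per-language accuracies
# in the table above (values copied verbatim from the table).
scores = {
    "ar": 0.802, "bg": 0.829, "de": 0.825, "el": 0.826, "en": 0.883,
    "es": 0.845, "fr": 0.834, "hi": 0.771, "ru": 0.813, "sw": 0.748,
    "th": 0.793, "tr": 0.807, "ur": 0.740, "vi": 0.795, "zh": 0.8116,
}
average = sum(scores.values()) / len(scores)
print(round(average, 3))  # matches the reported 0.808
```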
## Limitations and bias

Please consult the original DeBERTa-V3 paper and literature on different NLI datasets for potential biases.

## BibTeX entry and citation info

If you use this model, please cite: Laurer, Moritz, Wouter van Atteveldt, Andreu Salleras Casas, and Kasper Welbers. 2022. ‘Less Annotating, More Classifying – Addressing the Data Scarcity Issue of Supervised Machine Learning with Deep Transfer Learning and BERT - NLI’. Preprint, June. Open Science Framework. https://osf.io/74b8k.

## Ideas for cooperation or questions?

If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or [LinkedIn](https://www.linkedin.com/in/moritz-laurer/)

## Debugging and issues

Note that DeBERTa-v3 was released in late 2021 and older versions of HF Transformers seem to have issues running the model (e.g. resulting in an issue with the tokenizer). Using Transformers==4.13 or higher might solve some issues. Note also that mDeBERTa currently does not support FP16; see https://github.com/microsoft/DeBERTa/issues/77.
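The Transformers>=4.13 requirement above can be guarded in code before loading the model. This is a minimal sketch; the helper name and the naive dotted-version parsing are illustrative assumptions, not part of the model card (real code should prefer `packaging.version.parse`):

```python
# Minimal guard for the "Transformers==4.13 or higher" note above.
# NOTE: this naive parser assumes plain dotted versions like "4.13.0";
# prerelease tags such as "4.13.0.dev0" would need packaging.version instead.
def meets_min_version(installed: str, required: str = "4.13") -> bool:
    """Compare dotted version strings numerically, so '4.9' < '4.13'."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(required)

print(meets_min_version("4.9.2"))   # False: 9 < 13 numerically
print(meets_min_version("4.13.0"))  # True
```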