MoritzLaurer committed
Commit 1b7b1b2 (parent: 5cb76dc)

Update README.md

Files changed (1): README.md (+3 -2)
README.md CHANGED

@@ -109,6 +109,7 @@ The foundation model is [DeBERTa-v3-large from Microsoft](https://huggingface.co
 ```python
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 import torch
+device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
 
 model_name = "MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli"
 tokenizer = AutoTokenizer.from_pretrained(model_name)
@@ -157,8 +158,8 @@ The model achieves state-of-the-art performance on each dataset. Surprisingly, i
 ## Limitations and bias
 Please consult the original DeBERTa-v3 paper and literature on different NLI datasets for more information on the training data and potential biases. The model will reproduce statistical patterns in the training data.
 
-### BibTeX entry and citation info
-If you want to cite this model, please cite my [preprint on low-resource text classification](https://osf.io/74b8k/) and the original DeBERTa-v3 paper.
+## Citation
+If you use this model, please cite: Laurer, Moritz, Wouter van Atteveldt, Andreu Salleras Casas, and Kasper Welbers. 2022. ‘Less Annotating, More Classifying – Addressing the Data Scarcity Issue of Supervised Machine Learning with Deep Transfer Learning and BERT - NLI’. Preprint, June. Open Science Framework. https://osf.io/74b8k.
 
 ### Ideas for cooperation or questions?
 If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or [LinkedIn](https://www.linkedin.com/in/moritz-laurer/)
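For context, the line this commit adds to the README's usage snippet selects a GPU when one is available and falls back to CPU. A minimal sketch of how that pattern is typically used downstream is below; the small `nn.Linear` is a hypothetical stand-in for the actual DeBERTa classifier, which the README loads with `AutoModelForSequenceClassification.from_pretrained(model_name)`:

```python
import torch
import torch.nn as nn

# Same pattern the commit adds to the README: prefer CUDA, fall back to CPU.
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")

# Tiny stand-in module for illustration; in the README this would be the
# DeBERTa-v3-large NLI classifier loaded from the Hugging Face Hub.
model = nn.Linear(4, 3).to(device)

# Inputs must live on the same device as the model before the forward pass.
x = torch.randn(2, 4).to(device)
with torch.no_grad():
    logits = model(x)

# logits.device matches the selected device; shape is (batch, num_labels).
print(logits.device.type, tuple(logits.shape))
```

Moving both the model and the tokenized inputs with `.to(device)` is what makes the README snippet work unchanged on CPU-only and GPU machines.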