MoritzLaurer committed
Commit 729456b
1 Parent(s): 1b7b1b2

Update README.md

Files changed (1)
  1. README.md +12 -2
README.md CHANGED
@@ -104,8 +104,18 @@ This model was fine-tuned on the [MultiNLI](https://huggingface.co/datasets/mult
 
 The foundation model is [DeBERTa-v3-large from Microsoft](https://huggingface.co/microsoft/deberta-v3-large). DeBERTa-v3 combines several recent innovations compared to classical Masked Language Models like BERT, RoBERTa etc., see the [paper](https://arxiv.org/abs/2111.09543)
 
-## Intended uses & limitations
-#### How to use the model
+
+### How to use the model
+#### Simple zero-shot classification pipeline
+```python
+from transformers import pipeline
+classifier = pipeline("zero-shot-classification", model="MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli")
+sequence_to_classify = "Angela Merkel is a politician in Germany and leader of the CDU"
+candidate_labels = ["politics", "economy", "entertainment", "environment"]
+output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
+print(output)
+```
+#### NLI use-case
 ```python
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 import torch
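
The hunk ends just after the NLI imports, before that snippet closes. As a rough sketch (not part of this commit), NLI inference with this checkpoint typically continues along the lines below; the premise/hypothesis strings are illustrative, and the entailment/neutral/contradiction label order assumes the usual MNLI-style head (check `model.config.id2label` to confirm):

```python
# Sketch of the truncated "NLI use-case" snippet; assumed continuation, not from this diff
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_name = "MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

premise = "Angela Merkel is a politician in Germany and leader of the CDU"
hypothesis = "This example is about politics"

# Encode the premise/hypothesis pair and run a forward pass without gradients
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Softmax over the three NLI classes; order assumed to be entailment/neutral/contradiction
probs = torch.softmax(logits[0], dim=-1).tolist()
label_names = ["entailment", "neutral", "contradiction"]
print({name: round(prob, 3) for name, prob in zip(label_names, probs)})
```

For the zero-shot pipeline added above, `multi_label=False` normalizes the scores across the candidate labels with a softmax, so the returned `labels` list is sorted by descending `scores` that sum to 1.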