MoritzLaurer (HF staff) committed
Commit 574dbac
1 Parent(s): 297ea39

update readme with easier zeroshot code

Files changed (1): README.md +12 -2
README.md CHANGED
@@ -38,8 +38,18 @@ This multilingual model can perform natural language inference (NLI) on 100 lang
  As of December 2021, mDeBERTa-base is the best performing multilingual base-sized transformer model, introduced by Microsoft in [this paper](https://arxiv.org/pdf/2111.09543.pdf).
 
 
- ## Intended uses & limitations
- #### How to use the model
+ ### How to use the model
+ #### Simple zero-shot classification pipeline
+ ```python
+ from transformers import pipeline
+ classifier = pipeline("zero-shot-classification", model="MoritzLaurer/mDeBERTa-v3-base-mnli-xnli")
+
+ sequence_to_classify = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
+ candidate_labels = ["politics", "economy", "entertainment", "environment"]
+ output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
+ print(output)
+ ```
+ #### NLI use-case
  ```python
  from transformers import AutoTokenizer, AutoModelForSequenceClassification
  import torch
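# Editor's note: the hunk's trailing context stops at the imports, so the code under
# the new "#### NLI use-case" heading is not visible in this commit view. The sketch
# below is a minimal, hypothetical continuation of that snippet, not part of the
# commit itself: the premise/hypothesis strings and the label order are assumptions
# (verify the order with model.config.id2label). For the pipeline call added above,
# multi_label=False simply makes the candidate-label scores sum to 1.
from transformers import AutoTokenizer, AutoModelForSequenceClassification  # repeated from the context lines for self-containment
import torch

model_name = "MoritzLaurer/mDeBERTa-v3-base-mnli-xnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Illustrative premise/hypothesis pair (not taken from the commit)
premise = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
hypothesis = "Angela Merkel is a politician in Germany"

# Encode the premise-hypothesis pair and run a single forward pass
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# An MNLI-style head emits three logits; softmax turns them into probabilities
probs = torch.softmax(logits[0], dim=-1)
label_names = ["entailment", "neutral", "contradiction"]  # assumed order
print({name: round(float(p) * 100, 1) for name, p in zip(label_names, probs)})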