AntoineBlanot committed on
Commit e7c8e32
1 Parent(s): d2fe8b6

Update README.md

Files changed (1)
  1. README.md +17 -2
README.md CHANGED
@@ -22,8 +22,23 @@ By removing the decoder we can *half the original number of parameters* (thus ha

 ## Table of Contents

-0. [Why use T5ForSequenceClassification?](#why-use-t5forsequenceclassification)
-1. [T5ForClassification vs T5](#t5forclassification-vs-t5)
+0. [Usage](#usage)
+1. [Why use T5ForSequenceClassification?](#why-use-t5forsequenceclassification)
+2. [T5ForClassification vs T5](#t5forclassification-vs-t5)
+
+## Usage
+**T5ForSequenceClassification** supports the task of zero-shot classification.
+It can directly be used for:
+- topic classification
+- intent recognition
+- boolean question answering
+- sentiment analysis
+- any other task whose goal is to classify a text
+
+Since the *T5ForClassification* class is currently not supported by the transformers library, you cannot directly use this model on the Hub.
+To use **T5ForSequenceClassification**, you will have to install additional packages and model weights.
+You can find instructions [here](https://github.com/AntoineBlanot/zero-nlp).
+

 ## Why use T5ForSequenceClassification?
 Models based on the [BERT](https://huggingface.co/bert-large-uncased) architecture like [RoBERTa](https://huggingface.co/roberta-large) and [DeBERTa](https://huggingface.co/microsoft/deberta-v2-xxlarge) have shown very strong performance on sequence classification tasks and are still widely used today.
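The zero-shot classification workflow the Usage section describes can be sketched as follows. This is a minimal, self-contained illustration of the general recipe (turn each candidate label into a hypothesis and pick the label the model entails most strongly); the `entailment_score` function here is a hypothetical word-overlap stand-in so the sketch runs on its own, and in a real setup it would be replaced by a forward pass of the actual model (e.g. the T5ForSequenceClassification weights from the linked repository).

```python
# Sketch of zero-shot classification via entailment scoring.
# NOTE: entailment_score is a hypothetical stand-in, not the real model.

def entailment_score(text: str, hypothesis: str) -> float:
    # Stand-in for a model forward pass: score by the fraction of
    # hypothesis words that also appear in the text. A real system
    # would return the model's entailment probability instead.
    text_words = set(text.lower().split())
    hyp_words = set(hypothesis.lower().split())
    return len(text_words & hyp_words) / max(len(hyp_words), 1)

def zero_shot_classify(text: str, labels: list[str]) -> str:
    # Each candidate label becomes a hypothesis; the label whose
    # hypothesis is most strongly entailed by the text wins.
    hypotheses = {label: f"this example is about {label}" for label in labels}
    return max(labels, key=lambda label: entailment_score(text, hypotheses[label]))

print(zero_shot_classify(
    "This article covers sports and the final match",
    ["sports", "politics", "cooking"],
))
```

The same hypothesis-template trick underlies topic classification, intent recognition, and the other tasks listed above: only the wording of the hypothesis changes.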