---
license: mit
language:
- en
base_model:
- FacebookAI/roberta-base
---
# Ambiguity-aware RoBERTa

This model was trained on the SemEval-2007 Task 14 Affective Text dataset and represents the ambiguity that arises in emotion analysis as a probability distribution (i.e., the softmax output). It was introduced in the paper ["Deep Model Compression Also Helps Models Capture Ambiguity"](https://aclanthology.org/2023.acl-long.381.pdf) (ACL 2023).

# Usage

```python
from transformers import RobertaTokenizer, RobertaForSequenceClassification

# Load the tokenizer and fine-tuned classification model from the Hub.
tokenizer = RobertaTokenizer.from_pretrained('hancheolp/ambiguity-aware-roberta-emotion')
model = RobertaForSequenceClassification.from_pretrained('hancheolp/ambiguity-aware-roberta-emotion')

news_headline = "Amateur rocket scientists reach for space."
encoded_input = tokenizer(news_headline, return_tensors='pt')
output = model(**encoded_input)

# Applying softmax to the logits yields the emotion distribution.
distribution = output.logits.softmax(dim=-1)
print(distribution)
```

Each index of the output vector represents the following emotion:
* 0: anger
* 1: disgust
* 2: fear
* 3: joy
* 4: sadness
* 5: surprise
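
Since the model outputs a distribution rather than a single label, it can be convenient to pair each probability with its emotion name. The sketch below assumes the index order listed above; `rank_emotions` is an illustrative helper, not part of the model's API.

```python
# Label order matching the index list above.
LABELS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

def rank_emotions(distribution):
    """Return (label, probability) pairs sorted from most to least likely.

    `distribution` is a flat sequence of six probabilities, e.g.
    `distribution[0].tolist()` from the usage snippet above.
    """
    return sorted(zip(LABELS, distribution), key=lambda p: p[1], reverse=True)

# Example with a made-up distribution:
for label, prob in rank_emotions([0.05, 0.02, 0.08, 0.60, 0.15, 0.10]):
    print(f"{label}: {prob:.2f}")
```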