---
language: en
tags:
- text-classification
- pytorch
- tensorflow
datasets:
- go_emotions
license: mit
---

# distilbert-base-uncased-go-emotions-student

## Model Description

This model was distilled from the zero-shot classification pipeline on the unlabeled GoEmotions dataset using [this
script](https://github.com/huggingface/transformers/tree/master/examples/research_projects/zero-shot-distillation).
It was produced by the demo notebook
[here](https://colab.research.google.com/drive/1mjBjd0cR8G57ZpsnFCS3ngGyo5nCa9ya?usp=sharing), where more details
about the model can be found.

- Teacher model: [roberta-large-mnli](https://huggingface.co/roberta-large-mnli)
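
As a rough illustration of the distillation setup (not the exact linked script), the teacher scores each unlabeled example against the candidate emotion labels with the zero-shot classification pipeline, and the top-scoring label becomes the training target for the student. The candidate labels below are a hypothetical subset of the 28 GoEmotions labels:

```python
from transformers import pipeline

# Sketch of the pseudo-labeling step: the NLI teacher ranks candidate labels
# for an unlabeled example; the top-ranked label is kept as the pseudo-label.
teacher = pipeline("zero-shot-classification", model="roberta-large-mnli")

candidate_labels = ["admiration", "gratitude", "anger", "neutral"]  # subset for illustration
result = teacher("Thanks so much for helping me out!", candidate_labels)

pseudo_label = result["labels"][0]  # labels are sorted by score, highest first
print(pseudo_label)
```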

## Intended Usage

The model can be used like any other model trained on GoEmotions, but will likely not perform as well as a model
trained with full supervision. It is primarily intended as a demo of how an expensive NLI-based zero-shot model
can be distilled to a more efficient student. Note that although the GoEmotions dataset allows multiple labels
per instance, the teacher used single-label classification to create pseudo-labels.
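
For example, the model can be loaded with the `transformers` text-classification pipeline. This is a minimal sketch; the repo id shown is an assumption and should be replaced with the full `<namespace>/<model-name>` path of this checkpoint on the Hub.

```python
from transformers import pipeline

# Load the distilled student for single-label emotion classification.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-go-emotions-student",  # assumed repo id; adjust as needed
)

print(classifier("Thanks so much, this made my day!"))
# -> a list containing the highest-scoring GoEmotions label and its score
```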