---
license: mit
language:
- en
---

This BERT model was pretrained on the news corpus (https://huggingface.co/datasets/yyu/wiki_corpus). It is used in the paper "ReGen: Zero-Shot Text Classification via Training Data Generation with Progressive Dense Retrieval".

See the GitHub repository (https://github.com/yueyu1030/ReGen) and the paper (https://arxiv.org/abs/2305.10703) for details.