Update README.md
README.md CHANGED
@@ -1,3 +1,6 @@
+---
+license: mit
+---
 # <a name="introduction"></a> BERTweet: A pre-trained language model for English Tweets
 
 BERTweet is the first public large-scale language model pre-trained for English Tweets. BERTweet is trained based on the [RoBERTa](https://github.com/pytorch/fairseq/blob/master/examples/roberta/README.md) pre-training procedure. The corpus used to pre-train BERTweet consists of 850M English Tweets (16B word tokens ~ 80GB), containing 845M Tweets streamed from 01/2012 to 08/2019 and 5M Tweets related to the **COVID-19** pandemic. The general architecture and experimental results of BERTweet can be found in our [paper](https://aclanthology.org/2020.emnlp-demos.2/):
@@ -12,5 +15,4 @@ BERTweet is the first public large-scale language model pre-trained for English
 
 **Please CITE** our paper when BERTweet is used to help produce published results or is incorporated into other software.
 
-For further information or requests, please go to [BERTweet's homepage](https://github.com/VinAIResearch/BERTweet)!
-
+For further information or requests, please go to [BERTweet's homepage](https://github.com/VinAIResearch/BERTweet)!
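
Since this commit adds Hugging Face model-card metadata (`license: mit`) to the README, the model described above is presumably meant to be loaded through the `transformers` library. The snippet below is a minimal sketch of extracting Tweet features with BERTweet; the checkpoint name `vinai/bertweet-base` and the `normalization=True` tokenizer flag are assumptions based on common usage, not taken from this diff, so verify them against BERTweet's homepage.

```python
# Minimal sketch: loading BERTweet through the Hugging Face transformers library.
# The checkpoint id "vinai/bertweet-base" and the normalization flag are assumed
# here, not stated in this diff; check BERTweet's homepage for the exact names.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base", normalization=True)
model = AutoModel.from_pretrained("vinai/bertweet-base")

# Encode one Tweet and extract its contextual features.
tweet = "BERTweet is pre-trained on 850M English Tweets"
inputs = tokenizer(tweet, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```

As with any RoBERTa-style encoder, `last_hidden_state` provides one contextual vector per (sub)token, which can then be pooled or fed to a task-specific head.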