Cardiff NLP committed · Commit 1aae9ca · 1 Parent(s): 55daec6
Update README.md
README.md CHANGED
@@ -1,11 +1,11 @@
 # Twitter June 2021 (RoBERTa-base, 115M)
 
 This is a RoBERTa-base model trained on 115.46M tweets until the end of June 2021.
-More details and performance scores are available in the [TimeLMs paper](https://arxiv.org/
+More details and performance scores are available in the [TimeLMs paper](https://arxiv.org/abs/2202.03829).
 
 Below, we provide some usage examples using the standard Transformers interface. For another interface more suited to comparing predictions and perplexity scores between models trained at different temporal intervals, check the [TimeLMs repository](https://github.com/cardiffnlp/timelms).
 
-For other models trained until different periods, check [
+For other models trained until different periods, check this [table](https://github.com/cardiffnlp/timelms#released-models).
 
 ## Preprocess Text
 Replace usernames and links for placeholders: "@user" and "http".
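
The "Preprocess Text" step referenced in the diff maps user mentions and URLs to the "@user" and "http" placeholders before tokenization, so that input text matches what the model saw during training. A minimal sketch of such a replacement, assuming whitespace-split tokens; the helper name `preprocess` is illustrative and not taken from this commit:

```python
def preprocess(text):
    # Replace user mentions and links with the placeholders used at training time.
    new_text = []
    for t in text.split(" "):
        t = "@user" if t.startswith("@") and len(t) > 1 else t
        t = "http" if t.startswith("http") else t
        new_text.append(t)
    return " ".join(new_text)


# Example: mentions and URLs are masked, the rest of the tweet is unchanged.
print(preprocess("Looking forward to watching @WWDC live https://apple.com"))
# -> "Looking forward to watching @user live http"
```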