Cardiff NLP committed • Commit b28ca26 • 1 Parent(s): df7d44c
Update README.md
# Twitter 2021 90M (RoBERTa-base)

This is a RoBERTa-base model trained on 90M tweets until the end of 2019.
More details and performance scores are available in the [TimeLMs paper](https://arxiv.org/abs/2202.03829).

Below, we provide some usage examples using the standard Transformers interface. For another interface more suited to comparing predictions and perplexity scores between models trained at different temporal intervals, check the [TimeLMs repository](https://github.com/cardiffnlp/timelms).

For other models trained until different periods, check this [table](https://github.com/cardiffnlp/timelms#released-models).

## Preprocess Text
Replace usernames and links with the placeholders "@user" and "http".
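
A minimal sketch of this preprocessing step, assuming the simple token-level replacement used across the TimeLMs/Cardiff NLP model cards; the exact helper below is an illustration, not taken verbatim from this README excerpt:

```python
def preprocess(text):
    # Replace user mentions and links with the generic placeholders
    # described above ("@user" and "http").
    new_text = []
    for t in text.split(" "):
        t = "@user" if t.startswith("@") and len(t) > 1 else t
        t = "http" if t.startswith("http") else t
        new_text.append(t)
    return " ".join(new_text)

print(preprocess("Great thread by @cardiffnlp on temporal LMs https://github.com/cardiffnlp/timelms"))
# -> "Great thread by @user on temporal LMs http"
```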