Pedrada committed
Commit eb0257d
1 Parent(s): fed478f

Add base model link

Files changed (1): README.md (+2 -1)

README.md CHANGED
@@ -10,7 +10,8 @@ datasets:
 
 # Twitter 2022 154M (RoBERTa-large, 154M - full update)
 
-This is a RoBERTa-large model trained on 154M tweets until the end of December 2022 (from original checkpoint, no incremental updates).
+This is a RoBERTa-large model trained on 154M tweets until the end of December 2022 (from original checkpoint, no incremental updates).
+A base model trained on the same data is available [here](https://huggingface.co/cardiffnlp/twitter-roberta-base-2022-154m).
 
 These 154M tweets result from filtering 220M tweets obtained exclusively from the Twitter Academic API, covering every month between 2018-01 and 2022-12.
 Filtering and preprocessing details are available in the [TimeLMs paper](https://arxiv.org/abs/2202.03829).