---
license: mit
language:
- en
---
A DistilBERT model fine-tuned on 117,567 English-language tweets from a range of political institutions (EU_Commission, UN, OECD, IMF, ECB, Council of the European Union, UK government, Scottish government).
Fine-tuned with learning rate = 2e-5, batch size = 16, chunk_size = 50 tokens, and up to 100 epochs (early stopping with patience = 3 epochs; 3 warmup epochs).
More details can be found at: https://github.com/rbroc/eucomm-twitter.
No evaluation was performed: fine-tuning served only to provide checkpoints for a contextualized topic model.
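
The chunk_size = 50 setting suggests the tokenized tweets were grouped into fixed-length 50-token sequences before masked-language-model fine-tuning. A minimal sketch of that grouping step (an illustration, not the repository's actual preprocessing code; `group_into_chunks` is a hypothetical helper name):

```python
def group_into_chunks(token_ids, chunk_size=50):
    """Split a flat list of token IDs into fixed-length chunks,
    dropping any trailing remainder shorter than chunk_size
    (the usual grouping strategy for MLM fine-tuning)."""
    total = (len(token_ids) // chunk_size) * chunk_size
    return [token_ids[i:i + chunk_size] for i in range(0, total, chunk_size)]

# Example: 120 token IDs yield two 50-token chunks; the last 20 are dropped.
chunks = group_into_chunks(list(range(120)))
```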