LegalLMs collection: XLM-RoBERTa models with continued pretraining on the MultiLegalPile (37 items).
This model is based on XLM-RoBERTa, with continued pretraining on the MultiLegalPile. It achieves the following results on the evaluation set:
More information needed
The following results were recorded during training:
| Training Loss | Epoch | Step   | Validation Loss |
|---------------|-------|--------|-----------------|
| 0.9507        | 9.02  | 50000  | 0.3073          |
| 0.8777        | 19.02 | 100000 | 0.2592          |
| 0.6977        | 29.01 | 150000 | 0.2398          |
| 0.6732        | 39.01 | 200000 | 0.2365          |
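The table above can be inspected programmatically, for example to quantify how much the validation loss improved over continued pretraining. A minimal sketch (checkpoint values copied from the table; no model download required):

```python
# Checkpoints from the training results table: (epoch, step, validation loss).
checkpoints = [
    (9.02, 50000, 0.3073),
    (19.02, 100000, 0.2592),
    (29.01, 150000, 0.2398),
    (39.01, 200000, 0.2365),
]

first_loss = checkpoints[0][2]
last_loss = checkpoints[-1][2]

# Relative reduction in validation loss between the first and last checkpoint.
reduction = (first_loss - last_loss) / first_loss
print(f"Validation loss fell {reduction:.1%} between step 50k and step 200k")
```

Most of the improvement happens in the first half of training; the loss drops only slightly between steps 100k and 200k, suggesting the model is close to convergence at the reported checkpoint.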