XLM-RoBERTa models with continued pretraining on the MultiLegalPile
Joel Niklaus
joelniklaus
AI & ML interests
Pretraining, Instruction Tuning, Domain Adaptation, Benchmarks, Legal Judgment Prediction
Models (40; a loading example follows the list)

joelniklaus/legal-swiss-longformer-base (Fill-Mask)
joelniklaus/legal-swiss-roberta-base (Fill-Mask)
joelniklaus/legal-swiss-roberta-large (Fill-Mask)
joelniklaus/legal-croatian-roberta-base (Fill-Mask)
joelniklaus/legal-english-longformer-base
joelniklaus/legal-english-roberta-large (Fill-Mask)
joelniklaus/legal-english-roberta-base (Fill-Mask)
joelniklaus/legal-xlm-roberta-base (Fill-Mask)
joelniklaus/legal-xlm-roberta-large (Fill-Mask)
joelniklaus/legal-portuguese-roberta-base (Fill-Mask)
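
The checkpoints above are masked-language models, so they can be queried through the standard transformers fill-mask pipeline. The following is a minimal sketch, assuming a recent transformers installation with a PyTorch backend; the model ID is taken from the list above and the example sentence is purely illustrative.

```python
# Minimal sketch: querying one of the legal fill-mask checkpoints listed above.
# Assumes `transformers` and a backend such as PyTorch are installed.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="joelniklaus/legal-xlm-roberta-base")

# XLM-RoBERTa-style tokenizers use "<mask>" as the mask token.
predictions = fill_mask("The court dismissed the <mask>.")
for p in predictions:
    print(f'{p["token_str"]!r}: {p["score"]:.3f}')
```

Any of the other model IDs in the list can be substituted for the one shown here; the Longformer variants follow the same interface.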
Datasets (26; a streaming example follows the list)

joelniklaus/Multi_Legal_Pile_Commercial
joelniklaus/Multi_Legal_Pile
joelniklaus/legalnero
joelniklaus/greek_legal_ner
joelniklaus/legal-mc4
joelniklaus/MultiLegalPile_Chunks_4000
joelniklaus/eurlex_resources
joelniklaus/lextreme
joelniklaus/MultiLegalPileWikipediaFiltered
joelniklaus/EU_Wikipedias
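
The pretraining corpora are published as Hugging Face datasets, so they can be streamed with the datasets library rather than downloaded in full. This is a minimal sketch; the dataset ID comes from the list above, while the configuration name is an assumption and should be checked against the dataset card.

```python
# Minimal sketch: streaming a few examples from one of the corpora listed above.
# The config name "en_legislation" is an assumption -- verify the available
# language/domain configurations on the dataset card before use.
from itertools import islice
from datasets import load_dataset

stream = load_dataset(
    "joelniklaus/Multi_Legal_Pile",
    "en_legislation",   # assumed config name; check the dataset card
    split="train",
    streaming=True,     # stream examples instead of downloading the corpus
)

for example in islice(stream, 3):
    print(example)
```

Streaming keeps memory and disk usage low, which matters for corpora of this size; drop `streaming=True` only if you intend to cache the full split locally.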