XLM-RoBERTa models with continued pretraining on the MultiLegalPile
Joel Niklaus (joelniklaus)
AI & ML interests: Pretraining, Instruction Tuning, Domain Adaptation, Benchmarks, Legal Datasets
Models (40 total; selection shown):
- joelniklaus/legal-swiss-longformer-base (Fill-Mask)
- joelniklaus/legal-swiss-roberta-base (Fill-Mask)
- joelniklaus/legal-swiss-roberta-large (Fill-Mask)
- joelniklaus/legal-croatian-roberta-base (Fill-Mask)
- joelniklaus/legal-english-longformer-base
- joelniklaus/legal-english-roberta-large (Fill-Mask)
- joelniklaus/legal-english-roberta-base (Fill-Mask)
- joelniklaus/legal-xlm-roberta-base (Fill-Mask)
- joelniklaus/legal-xlm-roberta-large (Fill-Mask)
- joelniklaus/legal-portuguese-roberta-base (Fill-Mask)
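The checkpoints above are published as standard Fill-Mask models, so they can be loaded with the Hugging Face transformers library. A minimal sketch, assuming transformers is installed; the model id is taken from the list above, the example prompt is invented, and the pipeline call is guarded because it downloads the weights on first use:

```python
from transformers import pipeline

# Any Fill-Mask checkpoint from the list works here; legal-xlm-roberta-base
# is the multilingual base model from the profile.
MODEL_ID = "joelniklaus/legal-xlm-roberta-base"

# RoBERTa-style tokenizers use "<mask>" as the mask token.
PROMPT = "The court dismissed the <mask>."  # hypothetical example sentence

def top_predictions(model_id: str = MODEL_ID, prompt: str = PROMPT, k: int = 5):
    """Return the top-k fill-mask predictions (downloads the model on first use)."""
    fill = pipeline("fill-mask", model=model_id, top_k=k)
    return [pred["token_str"] for pred in fill(prompt)]

if __name__ == "__main__":
    print(top_predictions())
```

The same pattern applies to the monolingual checkpoints; only MODEL_ID changes.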
Datasets (26 total; selection shown):
- joelniklaus/Multi_Legal_Pile
- joelniklaus/Multi_Legal_Pile_Commercial
- joelniklaus/legalnero
- joelniklaus/greek_legal_ner
- joelniklaus/legal-mc4
- joelniklaus/MultiLegalPile_Chunks_4000
- joelniklaus/eurlex_resources
- joelniklaus/lextreme
- joelniklaus/MultiLegalPileWikipediaFiltered
- joelniklaus/EU_Wikipedias