LiLT + XLM-RoBERTa-base

This model combines the Language-Independent Layout Transformer (LiLT) with XLM-RoBERTa-base, a multilingual RoBERTa model pre-trained on 100 languages.

This way, we have a LayoutLM-like model for 100 languages :)
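As a minimal sketch of how inputs are typically prepared for a LiLT-style model (not code from this card): like LayoutLM, LiLT expects each word's bounding box normalized to a 0–1000 coordinate grid, independent of the page's pixel dimensions. The helper name and example page size below are illustrative assumptions.

```python
# Sketch: normalizing word bounding boxes to the 0-1000 grid that
# LiLT/LayoutLM-style models expect (helper name is hypothetical).

def normalize_bbox(bbox, width, height):
    """Scale an (x0, y0, x1, y1) pixel box to a 0-1000 coordinate grid."""
    x0, y0, x1, y1 = bbox
    return [
        int(1000 * x0 / width),
        int(1000 * y0 / height),
        int(1000 * x1 / width),
        int(1000 * y1 / height),
    ]

# Example: a word box on a 1275x1650-pixel page scan (assumed sizes).
print(normalize_bbox((127, 330, 255, 363), 1275, 1650))  # → [99, 200, 200, 220]
```

Boxes normalized this way are passed as the `bbox` argument alongside the tokenized text when calling the model, e.g. after loading it with `AutoModel.from_pretrained("nielsr/lilt-xlm-roberta-base")` from the `transformers` library.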

