Model
This is an mMiniLM-L12xH384 XLM-R model, proposed in MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers, that we fine-tune on the direct assessment annotations collected at the Workshop on Statistical Machine Translation (WMT) from 2015 to 2020.
This model is much more lightweight than the traditional XLM-RoBERTa base and large models.
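As a sketch of how a checkpoint of this kind is typically used for scoring, the example below assumes the model follows the COMET format used for other WMT direct-assessment models and is loaded through the `unbabel-comet` library; the repo id shown is a placeholder, not this model's actual identifier.

```python
# Minimal scoring sketch, assuming this checkpoint is a COMET-style
# MT evaluation model loadable with the unbabel-comet library
# (pip install unbabel-comet). The repo id below is a PLACEHOLDER.
from comet import download_model, load_from_checkpoint

model_path = download_model("Unbabel/<this-model-id>")  # placeholder id
model = load_from_checkpoint(model_path)

# COMET expects a list of dicts with the source segment, the MT
# hypothesis, and (for reference-based models) a human reference.
data = [
    {
        "src": "Dem Feuer konnte Einhalt geboten werden",
        "mt": "The fire could be stopped",
        "ref": "They were able to control the fire.",
    }
]

# Returns segment-level scores plus a corpus-level system score.
output = model.predict(data, batch_size=8, gpus=0)
print(output.scores)        # per-segment quality scores
print(output.system_score)  # average over segments
```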