This is a Hugging Face transformers-compatible conversion of the original dense 355M-parameter model from the paper "Efficient Large Scale Language Modeling with Mixtures of Experts" by Artetxe et al. Please refer to the original model card at https://github.com/facebookresearch/fairseq/blob/main/examples/moe_lm/model_card.md.
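
Since the checkpoint is transformers-compatible, it can be loaded with the standard auto classes. Below is a minimal sketch; the repo id `your-namespace/fairseq-dense-355M` is a placeholder, not the actual Hub id of this model, so substitute the id shown on this page.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual Hub id of this conversion.
model_id = "your-namespace/fairseq-dense-355M"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("Efficient large-scale language modeling requires", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```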

Checkpoint details (safetensors): 405M parameters, F16 tensors.