Converted from https://huggingface.co/facebook/nllb-200-distilled-1.3B to the CTranslate2 format with int8 quantization:

```sh
ct2-transformers-converter --model facebook/nllb-200-distilled-1.3B --quantization int8 --output_dir converted/nllb-200-distilled-1.3B-ct2-int8
```
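
A minimal usage sketch with the CTranslate2 Python API, loading the directory produced by the command above. The language codes and input text are only examples; adjust them to your needs.

```python
import ctranslate2
import transformers

# Directory created by ct2-transformers-converter above
model_dir = "converted/nllb-200-distilled-1.3B-ct2-int8"

src_lang = "eng_Latn"  # NLLB-200 language codes (examples)
tgt_lang = "deu_Latn"

translator = ctranslate2.Translator(model_dir, device="cpu")
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-1.3B", src_lang=src_lang
)

# Tokenize the source sentence into SentencePiece tokens
source = tokenizer.convert_ids_to_tokens(tokenizer.encode("Hello, world!"))

# The target language code is passed as a decoding prefix
results = translator.translate_batch([source], target_prefix=[[tgt_lang]])
target = results[0].hypotheses[0][1:]  # drop the language-code token

print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target)))
```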