# This repo shows how to convert a fairseq NLLB-MoE model to transformers and run a forward pass

As the `fairseq` repository is not really optimised for out-of-the-box inference, make sure you have a very large amount of CPU/GPU RAM.
Around 600 GB of memory is required to run inference with the `fairseq` model: you need to load the checkpoints (\~300 GB), then build the model (\~300 GB again), and only then load the checkpoints into the model.
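
For comparison, once the checkpoint has been converted, `transformers` can shard the weights across the available devices instead of materialising everything at once. Below is a minimal sketch of a forward pass with a converted model; the Hub id `facebook/nllb-moe-54b` and the generation settings are illustrative, so point `from_pretrained` at your own converted weights if they live elsewhere:

```python
# Minimal sketch: generation with a converted NLLB-MoE checkpoint in
# `transformers`. The Hub id is an assumption, not part of the original
# instructions; substitute the path to your converted weights.
from transformers import AutoTokenizer, NllbMoeForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("facebook/nllb-moe-54b")
# `device_map="auto"` (requires `accelerate`) spreads the weights over the
# available GPUs/CPU rather than building the full model on one device.
model = NllbMoeForConditionalGeneration.from_pretrained(
    "facebook/nllb-moe-54b", device_map="auto", torch_dtype="auto"
)

inputs = tokenizer("Life is like a box of chocolates.", return_tensors="pt")
generated = model.generate(
    **inputs,
    # Force the decoder to start in the target language (French here).
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```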

## 0. Download the original checkpoints:
The checkpoints in this repository were obtained using the following command (based on the instructions given in the fairseq repository):