Update README.md
README.md (after the change):
# This repo shows how to convert a fairseq NLLB-MoE model to transformers and run a forward pass

As the `fairseq` repository is not really optimised for out-of-the-box inference, make sure you have a very large amount of CPU/GPU RAM.
Around 600 GB are required to run inference with the `fairseq` model, as you need to load the checkpoints (\~300 GB), then build the model (\~300 GB again), and only then can you load the checkpoints into the model.
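To make the memory arithmetic above concrete, here is a minimal, hypothetical PyTorch sketch of that load pattern. The checkpoint file name, the `build_model` helper, and the tiny `nn.Linear` are placeholders standing in for the real fairseq constructor and the sharded NLLB-MoE checkpoints; only the peak-memory pattern (tensors loaded, a second copy built, then merged) reflects the point made above.

```python
import torch
import torch.nn as nn

# Placeholder model: a tiny nn.Linear stands in for the 54B-parameter
# NLLB-MoE model so this sketch actually runs.
def build_model() -> nn.Module:
    return nn.Linear(1024, 1024)

# Write a dummy checkpoint so the load step below has something to read.
torch.save(build_model().state_dict(), "shard-0.pt")
checkpoint_paths = ["shard-0.pt"]  # the real model is sharded across many files

# Step 1: read the checkpoints into CPU RAM
# (for the real model, this is already ~300 GB of tensors).
state_dicts = [torch.load(p, map_location="cpu") for p in checkpoint_paths]

# Step 2: building the model allocates a second full copy of every
# parameter (~300 GB again), so ~600 GB is resident at this point.
model = build_model()

# Step 3: only now can the loaded tensors be copied into the model;
# freeing the checkpoint dicts drops usage back down afterwards.
for sd in state_dicts:
    model.load_state_dict(sd)
del state_dicts
```

Nothing in the sketch is specific to fairseq; it is just the generic `torch.load` → build → `load_state_dict` sequence that makes peak usage roughly double the model size.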
## 0. Download the original checkpoints:
The checkpoints in this repository were obtained using the following command (based on the instructions given in the fairseq repository):