jwu323 committed
Commit d55c2ba · Parent: 48a09bd

Update README.md
Files changed (1): README.md +2 -0
README.md CHANGED
@@ -1,3 +1,5 @@
  This contains the original weights for the LLaMA-7b model.
  This model is under a non-commercial license (see the LICENSE file).
  You should only use this repository if you have been granted access to the model by filling out [this form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform) but either lost your copy of the weights or ran into trouble converting them to the Transformers format.
+
+ [According to this comment](https://github.com/huggingface/transformers/issues/21681#issuecomment-1436552397), the dtype of a model in PyTorch is always float32, regardless of the dtype of the checkpoint you saved. If you load a float16 checkpoint into a model you create (which is float32 by default), the dtype that is kept at the end is the dtype of the model, not the dtype of the checkpoint.
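
A minimal PyTorch sketch of this behavior (illustrative only, not part of the README; the toy `nn.Linear` module and the filename `checkpoint_fp16.pt` are made up for the example):

```python
import torch
import torch.nn as nn

# A freshly created module holds float32 parameters by default.
model = nn.Linear(4, 4)
assert model.weight.dtype == torch.float32

# Simulate a checkpoint that was saved in float16.
fp16_state = {k: v.half() for k, v in model.state_dict().items()}
torch.save(fp16_state, "checkpoint_fp16.pt")

# load_state_dict() copies the checkpoint tensors into the existing
# parameters, casting them to the parameters' dtype, so the model
# stays float32 even though the checkpoint was saved in float16.
model.load_state_dict(torch.load("checkpoint_fp16.pt"))
assert model.weight.dtype == torch.float32

# To actually run in half precision, convert the model explicitly.
model.half()
assert model.weight.dtype == torch.float16
```

In Transformers, the analogous step is passing `torch_dtype=torch.float16` to `from_pretrained`, which builds the model in the checkpoint's precision instead of upcasting the weights to float32.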