dfurman committed on
Commit 271b00c
1 Parent(s): abdcfef

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -61,7 +61,7 @@ LLaMA is a foundational model, and as such, it should not be used for downstream
 ```
 ### GPU Inference in fp16
 
-This requires a GPU with at least xxGB of VRAM.
+This requires a GPU with at least 26GB of VRAM.
 
 ### First, Load the Model
 
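For context on the corrected figure: fp16 stores each parameter in 2 bytes, so a 13B-parameter LLaMA-class model needs roughly 13e9 × 2 bytes = 26GB just for the weights. A minimal sketch of that arithmetic (the helper name `fp16_weight_gb` is illustrative, not part of the repo):

```python
def fp16_weight_gb(n_params: int) -> float:
    """Estimate memory (in decimal GB) for model weights stored in fp16.

    fp16 uses 2 bytes per parameter; activations and the KV cache
    consume additional VRAM on top of this, so treat the result
    as a lower bound.
    """
    return n_params * 2 / 1e9

# A 13B-parameter model: 13e9 params * 2 bytes = 26 GB of weights,
# matching the "at least 26GB of VRAM" figure in the diff.
print(fp16_weight_gb(13_000_000_000))  # 26.0
```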