VictorSanh committed

Commit baf1ac8
1 Parent(s): b7e1543

awq versions

Files changed (1): README.md (+2 −2)
README.md CHANGED

````diff
@@ -237,7 +237,7 @@ Flash attention 2 support is available both for `idefics2-8b-base` and `idefics2
 
 </details>
 
-<!-- **4 bit quantization and module fusing**
+**4 bit quantization and module fusing**
 
 <details><summary>Click to expand.</summary>
 
@@ -266,7 +266,7 @@ model = AutoModelForVision2Seq.from_pretrained(
 
 ).to(DEVICE)
 ```
 
-</details> -->
+</details>
 
 # Bias, Risks, and Limitations
````
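The README section this commit un-comments covers 4-bit quantization, whose headline benefit is roughly a 4× cut in weight memory versus fp16. A back-of-envelope sketch (assuming an 8B-parameter model, matching idefics2-8b; it ignores activations, the vision encoder split, and quantization-scale overhead):

```python
# Rough weight-memory arithmetic for 4-bit quantization of an
# 8B-parameter model (illustrative estimate only).
params = 8e9

bytes_fp16 = params * 2    # 16-bit weights: 2 bytes per parameter
bytes_4bit = params * 0.5  # 4-bit weights: half a byte per parameter

print(f"fp16:  {bytes_fp16 / 1e9:.0f} GB")  # ~16 GB
print(f"4-bit: {bytes_4bit / 1e9:.0f} GB")  # ~4 GB
```

This is why the 4-bit path matters in practice: it brings an 8B model within reach of a single consumer GPU.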