VictorSanh committed
Commit 6e13295
1 Parent(s): 2c25217

Update README.md

Files changed (1):
  1. README.md (+1 -1)
README.md CHANGED
@@ -247,7 +247,7 @@ Flash attention 2 support is available both for `idefics2-8b-base` and `idefics2
 
 <details><summary>Click to expand.</summary>
 
-4-bit AWQ-quantized versions of the checkpoints are also available and allow module fusing for accelerated inference. First make sure you install the Auto-AWQ library with `pip install autoawq`. Also make sure that this [fix] is integrated into your installation.
+4-bit AWQ-quantized versions of the checkpoints are also available and allow module fusing for accelerated inference. First make sure you install the Auto-AWQ library with `pip install autoawq`. Also make sure that this [fix](https://github.com/casper-hansen/AutoAWQ/pull/444) is integrated into your installation.
 
 ```diff
 + from transformers import AwqConfig
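
For context on what the edited README paragraph sets up, here is a minimal, hedged sketch of loading a 4-bit AWQ-quantized Idefics2 checkpoint with module fusing via `AwqConfig`. The checkpoint name `HuggingFaceM4/idefics2-8b-AWQ`, the `fuse_max_seq_len` value, and the `modules_to_fuse` mapping are assumptions (values chosen to match a Mistral-7B-style text backbone), not taken from the truncated diff above.

```python
# Hedged sketch: load a 4-bit AWQ checkpoint of Idefics2 with module fusing enabled.
# Assumptions not present in the diff: checkpoint name, fuse_max_seq_len, and the
# modules_to_fuse mapping (Mistral-7B-style text backbone values).
import torch
from transformers import AutoProcessor, AutoModelForVision2Seq, AwqConfig

quantization_config = AwqConfig(
    bits=4,                  # weights are 4-bit AWQ-quantized
    fuse_max_seq_len=4096,   # longest sequence the fused modules will handle (assumed)
    modules_to_fuse={        # assumed layer mapping for the text backbone
        "attention": ["q_proj", "k_proj", "v_proj", "o_proj"],
        "mlp": ["gate_proj", "up_proj", "down_proj"],
        "layernorm": ["input_layernorm", "post_attention_layernorm", "norm"],
        "use_alibi": False,
        "num_attention_heads": 32,
        "num_key_value_heads": 8,
        "hidden_size": 4096,
    },
)

processor = AutoProcessor.from_pretrained("HuggingFaceM4/idefics2-8b-AWQ")
model = AutoModelForVision2Seq.from_pretrained(
    "HuggingFaceM4/idefics2-8b-AWQ",
    quantization_config=quantization_config,
    torch_dtype=torch.float16,
).to("cuda:0")
```

Once loaded, the fused model is used exactly like the unfused one (same `processor` inputs and `model.generate` calls); fusing only changes how the attention and MLP modules are executed internally.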