VictorSanh
committed on
awq versions
README.md
CHANGED
@@ -237,7 +237,7 @@ Flash attention 2 support is available both for `idefics2-8b-base` and `idefics2
 
 </details>
 
-
+**4 bit quantization and module fusing**
 
 <details><summary>Click to expand.</summary>
 
@@ -266,7 +266,7 @@ model = AutoModelForVision2Seq.from_pretrained(
 ).to(DEVICE)
 ```
 
-</details>
+</details>
 
 # Bias, Risks, and Limitations
 
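The `<details>` blocks above are collapsed in the diff, so only the closing lines of the AWQ loading snippet (`).to(DEVICE)`) are visible. As a minimal sketch of what loading a 4-bit AWQ Idefics2 checkpoint with `AutoModelForVision2Seq` typically looks like; the repo id `HuggingFaceM4/idefics2-8b-AWQ` is an assumption, not something this diff confirms:

```python
import torch
from transformers import AutoModelForVision2Seq

DEVICE = "cuda:0" if torch.cuda.is_available() else "cpu"

# Assumed repo id for the AWQ-quantized weights; the actual id is given
# in the model card, not in this diff.
MODEL_ID = "HuggingFaceM4/idefics2-8b-AWQ"

# AWQ checkpoints ship pre-quantized 4-bit weights, so no extra
# quantization_config is needed just to load them; the trailing
# .to(DEVICE) mirrors the line the diff touches.
model = AutoModelForVision2Seq.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,
).to(DEVICE)
```

Module fusing, the second half of the new section title, is typically enabled by passing an `AwqConfig` (e.g. with `do_fuse=True` and a `modules_to_fuse` mapping) to `from_pretrained`; the exact fusing mapping for Idefics2 would live inside the collapsed block and cannot be reconstructed from this diff.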