alokabhishek committed
Commit bac8706 • 1 Parent(s): d59c585
Updated Readme
README.md CHANGED
@@ -31,8 +31,8 @@ AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration i
 
 ### About 4 bit quantization using AutoAWQ
 
-AutoAWQ github repo: [AutoAWQ github repo](https://github.com/casper-hansen/AutoAWQ/tree/main)
-MIT-han-lab llm-aws github repo: [MIT-han-lab llm-aws github repo](https://github.com/mit-han-lab/llm-awq/tree/main)
+- AutoAWQ github repo: [AutoAWQ github repo](https://github.com/casper-hansen/AutoAWQ/tree/main)
+- MIT-han-lab llm-awq github repo: [MIT-han-lab llm-awq github repo](https://github.com/mit-han-lab/llm-awq/tree/main)
 
 @inproceedings{lin2023awq,
   title={AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration},
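For context on the README section touched by this diff, the sketch below shows how a model is typically quantized to 4-bit with the linked AutoAWQ library. It is an illustration, not part of the commit: the model path, output directory, and quantization settings are placeholder assumptions, and the calls follow the usage documented in the AutoAWQ repo linked above.

```python
# Minimal 4-bit AWQ quantization sketch using AutoAWQ (see linked repo).
# Model path and quant settings are illustrative assumptions, not from this commit.
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "meta-llama/Llama-2-7b-hf"   # hypothetical source model
quant_path = "llama-2-7b-awq"             # hypothetical output directory

# Typical AWQ settings: 4-bit weights, group size 128, zero-point, GEMM kernels.
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# Load the full-precision model and its tokenizer.
model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Run activation-aware weight quantization, then save the 4-bit checkpoint.
model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(quant_path)
tokenizer.save_pretrained(quant_path)
```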