---
license: other
language:
- en
base_model:
- Nexusflow/Athene-V2-Chat
tags:
- awq
- Athene
- Chat
pipeline_tag: text-generation
library_name: transformers
---
# Athene-V2-Chat AWQ 4-Bit Quantized Version

This repository provides an AWQ 4-bit quantized version of the Athene-V2-Chat model, originally developed by Nexusflow. Before quantization, the model's weights were zero-padded so that the relevant weight dimensions divide evenly across GPU shards, keeping the checkpoint compatible with multi-GPU tensor parallelism. The padding adds negligible computation while enabling efficient scaling across multiple GPUs.
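
The sketch below illustrates the padding idea in general terms; it is not the exact script used to produce this checkpoint. It zero-pads a weight's input dimension up to the next multiple of the AWQ group size times the tensor-parallel degree, so every GPU shard receives whole quantization groups. The function name, shapes, and parameter values are illustrative assumptions.

```python
import torch

def pad_in_features(weight: torch.Tensor, group_size: int, tp_size: int) -> torch.Tensor:
    """Zero-pad the last dim of `weight` to a multiple of group_size * tp_size."""
    multiple = group_size * tp_size
    in_features = weight.shape[-1]
    padded = ((in_features + multiple - 1) // multiple) * multiple
    if padded == in_features:
        return weight
    # Pad only on the right of the last dimension with zeros.
    return torch.nn.functional.pad(weight, (0, padded - in_features))

# Illustrative example: 128-group AWQ with 8-way tensor parallelism.
w = torch.randn(1024, 29568)
print(pad_in_features(w, group_size=128, tp_size=8).shape)  # torch.Size([1024, 29696])
```

For inference, one common way to use an AWQ checkpoint with tensor parallelism is vLLM. The snippet below is a minimal, hedged usage sketch; the model path is a placeholder for this repository's id, and the GPU count should match your hardware.

```python
from vllm import LLM, SamplingParams

llm = LLM(
    model="path/to/Athene-V2-Chat-AWQ",  # placeholder: replace with this repo's id or a local path
    quantization="awq",                  # load the 4-bit AWQ weights
    tensor_parallel_size=4,              # shard the model across 4 GPUs
)

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Explain tensor parallelism in one paragraph."], params)
print(outputs[0].outputs[0].text)
```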