Update README.md
---
license: mit
language:
- en
library_name: transformers
---
This is a GQA version of the original model facebook/opt-125m. In this version, the original MHA architecture is preserved, but instead of each head having its own K/V projections, the K/V heads belonging to the same group share identical, mean-pooled K and V values. Each layer has 6 groups of KV heads instead of the original 12 KV heads of the MHA implementation.
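The conversion described above can be sketched as follows. This is a hypothetical illustration (not the actual conversion script used for this checkpoint), assuming opt-125m's dimensions of 12 attention heads, head dimension 64, and hidden size 768: the K or V projection weights of the two heads in each group are averaged and then copied back to both heads, so grouped heads produce identical K/V values while the MHA layout is unchanged.

```python
import torch

# Assumed opt-125m dimensions: 12 heads, head_dim 64, hidden size 768.
num_heads, head_dim, hidden = 12, 64, 768
num_groups = 6
group_size = num_heads // num_groups  # 2 heads per KV group

def mean_pool_kv(weight: torch.Tensor) -> torch.Tensor:
    """Mean-pool a K or V projection matrix of shape
    (num_heads * head_dim, hidden) within each head group."""
    # Split rows into (groups, heads-per-group, head_dim, hidden).
    w = weight.view(num_groups, group_size, head_dim, hidden)
    # Average the heads inside each group...
    pooled = w.mean(dim=1, keepdim=True)
    # ...and broadcast the group mean back to every head in the group.
    return pooled.expand(num_groups, group_size, head_dim, hidden) \
                 .reshape(num_heads * head_dim, hidden)

# Example on a random K projection (stands in for k_proj.weight):
k_proj = torch.randn(num_heads * head_dim, hidden)
k_pooled = mean_pool_kv(k_proj)

# Heads 0 and 1 belong to the same group, so their rows are now identical.
assert torch.allclose(k_pooled[:head_dim], k_pooled[head_dim:2 * head_dim])
```

Applying the same pooling to both `k_proj.weight` and `v_proj.weight` (and their biases) in every layer yields the 6-group structure described above without altering the model's tensor shapes.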