---
license: mit
language:
  - en
library_name: transformers
---

This is a grouped-query attention (GQA) version of the original model facebook/opt-125m. The original multi-head attention (MHA) architecture is preserved, but instead of each query head having its own K/V head, the K/V heads belonging to the same group share the same mean-pooled K and V values. Each layer has 16 groups of KV heads instead of the original 32 KV heads of the MHA implementation.
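The mean-pooling step described above can be sketched as follows. This is a minimal illustration of the idea, not the actual conversion script; the function name and the assumption that heads are grouped contiguously along the first axis are hypothetical.

```python
import numpy as np

def mean_pool_kv_heads(kv: np.ndarray, num_groups: int) -> np.ndarray:
    """Replace each K (or V) head with the mean of its group.

    kv: array of shape (num_heads, ...), one K or V head per row.
    Heads are assumed to be split into `num_groups` contiguous groups;
    every head in a group is overwritten with the group's mean, which is
    the MHA -> GQA conversion described above (illustrative sketch only).
    """
    num_heads = kv.shape[0]
    assert num_heads % num_groups == 0, "heads must divide evenly into groups"
    group_size = num_heads // num_groups
    # Split heads into groups, average within each group, broadcast back.
    grouped = kv.reshape(num_groups, group_size, *kv.shape[1:])
    pooled = grouped.mean(axis=1, keepdims=True)
    return np.broadcast_to(pooled, grouped.shape).reshape(kv.shape)
```

For example, with 4 heads and 2 groups, heads 0-1 are averaged together and heads 2-3 are averaged together, so all heads in a group end up identical, which is what lets the grouped heads share a single K/V cache entry at inference time.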