---
license: apache-2.0
language:
- en
---
## StripedHyena-Hessian-7B (SH-7B)
### Model Architecture
StripedHyena-Hessian-7B departs from the traditional decoder-only Transformer architecture.
StripedHyena is a hybrid architecture composed of multi-head, grouped-query attention and gated convolutions arranged in [Hyena](https://arxiv.org/abs/2302.10866) blocks.
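
As an illustration of what "hybrid" means here, the sketch below interleaves attention blocks with gated-convolution blocks in a single decoder stack. This is a minimal PyTorch sketch, not the released implementation: the layer ratio, dimensions, short depthwise filter, and plain `nn.MultiheadAttention` (standing in for grouped-query attention) are all placeholder choices.

```python
import torch
import torch.nn as nn


class GatedConvBlock(nn.Module):
    """Illustrative gated-convolution (Hyena-style) block: a causal depthwise
    convolution over the sequence, modulated by an elementwise gate."""

    def __init__(self, d_model: int, filter_len: int = 128):
        super().__init__()
        self.in_proj = nn.Linear(d_model, 2 * d_model)        # values and gates
        self.conv = nn.Conv1d(d_model, d_model, filter_len,
                              padding=filter_len - 1, groups=d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x):                                     # x: (batch, seq, d_model)
        v, g = self.in_proj(x).chunk(2, dim=-1)
        # depthwise long convolution, trimmed back to the input length (causal)
        v = self.conv(v.transpose(1, 2))[..., : x.shape[1]].transpose(1, 2)
        return self.out_proj(torch.sigmoid(g) * v)


class HybridStack(nn.Module):
    """Interleave attention blocks with gated-convolution blocks.
    The 1-attention-per-4-layers ratio is an arbitrary placeholder."""

    def __init__(self, d_model: int = 512, n_layers: int = 8, n_heads: int = 8):
        super().__init__()
        self.layers = nn.ModuleList([
            nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            if i % 4 == 0 else GatedConvBlock(d_model)
            for i in range(n_layers)
        ])

    def forward(self, x):                                     # norms omitted for brevity
        seq_len = x.shape[1]
        causal = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool,
                                       device=x.device), diagonal=1)
        for layer in self.layers:
            if isinstance(layer, nn.MultiheadAttention):
                x = x + layer(x, x, x, attn_mask=causal, need_weights=False)[0]
            else:
                x = x + layer(x)
        return x


out = HybridStack()(torch.randn(2, 16, 512))                  # (batch, seq, d_model)
```
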
- Constant-memory decoding, by representing the convolutions as state-space models (modal or canonical form) or as truncated filters (see the recurrence sketch after this list).
- Lower latency when preprocessing long prompts.
- Improved training and inference compute-optimal scaling laws compared to Transformers.
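
The constant-memory claim rests on the fact that a long convolution whose filter is a sum of decaying exponentials (a modal, state-space form) can be evaluated as a linear recurrence over a fixed-size state, so each decoded token costs memory proportional to the state size rather than the full history. A minimal NumPy sketch of that equivalence, with made-up filter parameters rather than anything from the released model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative modal-form parameters (made up, not taken from the model):
# the long filter is a sum of decaying exponentials h[k] = sum_i c[i] * lam[i]**k.
d_state = 4
lam = rng.uniform(0.5, 0.95, d_state)        # per-mode decay rates
c = rng.normal(size=d_state)                 # per-mode output weights

seq_len = 64
x = rng.normal(size=seq_len)                 # one input channel

# Explicit convolution: materialise the filter and sum over the whole history.
h = np.array([(c * lam ** k).sum() for k in range(seq_len)])
y_conv = np.array([(h[: t + 1] * x[t::-1]).sum() for t in range(seq_len)])

# Equivalent recurrence: one fixed-size state per mode, updated once per token.
# Memory stays O(d_state) no matter how many tokens have been generated.
state = np.zeros(d_state)
y_rec = np.empty(seq_len)
for t in range(seq_len):
    state = lam * state + x[t]
    y_rec[t] = (c * state).sum()

assert np.allclose(y_conv, y_rec)
```

The truncated-filter variant instead keeps only the most recent filter taps, so decoding needs a fixed window of past inputs; either way, the per-token cost of the convolutional layers does not grow with the length of the generated sequence.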