---
license: cc-by-sa-4.0
---
# LLaMA-7B + Landmark Attention
This repo hosts the weight diff between LLaMA-7B fine-tuned with landmark attention for 15,000 steps on RedPajama and the original model. Please see the [GitHub repository](https://github.com/epfml/landmark-attention) for instructions on recovering the full weights and using them.
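Conceptually, a weight diff is recovered by adding each diff tensor to the corresponding base-model tensor. The sketch below illustrates this idea with toy tensors; `apply_diff` and the tensor names are hypothetical stand-ins, not the repo's actual API, so follow the GitHub repository's instructions for the real recovery procedure.

```python
import torch

def apply_diff(base_state, diff_state):
    """Add each diff tensor to the matching base tensor (illustrative only)."""
    assert base_state.keys() == diff_state.keys()
    return {name: base_state[name] + diff_state[name] for name in base_state}

# Tiny demo with toy tensors standing in for model weight state dicts.
base = {"w": torch.tensor([1.0, 2.0])}
diff = {"w": torch.tensor([0.5, -0.5])}
full = apply_diff(base, diff)
print(full["w"])  # tensor([1.5000, 1.5000])
```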