---
license: apache-2.0
datasets:
- togethercomputer/RedPajama-Data-1T-Sample
language:
- en
---

# Landmark Attention LLaMA 33B

This model was trained for 200 steps with the PEFT LoRA method, applying [Landmark Attention](https://arxiv.org/abs/2305.16300). It will likely be trained further and updated later on.

## Usage

This model is unlikely to be usable with the popular frontends (e.g. [KoboldAI](https://github.com/henk717/KoboldAI) and [Oobabooga](https://github.com/oobabooga/text-generation-webui)) due to their lack of support for landmark tokens.
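For programmatic use, a minimal sketch of attaching the LoRA adapter to a base LLaMA checkpoint with PEFT might look like the following. The repo IDs below are placeholders, not the actual checkpoint names, and landmark-attention inference additionally requires the custom attention code from the paper's repository, which this sketch does not include.

```python
# Hypothetical repo IDs -- substitute the real base model and adapter.
BASE_MODEL = "huggyllama/llama-30b"
ADAPTER = "your-username/landmark-llama-33b-lora"

def load_model():
    # Imports are local so the sketch can be read without the heavy
    # dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
    model = AutoModelForCausalLM.from_pretrained(BASE_MODEL, device_map="auto")
    # Attach the LoRA adapter trained with landmark attention.
    model = PeftModel.from_pretrained(model, ADAPTER)
    return tokenizer, model
```

Note that this only loads the adapted weights; generating with long contexts via landmark tokens still needs the modified attention implementation.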