---
license: apache-2.0
datasets:
- togethercomputer/RedPajama-Data-1T-Sample
language:
- en
---

# Landmark Attention LLaMA 33B

This model was fine-tuned with the PEFT LoRA method using [Landmark Attention](https://arxiv.org/abs/2305.16300) for 200 steps. It will likely be trained further and updated later on.
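
For reference, the snippet below is a minimal sketch of a PEFT LoRA setup of this kind; the hyperparameters are illustrative assumptions, not the exact values used for this checkpoint.

```python
# Illustrative only: a typical LoRA configuration for a LLaMA-style model with peft.
# The r / alpha / target_modules values are assumptions, not the published recipe.
from peft import LoraConfig, get_peft_model

lora_config = LoraConfig(
    r=16,                      # LoRA rank (assumed)
    lora_alpha=32,             # scaling factor (assumed)
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    bias="none",
    task_type="CAUSAL_LM",
)

# base_model would be a LLaMA 33B model patched with the landmark attention code:
# model = get_peft_model(base_model, lora_config)
```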

## Usage

This model is unlikely to work with popular frontends such as [KoboldAI](https://github.com/henk717/KoboldAI) and [Oobabooga](https://github.com/oobabooga/text-generation-webui), since they do not support landmark tokens.
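
If you want to experiment anyway, the merged weights in this repo should load with the plain `transformers` API, as in the sketch below (the repo id is a placeholder). Note that landmark-token inference itself still requires the custom modeling code released with the [Landmark Attention](https://arxiv.org/abs/2305.16300) paper, so vanilla generation will not make use of landmarks.

```python
# Minimal loading sketch (repo id is a placeholder; substitute this repository's id).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/landmark-llama-33b"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 33B model in fp16 needs roughly 65 GB of memory
    device_map="auto",
)

prompt = "Landmark attention extends the usable context window by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```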

## PEFT Checkpoint

You can likely merge the adapter checkpoint into any other LLaMA-based model, provided it is 33B. This repo contains the merged weights; the standalone adapter is available [here](https://anonfiles.com/F3Pb20wbz7).
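
As a rough guide, merging the adapter into a base model with `peft` could look like the sketch below; the paths are placeholders, and it assumes the base model is a 33B LLaMA variant.

```python
# Sketch: merge the LoRA adapter into a 33B LLaMA base model using peft.
# The "path/to/..." values are placeholders, not real paths.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "path/to/llama-33b"              # any 33B LLaMA-based model
adapter_path = "path/to/landmark-adapter"  # the downloaded adapter files

base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_path)
merged = model.merge_and_unload()          # bake the LoRA deltas into the base weights

merged.save_pretrained("llama-33b-landmark-merged")
AutoTokenizer.from_pretrained(base_id).save_pretrained("llama-33b-landmark-merged")
```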