---
datasets:
  - EleutherAI/wikitext_document_level
tags:
  - llama
---

LLaMA 33B fine-tuned on wikitext_document_level with linear RoPE scaling by a factor of 8, for a 16k token context length. This is a merged version of llama33b-16k-qlora.
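
For reference, linear RoPE scaling (position interpolation) divides each position index by the scaling factor, so 16384 positions map into the 2048-position range the base model was trained on. A minimal sketch of the rotary angles under this scheme (the function name and signature are illustrative, not code from this repo):

```python
import torch

def scaled_rope_angles(dim: int, seq_len: int, base: float = 10000.0, scale: float = 8.0):
    """Rotary embedding angles with linear position scaling (illustrative only)."""
    # Standard RoPE inverse frequencies for a head dimension of `dim`.
    inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2).float() / dim))
    # Linear scaling: divide positions by the factor (8 here), squeezing
    # 16384 token positions into LLaMA's original 2048-position range.
    positions = torch.arange(seq_len).float() / scale
    angles = torch.outer(positions, inv_freq)
    return angles.cos(), angles.sin()
```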

Note that this is not an instruct model: it is base LLaMA with an extended sequence length.
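
Since the QLoRA weights are already merged, the model should load like any LLaMA checkpoint. A minimal loading sketch, assuming transformers >= 4.31 (which added `rope_scaling` to `LlamaConfig`) and that the scaling factor needs to be supplied explicitly if it is not recorded in the checkpoint's config:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "chargoddard/llama33b-16k"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    rope_scaling={"type": "linear", "factor": 8.0},  # 2048 * 8 = 16384 tokens
    torch_dtype="auto",
    device_map="auto",  # requires accelerate
)
```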