---
language:
  - ms
---

Full-parameter finetuning of a 1B-parameter Llama2 model with a 32,768-token context length on Malaysian text.

The 1B model is derived from the first 4 layers of the 7B model.
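A minimal loading sketch with Hugging Face `transformers` is shown below. The repository id `MODEL_ID` is an assumption based on this card's naming, not confirmed by the card itself; flash-attention support additionally requires the `flash-attn` package and a compatible GPU.

```python
# Minimal sketch for loading a long-context Llama2 checkpoint.
# MODEL_ID is hypothetical -- substitute the actual repo id of this model.
MODEL_ID = "mesolitica/llama-1b-hf-32768-fpf"
MAX_CONTEXT = 32768  # context length stated in this model card


def load(model_id: str = MODEL_ID):
    # Imports kept inside the function so the file can be inspected
    # without torch/transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,
        attn_implementation="flash_attention_2",  # needs flash-attn installed
        device_map="auto",
    )
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load()
    inputs = tokenizer("Kuala Lumpur ialah", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The `if __name__ == "__main__"` guard keeps the (large) model download out of simple imports of the file.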

Training README: https://github.com/mesolitica/malaya/tree/5.1/session/llama2#1b-32768-context-length-flash-attention-2

WandB run: https://wandb.ai/mesolitica/fpf-Llama-2-1b-32k-hf