---
language:
- ms
---

# Full Parameter Finetuning 2B 32768 context length Llama2 on Malaysian text

The 2B model is derived from the first 5 layers of the 13B model.
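The layer-truncation step described above can be sketched as follows. This is a minimal illustration, not the mesolitica implementation: it assumes a Llama-2 checkpoint whose state dict keys follow the usual `model.layers.<i>.<param>` pattern, simulated here with a toy dict so the sketch runs without loading real weights.

```python
# Hedged sketch: derive a smaller model by keeping only the first N
# transformer blocks of a larger checkpoint. The function name and the
# toy state dict are illustrative, not from the mesolitica codebase.
import re

def keep_first_layers(state_dict, n_layers):
    """Return a state dict containing only non-layer weights
    (embeddings, norm, lm_head) and the first `n_layers` blocks."""
    kept = {}
    for name, tensor in state_dict.items():
        m = re.match(r"model\.layers\.(\d+)\.", name)
        if m is None:
            kept[name] = tensor          # embeddings, final norm, lm_head
        elif int(m.group(1)) < n_layers:
            kept[name] = tensor          # layers 0 .. n_layers-1
    return kept

# Toy state dict mimicking a 40-layer 13B checkpoint (one key per layer).
toy = {"model.embed_tokens.weight": "E", "lm_head.weight": "H"}
for i in range(40):
    toy[f"model.layers.{i}.self_attn.q_proj.weight"] = f"q{i}"

small = keep_first_layers(toy, 5)
print(len(small))  # 2 non-layer weights + 5 layer weights
```

The truncated state dict would then be loaded into a 5-layer `LlamaForCausalLM` config before full-parameter finetuning on the Malaysian text corpus.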

README: https://github.com/mesolitica/malaya/tree/5.1/session/llama2#2b-32768-context-length-flash-attention-2

WandB: https://wandb.ai/mesolitica/fpf-Llama-2-2b-32k-hf