---
license: llama2
datasets:
- bertin-project/alpaca-spanish
language:
- es
library_name: transformers
pipeline_tag: text-generation
---
## Llama 2-13b-alpaca-spanish LoRA
This is a LoRA for Llama 2 13B trained on a translated [alpaca dataset](https://huggingface.co/datasets/bertin-project/alpaca-spanish), in an attempt to improve the Spanish performance of the Llama-2 foundation model with a conversational focus.

The base model used was [The Bloke's Llama-2-13B-fp16](https://huggingface.co/TheBloke/Llama-2-13B-fp16), with the LoRA trained in 4-bit precision.

| Training parameters | Value |
| ------------------- | ----- |
| LoRA scale          | 2     |
| Epochs              | 0.75  |
| Learning rate       | 2e-5  |
| Warmup steps        | 100   |
| Loss                | 1.07  |