---
license: openrail
datasets:
- bertin-project/alpaca-spanish
language:
- es
---

# Guanaco: Spanish InstructLlama

## Usage

Check the GitHub repo with the code: https://github.com/plncmm/guanaco-lora

```python
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer, GenerationConfig

# Load the base LLaMA-13B tokenizer and model; 8-bit loading keeps memory usage low
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-13b-hf")
model = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-13b-hf",
    load_in_8bit=True,
    device_map="auto",
)
# Attach the Guanaco LoRA adapter weights on top of the base model
model = PeftModel.from_pretrained(model, "plncmm/guanaco-lora-13b")
```
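
The imported `GenerationConfig` can then drive inference. Continuing from the snippet above, here is a minimal sketch; the Alpaca-style Spanish prompt template and the sampling parameters are assumptions, not necessarily the exact ones used by guanaco-lora:

```python
import torch

# Assumed Alpaca-style Spanish prompt; the exact template used by
# guanaco-lora may differ.
prompt = (
    "A continuación hay una instrucción que describe una tarea. "
    "Escribe una respuesta que complete adecuadamente la petición.\n\n"
    "### Instrucción:\nnombre las capitales regionales de Chile\n\n"
    "### Respuesta:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
generation_config = GenerationConfig(temperature=0.2, top_p=0.75, num_beams=4)

with torch.no_grad():
    output = model.generate(
        **inputs,
        generation_config=generation_config,
        max_new_tokens=256,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```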

## Data

We used the [alpaca-spanish dataset](https://huggingface.co/datasets/bertin-project/alpaca-spanish), which is a translation of [alpaca_data.json](https://github.com/tatsu-lab/stanford_alpaca/blob/main/alpaca_data.json).
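
To inspect the data before training, it can be pulled straight from the Hub. A minimal sketch, assuming the dataset keeps the original Alpaca fields (`instruction`, `input`, `output`):

```python
from datasets import load_dataset

# Download the Spanish translation of the Alpaca dataset from the Hub
dataset = load_dataset("bertin-project/alpaca-spanish", split="train")

# Assumed Alpaca schema: each record has instruction, input, and output
example = dataset[0]
print(example["instruction"])
print(example["output"])
```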

## Finetuning

To fine-tune the LLaMA model we used the code available in [Alpaca Lora](https://github.com/tloen/alpaca-lora) (also used in [Cabrita](https://github.com/22-hours/cabrita)), which fine-tunes LLaMA using PEFT from Hugging Face. With it, we could run our finetuning step with a single A100 on top of LLaMA-7B and LLaMA-13B. The code we used is available [here](https://github.com/plncmm/guanaco-lora/blob/main/guanaco-lora.py).
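
In essence, this approach freezes the base weights and trains small low-rank adapters on top of them. A minimal sketch of the PEFT setup follows; the hyperparameter values below are illustrative assumptions, not the exact ones in guanaco-lora.py:

```python
from peft import LoraConfig, get_peft_model, prepare_model_for_int8_training
from transformers import LlamaForCausalLM

# Load the frozen base model in 8-bit and prepare it for adapter training
base_model = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-13b-hf",
    load_in_8bit=True,
    device_map="auto",
)
base_model = prepare_model_for_int8_training(base_model)

# Illustrative LoRA hyperparameters; see guanaco-lora.py for the real ones
lora_config = LoraConfig(
    r=8,                                  # rank of the low-rank update
    lora_alpha=16,                        # scaling factor for the update
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```

Training then proceeds as in Alpaca Lora, with a standard `transformers` Trainer over the instruction-formatted dataset.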

## Examples

Guanaco
```
Instrucción: nombre las capitales regionales de Chile
Respuesta: Las capitales regionales de Chile son Arica, Iquique, Antofagasta, Calama, Copiapó, Vallenar, La Serena, Coquimbo, Valparaíso, Viña del Mar, Concepción, Temuco, Puerto Montt, Punta Arenas y Puerto Natales.
```

Guanaco
```
Instrucción: dime sobre guanacos
Respuesta: Los guanacos son mamíferos sudamericanos de la familia de los cérvidos. Son herbívoros y se alimentan de hierba, hojas y frutas. Viven en grupos de hasta 20 individuos y tienen una vida útil de 15 a 20 años. Los guanacos tienen un pelaje marrón oscuro y una larga cola blanca. Son conocidos por su llamada distintiva, que suena como un silbido.
```

Guanaco
```
Instrucción: dime sobre el presidente de mexico en 2019
Respuesta: En 2019, el presidente de México es Andrés Manuel López Obrador.
```

Guanaco
```
Instrucción: dime sobre el rey de francia en 2019
Respuesta: El rey de Francia en 2019 es Felipe VI.
```

Guanaco
```
Instrucción: escribe un programa de Python que imprima los primeros 10 números de Fibonacci
Respuesta: def fibonacci(n): si n == 0: regresar 0 elif n == 1: regresar 1 else: regresar fibonacci(n-1) + fibonacci(n-2)
```

## References

Everything we did was only possible thanks to the strong community and the work that other people and groups shared before us. In particular, we relied on: [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai/), [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca), [Alpaca Lora](https://github.com/tloen/alpaca-lora), [Cabrita](https://github.com/22-hours/cabrita), [Bertin](https://huggingface.co/bertin-project), [ChatGPT](https://openai.com/blog/chatgpt) and [Hugging Face](https://huggingface.co/). Thank you all for the great work and for opening it to the world!