xaviviro committed on
Commit 3271280
1 Parent(s): 9b41a0b

Create README.md

Files changed (1)
  1. README.md +83 -0
README.md ADDED
---
license: apache-2.0
base_model: xaviviro/FLAMA-0.5-3B
language:
- ca
- es
- en
model_creator: xaviviro
model_name: FLAMA-0.5-3B
prompt_template: '<|im_start|>user\n{instruction}<|im_end|>\n<|im_start|>assistant\n'
quantized_by: xaviviro
---

# FLAMA: A 3B ChatML Model in Catalan and Spanish. Version 0.5

![FLAMA](flama05.png)

FLAMA is the first small 3B bilingual model for Catalan and Spanish. It is the result of fine-tuning [open_llama_3b_v2](/openlm-research/open_llama_3b_v2) on the [OpenAssistant v2](/datasets/OpenAssistant/oasst2) instructions, machine-translated into Catalan and Spanish with [Helsinki-NLP](/Helsinki-NLP) resources and processed into ChatML format.
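
The translation step described above can be approximated as in the sketch below. This is a minimal sketch, not the exact pipeline used to build FLAMA: the model ids `Helsinki-NLP/opus-mt-en-ca` and `Helsinki-NLP/opus-mt-en-es`, the sample instruction, and the use of the `transformers` translation pipeline are illustrative assumptions; only the ChatML wrapping follows the prompt template declared in the front matter.

```python
# Sketch: machine-translate an English instruction into Catalan and Spanish
# with Helsinki-NLP opus-mt models, then wrap it in the ChatML prompt format.
# The exact models, dataset handling and cleaning used for FLAMA may differ.
from transformers import pipeline

translators = {
    "ca": pipeline("translation", model="Helsinki-NLP/opus-mt-en-ca"),
    "es": pipeline("translation", model="Helsinki-NLP/opus-mt-en-es"),
}

# ChatML template from the model card front matter.
CHATML = "<|im_start|>user\n{instruction}<|im_end|>\n<|im_start|>assistant\n"

instruction_en = "Who was Isaac Newton?"

for lang, translator in translators.items():
    translated = translator(instruction_en, max_length=256)[0]["translation_text"]
    prompt = CHATML.format(instruction=translated)
    print(lang, repr(prompt))
```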

## What's new in version 0.5

1. Improved Catalan
2. Added Spanish

## Prompt Template

FLAMA uses ChatML as its prompt template:

```
<|im_start|>user
Qui va ser Isaac Newton?<|im_end|>
<|im_start|>assistant\n
```
```
<|im_start|>user
¿Quién fue Isaac Newton?<|im_end|>
<|im_start|>assistant\n
```
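
As a usage illustration, the sketch below applies this template with the Hugging Face `transformers` library. It is a minimal sketch, not an officially supported snippet: the model id comes from this repository, while the `AutoModelForCausalLM`/`AutoTokenizer` loading path, the fp16/`device_map="auto"` placement, and the generation settings are assumptions to be adjusted to your setup.

```python
# Minimal sketch: load FLAMA-0.5-3B and generate an answer using the ChatML
# prompt template documented above. Generation settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xaviviro/FLAMA-0.5-3B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # assumption: fp16 on GPU; use float32 on CPU
    device_map="auto",           # assumption: requires `accelerate`
)

prompt = "<|im_start|>user\nQui va ser Isaac Newton?<|im_end|>\n<|im_start|>assistant\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)

# Strip the prompt tokens and decode only the newly generated answer.
answer = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer)
```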

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)

## References

```
@software{xaviviro2023flama,
  author = {xaviviro},
  title = {FLAMA: Model 3B ChatML en Català. Versió 0.5},
  month = jan,
  year = 2024,
  url = {https://huggingface.co/xaviviro/FLAMA-0.5-3B}
}
```

```
@software{openlm2023openllama,
  author = {Geng, Xinyang and Liu, Hao},
  title = {OpenLLaMA: An Open Reproduction of LLaMA},
  month = may,
  year = 2023,
  url = {https://github.com/openlm-research/open_llama}
}
```

```
@software{together2023redpajama,
  author = {Together Computer},
  title = {RedPajama-Data: An Open Source Recipe to Reproduce LLaMA training dataset},
  month = apr,
  year = 2023,
  url = {https://github.com/togethercomputer/RedPajama-Data}
}
```

```
@article{touvron2023llama,
  title = {Llama: Open and efficient foundation language models},
  author = {Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and others},
  journal = {arXiv preprint arXiv:2302.13971},
  year = {2023}
}
```