RuterNorway committed
Commit 97ab16f
1 Parent(s): 381dbb5

Update README.md

Files changed (1):
  1. README.md +33 -8
README.md CHANGED
@@ -15,13 +15,19 @@ datasets:
 ---
 # Llama 2 13b Chat Norwegian GPTQ
 **This is a GPTQ version of Llama 2 13b Norwegian**
-Read more about [GPTQ here](https://towardsdatascience.com/4-bit-quantization-with-gptq-36b0f4f02c34). For a demo script, see [here](#demo-script).
-Llama-2-13b-chat-norwegian is a variant of [Meta](https://huggingface.co/meta-llama)´s [Llama 2 13b Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) model, finetuned on a mix of norwegian datasets created in [Ruter AI Lab](https://ruter.no) the summer of 2023.

 The model is tuned to understand and generate text in Norwegian. It's trained for one epoch on norwegian-alpaca + 15000 samples of machine-translated data from OpenOrca (the dataset to be released). A small subset of custom-made instructional data is also included.

-We are currently investigating the possibility of releasing a larger, more powerful model and making GGML and GPTQ versions available for this model.


 ## Data
@@ -85,13 +91,32 @@ This model was made at Ruters AI Lab - summer of 2023 as part of their AI initia
 The team wants to thank the support we got from the entire Ruter organization, and especially the Data Science team.

 ___
-# Llama 2 13b Chat Norwegian LoRA-adapter (Norsk)
-**Dette er LoRA-adapteren for Llama 2 13b Chat Norwegian modellen, og krever den orginale basismodellen for å kjøre**
-Llama-2-13b-chat-norwegian er en versjon av [Meta](https://huggingface.co/meta-llama) sin [Llama 2 13b Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) model, finetuned på en kombinasjon av diverse norske datasett. Modellen ble laget i [Ruter AI Lab](https://ruter.no) 2023.

-Modellen er finetuned til å forstå og generere tekst på Norsk. Den er trent i én epoch med norwegian-alpaca + et utvalg av 15000 maskinoversatt data fra OpenOrca (datasett venter på utgivelse). Det består og av et lite sett med selvlagde instruksjonsdata

-Vi undersøker for øyeblikekt muligheten for å gi ut en større og sterkere modell i framtiden, og å lage en GGML og GPTQ versjon tilgjengelig for denne modellen.


 ## Data
 
 ---
 # Llama 2 13b Chat Norwegian GPTQ
 **This is a GPTQ version of Llama 2 13b Norwegian**

+Read more about [GPTQ here](https://towardsdatascience.com/4-bit-quantization-with-gptq-36b0f4f02c34).
+
+For a demo script, see [here](#demo-script).
+
+Llama-2-13b-chat-norwegian-GPTQ is a variant of [Meta](https://huggingface.co/meta-llama)´s [Llama 2 13b Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) model, finetuned on a mix of norwegian datasets created in [Ruter AI Lab](https://ruter.no) the summer of 2023.

 The model is tuned to understand and generate text in Norwegian. It's trained for one epoch on norwegian-alpaca + 15000 samples of machine-translated data from OpenOrca (the dataset to be released). A small subset of custom-made instructional data is also included.

+For other versions of this model see:
+* [Llama-2-13b-chat-norwegian](https://huggingface.co/RuterNorway/Llama-2-13b-chat-norwegian)
+* [Llama-2-13b-chat-norwegian-LoRa](https://huggingface.co/RuterNorway/Llama-2-13b-chat-norwegian-LoRa)
+* [Llama-2-13b-chat-norwegian-GPTQ](https://huggingface.co/RuterNorway/Llama-2-13b-chat-norwegian-GPTQ)


 ## Data
 
 The team wants to thank the support we got from the entire Ruter organization, and especially the Data Science team.

 ___
+# Llama 2 13b Chat Norwegian GPTQ (Norsk)
+**Dette er en GPTQ versjon av Llama 2 13b Chat Norwegian modellen**
+Llama-2-13b-chat-norwegian-GPTQ er en versjon av [Meta](https://huggingface.co/meta-llama) sin [Llama 2 13b Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) modell, finetuned på en kombinasjon av diverse norske datasett. Modellen ble laget i [Ruter AI Lab](https://ruter.no) 2023.
+
+Les mer om [GPTQ her](https://towardsdatascience.com/4-bit-quantization-with-gptq-36b0f4f02c34).
+
+For demo script, se [her](#demo-script).
+
+Andre versjoner av modellen:
+
+* [Llama-2-13b-chat-norwegian](https://huggingface.co/RuterNorway/Llama-2-13b-chat-norwegian)
+* [Llama-2-13b-chat-norwegian-LoRa](https://huggingface.co/RuterNorway/Llama-2-13b-chat-norwegian-LoRa)
+* [Llama-2-13b-chat-norwegian-GPTQ](https://huggingface.co/RuterNorway/Llama-2-13b-chat-norwegian-GPTQ)
+
+Modellen er finetuned til å forstå og generere tekst på norsk. Den er trent i én epoch med norwegian-alpaca + et utvalg av 15000 maskinoversatte data fra OpenOrca (datasettet venter på utgivelse). Det består også av et lite sett med selvlagde instruksjonsdata.

+**This is a GPTQ version of Llama 2 13b Norwegian**
+
+Llama-2-13b-chat-norwegian is a variant of [Meta](https://huggingface.co/meta-llama)´s [Llama 2 13b Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) model, finetuned on a mix of norwegian datasets created in [Ruter AI Lab](https://ruter.no) the summer of 2023.
+
+The model is tuned to understand and generate text in Norwegian. It's trained for one epoch on norwegian-alpaca + 15000 samples of machine-translated data from OpenOrca (the dataset to be released). A small subset of custom-made instructional data is also included.


 ## Data