Text Generation
Transformers
Safetensors
llama
text-generation-inference
Inference Endpoints
jonabur committed on
Commit
08c7366
1 Parent(s): fb7cbbc

update README

Files changed (1)
  1. README.md +8 -3
README.md CHANGED
@@ -17,8 +17,6 @@ language:
 
 # Viking 33B
 
-_**NOTE:** This is a **research checkpoint** of a model for which **training has not been completed.** It is being provided in its current state for research and testing purposes. **Care should be taken when using the outputs of the model.** Once pretraining has completed we intend to release additional instruction-tuned and chat-tuned varieties._
-
 Viking 33B is a 33B parameter decoder-only transformer pretrained on Finnish,
 English, Swedish, Danish, Norwegian, Icelandic and code. It is being trained
 on 2 trillion tokens (1300 billion as of this release). Viking 33B is a fully open source model and is made available under the Apache 2.0 License.
@@ -39,7 +37,7 @@ Viking is the second set of models released by LumiOpen and is available at
 [Viking 33B](https://huggingface.co/LumiOpen/Viking-33B)
 
 ## Model Overview
-_**NOTE:** In addition to being an early research release, Viking is a base model which needs further fine-tuning for most use cases._
+_**NOTE:** Viking is a base model which needs further fine-tuning for most use cases._
 
 Viking is a generative pretrained transformer using a LLaMA-like GPT architecture, and makes use of rotary positional embeddings and flash attention.
 
@@ -100,6 +98,13 @@ Training checkpoints are available as branches in the repository. Checkpoints w
 * [1100B](https://huggingface.co/LumiOpen/Viking-33B/tree/1100B)
 * [1200B](https://huggingface.co/LumiOpen/Viking-33B/tree/1200B)
 * [1300B](https://huggingface.co/LumiOpen/Viking-33B/tree/1300B)
+* [1400B](https://huggingface.co/LumiOpen/Viking-33B/tree/1400B)
+* [1500B](https://huggingface.co/LumiOpen/Viking-33B/tree/1500B)
+* [1600B](https://huggingface.co/LumiOpen/Viking-33B/tree/1600B)
+* [1700B](https://huggingface.co/LumiOpen/Viking-33B/tree/1700B)
+* [1800B](https://huggingface.co/LumiOpen/Viking-33B/tree/1800B)
+* [1900B](https://huggingface.co/LumiOpen/Viking-33B/tree/1900B)
+* [2000B](https://huggingface.co/LumiOpen/Viking-33B/tree/2000B)
 
 The transformers library allows you to load a checkpoint from a branch as follows: