jonabur committed on
Commit c251a8a
1 Parent(s): a0340bb

update for 500B release

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -14,7 +14,7 @@ datasets:
 
 _**NOTE:** This is a **research checkpoint** of a model for which **training has not been completed.** It is being provided in its current state for research and testing purposes. **Care should be taken when using the outputs of the model.** Once pretraining has completed we intend to release additional instruction-tuned and chat-tuned varieties._
 
-Poro is a 34B parameter decoder-only transformer pretrained on Finnish, English and code. It is being trained on 1 trillion tokens (300 billion as of this release). Poro is a fully open source model and is made available under the Apache 2.0 License.
+Poro is a 34B parameter decoder-only transformer pretrained on Finnish, English and code. It is being trained on 1 trillion tokens (500 billion as of this release). Poro is a fully open source model and is made available under the Apache 2.0 License.
 
 Poro was created in a collaboration between [SiloGen](https://www.silo.ai/silogen) from [Silo AI](https://www.silo.ai/), the [TurkuNLP group](https://turkunlp.org/) of the University of Turku, and [High Performance Language Technologies](https://hplt-project.org/) (HPLT). Training was conducted on the [LUMI supercomputer](https://www.lumi-supercomputer.eu/), using compute resources generously provided by [CSC](https://csc.fi/) - IT Center for Science, Finland.
 
@@ -45,6 +45,8 @@ Checkpoints are available as branches in the repository. Checkpoints will be re
 * [100B](https://huggingface.co/LumiOpen/Poro-34B/tree/100B)
 * [200B](https://huggingface.co/LumiOpen/Poro-34B/tree/200B)
 * [300B](https://huggingface.co/LumiOpen/Poro-34B/tree/300B)
+* [400B](https://huggingface.co/LumiOpen/Poro-34B/tree/400B)
+* [500B](https://huggingface.co/LumiOpen/Poro-34B/tree/500B)
 
 The transformers library allows you to load a checkpoint from a branch as follows:
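The snippet the README refers to falls outside this diff's context, so it is not shown above. For reference, the following is a minimal sketch of loading a checkpoint from a branch with the transformers `revision` argument; the branch name `500B` is one of the checkpoints listed in the diff, and the exact code in the README may differ.

```python
# Minimal sketch (not necessarily the README's exact snippet): load a
# specific training checkpoint of Poro-34B by passing the branch name
# via the standard `revision` argument of from_pretrained.
from transformers import AutoModelForCausalLM, AutoTokenizer

branch = "500B"  # any of the checkpoint branches listed above

tokenizer = AutoTokenizer.from_pretrained("LumiOpen/Poro-34B", revision=branch)
model = AutoModelForCausalLM.from_pretrained(
    "LumiOpen/Poro-34B",
    revision=branch,
    torch_dtype="auto",  # assumption: let transformers choose the dtype
)
```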