Text Generation
Transformers
Safetensors
Finnish
English
bloom
Inference Endpoints
text-generation-inference
jonabur committed on
Commit
e9441f3
1 Parent(s): 58b5f18

update README for final release

Files changed (1)
  1. README.md +6 -9
README.md CHANGED
@@ -15,9 +15,7 @@ language:
 
 # Poro 34B Model Card
 
-_**NOTE:** This is a **research checkpoint** of a model for which **training has not been completed.** It is being provided in its current state for research and testing purposes. **Care should be taken when using the outputs of the model.** Once pretraining has completed we intend to release additional instruction-tuned and chat-tuned varieties._
-
-Poro is a 34B parameter decoder-only transformer pretrained on Finnish, English and code. It is being trained on 1 trillion tokens (700 billion as of this release). Poro is a fully open source model and is made available under the Apache 2.0 License.
+Poro is a 34B parameter decoder-only transformer pretrained on Finnish, English and code. It was trained on 1 trillion tokens. Poro is a fully open source model and is made available under the Apache 2.0 License.
 
 Poro was created in a collaboration between [SiloGen](https://www.silo.ai/silogen) from [Silo AI](https://www.silo.ai/), the [TurkuNLP group](https://turkunlp.org/) of the University of Turku, and [High Performance Language Technologies](https://hplt-project.org/) (HPLT). Training was conducted on the [LUMI supercomputer](https://www.lumi-supercomputer.eu/), using compute resources generously provided by [CSC](https://csc.fi/) - IT Center for Science, Finland.
 
@@ -52,6 +50,9 @@ Checkpoints are available as branches in the repository. Checkpoints will be re
 * [500B](https://huggingface.co/LumiOpen/Poro-34B/tree/500B)
 * [600B](https://huggingface.co/LumiOpen/Poro-34B/tree/600B)
 * [700B](https://huggingface.co/LumiOpen/Poro-34B/tree/700B)
+* [800B](https://huggingface.co/LumiOpen/Poro-34B/tree/800B)
+* [900B](https://huggingface.co/LumiOpen/Poro-34B/tree/900B)
+* [1000B](https://huggingface.co/LumiOpen/Poro-34B/tree/1000B)
 
 The transformers library allows you to load a checkpoint from a branch as follows:
 
@@ -112,16 +113,12 @@ The Finnish dataset is a combination of many Finnish resources:
 
 ## Evaluation Results
 
-Despite the early training stage, Poro already exceeds the performance of the Finnish-only [FinGPT](https://turkunlp.org/gpt3-finnish) language models on the [FIN-bench](https://github.com/TurkuNLP/FIN-bench) Finnish language benchmark.
-
-Full evaluation results will be published with the final model.
+Full evaluation results will be published soon.
 
 ## Ethical Considerations and Limitations
 
-_Poro 34B is a release of a partially trained model, and special care should be taken when using any output._
-
 Poro is an advanced language model, primarily optimized for English, Finnish and code, with no meaningful proficiency in any other languages. As with most AI-driven systems, Poro is a product of the vast data it has been trained on, which may reflect the imperfections, biases, and idiosyncrasies of the wider web. Poro may, at times, produce outputs that can be considered inaccurate, prejudiced, or controversial. Users and developers engaging with Poro should exercise discretion and consider additional evaluation and customization to ensure the model's responses align with their specific needs and ethical standards.
 
 ## License
 
-Poro is released under the Apache 2.0 license.
+Poro is released under the Apache 2.0 license.
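The checkpoint-loading snippet that the second hunk's context refers to ("load a checkpoint from a branch") is collapsed out of this diff. For reference, here is a minimal sketch of what branch loading looks like with transformers; `revision` is the library's standard argument for selecting a git branch, while the branch name, dtype, and device settings below are illustrative choices, not quoted from the README:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "LumiOpen/Poro-34B"
BRANCH = "1000B"  # any checkpoint branch from the list above

# `revision` selects a git branch (or tag/commit) in the model repository,
# so each intermediate checkpoint can be loaded by its branch name.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, revision=BRANCH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    revision=BRANCH,
    torch_dtype="auto",  # keep the dtype stored in the checkpoint
    device_map="auto",   # needs `accelerate`; shards the 34B weights across devices
)
```

Omitting `revision` loads the default `main` branch, i.e. the final model this commit documents.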