Tags: Text Generation · Transformers · Safetensors · Czech · mpt · custom_code · text-generation-inference · Inference Endpoints
mfajcik committed
Commit
6668b28
1 Parent(s): 50b0998

Update README.md

Files changed (1):
  1. README.md +2 -1
README.md CHANGED
@@ -2,7 +2,8 @@
 license: apache-2.0
 ---
 # Introduction
-CSMPT7b is a large Czech language model continuously pretrained for 272B training steps from the English [MPT7b](https://huggingface.co/mosaicml/mpt-7b) model. The model was pretrained on the ~67B-token [Large Czech Collection](https://huggingface.co/datasets/BUT-FIT/but_lcc) with a Czech tokenizer obtained using our vocabulary swap method (see below).
+CSMPT7b is a large Czech language model continuously pretrained for 272B training steps from the English [MPT7b](https://huggingface.co/mosaicml/mpt-7b) model. The model was pretrained on the ~67B-token [Large Czech Collection](https://huggingface.co/datasets/BUT-FIT/but_lcc) with a Czech tokenizer obtained using our vocabulary swap method (see below).
+Training was done on the [Karolina](https://www.it4i.cz/en) cluster.
 
 # Evaluation
 Dev evaluation on CS-HellaSwag (an automatically translated HellaSwag benchmark).
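The "vocabulary swap" mentioned in the README text can be sketched roughly as follows. This is a toy illustration only, assuming the method reuses pretrained embeddings for tokens shared between the old (English) and new (Czech) vocabularies and freshly initializes the rest; the function name, data, and initialization scale here are hypothetical and not taken from the model's actual code:

```python
import random

def swap_vocabulary(old_vocab, old_embeddings, new_vocab, dim=4, seed=0):
    """Build an embedding table for new_vocab: tokens also present in the
    old vocabulary keep their pretrained vectors, while tokens new to the
    Czech vocabulary get small random vectors (hypothetical init scheme)."""
    rng = random.Random(seed)
    new_embeddings = {}
    for token in new_vocab:
        if token in old_vocab:
            # Shared token: carry the pretrained embedding over unchanged.
            new_embeddings[token] = old_embeddings[token]
        else:
            # Unseen token: random initialization (scale is an assumption).
            new_embeddings[token] = [rng.gauss(0.0, 0.02) for _ in range(dim)]
    return new_embeddings

# Toy example: "the" exists in both vocabularies, "ve" is a new Czech subword.
old = {"the": [1.0, 2.0, 3.0, 4.0], "and": [0.5, 0.5, 0.5, 0.5]}
new_emb = swap_vocabulary(set(old), old, ["the", "ve"])
```

After the swap, continued pretraining on Czech text lets the randomly initialized rows catch up while the shared rows start from their English-pretrained values.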