Update README.md
README.md CHANGED
@@ -2,4 +2,12 @@
 tags:
 - summarization
 - bart
-
+language:
+- ro
+---
+
+This is a pretrained-from-scratch **BART base** model (**140M** parameters).
+
+Training was performed on a clean **50GB Romanian** text corpus for 3M steps with these [scripts](https://github.com/cosmoquester/transformers-bart-pretrain). The model was trained with a maximum sequence length of **1024**.
+
+**!! IMPORTANT !!** This model was pretrained on the text corruption task, meaning this model is **not usable** in any downstream task **without finetuning** first!
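
The "text corruption task" the card warns about is BART's denoising pretraining objective: spans of the input are corrupted and the model learns to reconstruct the original text, so the raw checkpoint only knows how to undo noise, not how to summarize. A minimal word-level sketch of span corruption (illustrative only: the function, mask ratio, and span lengths are assumptions here, and real pretraining corrupts subword tokens with additional noising schemes):

```python
import random

def corrupt(tokens, mask_ratio=0.3, seed=0):
    """Replace random contiguous spans of tokens with a single <mask>
    token, roughly in the spirit of BART's text-infilling corruption.
    Word-level and simplified for readability."""
    rng = random.Random(seed)
    out, i = [], 0
    while i < len(tokens):
        if rng.random() < mask_ratio:
            span = 1 + rng.randrange(3)  # corrupt a span of 1-3 tokens
            out.append("<mask>")
            i += span
        else:
            out.append(tokens[i])
            i += 1
    return out

words = "the model learns to reconstruct the original text".split()
print(corrupt(words))
```

During pretraining the model sees the corrupted sequence as encoder input and the original sequence as the decoder target; finetuning then replaces that target with task-specific output (e.g. a summary), which is why the checkpoint is unusable downstream without it.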