---
tags:
- summarization
- bart
language:
- ro
inference: false
---

This is a pretrained-from-scratch **BART base** model (**140M** parameters).

Training was performed on a clean **50GB Romanian** text corpus for 3M steps using these [scripts](https://github.com/cosmoquester/transformers-bart-pretrain). The model was trained with a maximum sequence length of **1024**.

**!! IMPORTANT !!** This model was pretrained only on the text-corruption (denoising) objective, so it is **not usable** for any downstream task **without finetuning** first!
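To illustrate what the text-corruption pretraining objective looks like, here is a minimal sketch of BART-style text infilling: contiguous token spans are replaced with a single mask token, and the model learns to reconstruct the original sequence. This is a simplified stand-alone illustration (span lengths are drawn uniformly from 1–3 instead of BART's Poisson(λ=3), and the function/parameter names are hypothetical), not the actual training code from the scripts linked above.

```python
import random


def corrupt(tokens, mask_token="<mask>", mask_ratio=0.3, seed=0):
    """BART-style text infilling (simplified): replace contiguous
    spans of tokens with a single mask token until roughly
    `mask_ratio` of the tokens have been masked."""
    rng = random.Random(seed)
    tokens = list(tokens)
    budget = max(1, int(len(tokens) * mask_ratio))
    while budget > 0 and len(tokens) > 1:
        # Simplification: uniform span length 1-3 instead of Poisson(3).
        span = min(budget, rng.randint(1, 3))
        start = rng.randrange(0, len(tokens) - span + 1)
        # The whole span collapses into a single mask token,
        # so the model must also infer how many tokens are missing.
        tokens[start:start + span] = [mask_token]
        budget -= span
    return tokens


original = "ana are mere si pere rosii".split()
corrupted = corrupt(original)
```

During pretraining, the corrupted sequence is the encoder input and the original sequence is the decoder target; finetuning replaces this objective with a task-specific one (e.g. summarization).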