# Work in Progress: Polish RoBERTa

The model has been trained for about 5% of the target training time. We will publish new increments as they are trained.

The model is pre-trained on the KGR10 corpus.

More about the model at [CLARIN-dspace](https://huggingface.co/clarin/roberta-polish-v1).

## Usage
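
A minimal sketch of loading the model with the `transformers` library, assuming the checkpoint is published under the `clarin/roberta-polish-v1` identifier linked above (the exact model ID and available heads may change as new increments are released):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed model ID, taken from the CLARIN-dspace link above.
tokenizer = AutoTokenizer.from_pretrained("clarin/roberta-polish-v1")
model = AutoModelForMaskedLM.from_pretrained("clarin/roberta-polish-v1")

# Encode a Polish sentence and run a forward pass.
inputs = tokenizer("Warszawa jest stolicą Polski.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)
```

Since this is a masked language model, it can also be used with the `fill-mask` pipeline once training is further along.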

## Hugging Face model hub

## Acknowledgments

[CLARIN-PL and CLARIN-BIZ project](https://clarin-pl.eu/)