
RoBERTa-Base

This repository contains a roberta-base model trained from scratch on the Norwegian training subset of the OSCAR corpus, roughly 4.7 GB of data, following this example.

Training was done in Flax on a TPUv3-8. More statistics on the training run can be found under tf.hub.
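A training run like the one described above could be launched with the `run_mlm_flax.py` script from the Hugging Face Transformers Flax language-modeling examples. This is a minimal sketch, not the exact command used for this repo: the config/tokenizer directory, sequence length, batch size, and learning rate are assumptions, while `unshuffled_deduplicated_no` is the OSCAR config name for the Norwegian subset.

```shell
# Sketch of a from-scratch roberta-base MLM run in Flax on OSCAR Norwegian.
# Assumes a local ./norwegian-roberta-base directory holding a roberta-base
# config.json and a tokenizer trained on the Norwegian data beforehand.
python run_mlm_flax.py \
    --output_dir ./norwegian-roberta-base \
    --model_type roberta \
    --config_name ./norwegian-roberta-base \
    --tokenizer_name ./norwegian-roberta-base \
    --dataset_name oscar \
    --dataset_config_name unshuffled_deduplicated_no \
    --max_seq_length 128 \
    --per_device_train_batch_size 128 \
    --learning_rate 3e-4 \
    --num_train_epochs 18
```

On a TPUv3-8, Flax/JAX shards the per-device batch across the 8 cores automatically via `pmap`, which is why the batch size above is specified per device.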
