Swe Roberta Wiki Oscar

Description

This RoBERTa model was trained on the Swedish Wikipedia and OSCAR datasets.
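
Since the model is trained for masked language modeling, it can be loaded with the `transformers` fill-mask pipeline. A minimal sketch (the example sentence and `top_k` value are illustrative, not from the card):

```python
from transformers import pipeline

# Model id as published on the Hugging Face Hub.
MODEL_ID = "flax-community/swe-roberta-wiki-oscar"

def top_predictions(text, k=5):
    """Return the k most likely fillings for the <mask> token in `text`."""
    unmasker = pipeline("fill-mask", model=MODEL_ID)
    return [pred["token_str"] for pred in unmasker(text, top_k=k)]

if __name__ == "__main__":
    # Swedish example: "The capital of Sweden is <mask>."
    print(top_predictions("Huvudstaden i Sverige är <mask>."))
```

The first call downloads the model weights from the Hub; subsequent calls use the local cache.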

Model series

This model is part of a series of models trained on TPU with Flax/JAX during the Hugging Face Flax/JAX community challenge.

GPT models

Swedish GPT

https://huggingface.co/birgermoell/swedish-gpt/

Swedish GPT wiki

https://huggingface.co/flax-community/swe-gpt-wiki

Nordic GPT wiki

https://huggingface.co/flax-community/nordic-gpt-wiki

Dansk GPT wiki

https://huggingface.co/flax-community/dansk-gpt-wiki

Norsk GPT wiki

https://huggingface.co/flax-community/norsk-gpt-wiki

RoBERTa models

Nordic Roberta Wiki

https://huggingface.co/flax-community/nordic-roberta-wiki

Swe Roberta Wiki Oscar

https://huggingface.co/flax-community/swe-roberta-wiki-oscar

Roberta Swedish Scandi

https://huggingface.co/birgermoell/roberta-swedish-scandi

Roberta Swedish

https://huggingface.co/birgermoell/roberta-swedish

Swedish T5 model

https://huggingface.co/birgermoell/t5-base-swedish

