  • Release 1.0beta (April 29, 2021)

NB-BERT-large (beta)

Description

NB-BERT-large is a general BERT-large model built on the large digital collection at the National Library of Norway.

This model is trained from scratch on a wide variety of Norwegian text (both Bokmål and Nynorsk) from the last 200 years, using a monolingual Norwegian vocabulary.
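As a minimal sketch, the model can be loaded with the Hugging Face `transformers` library; the model id matches this repository, and the `AutoTokenizer`/`AutoModelForMaskedLM` calls are standard `transformers` API (the library and `torch` must be installed):

```python
# Minimal sketch: load NB-BERT-large for masked-LM inference.
# Assumes `transformers` and `torch` are installed.
MODEL_ID = "NbAiLab/nb-bert-large"

def load_nb_bert(model_id: str = MODEL_ID):
    """Return (tokenizer, model); downloads weights on first call."""
    # Imported inside the function so the sketch can be read and
    # checked without triggering a download.
    from transformers import AutoTokenizer, AutoModelForMaskedLM
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForMaskedLM.from_pretrained(model_id)
    return tokenizer, model

# usage (downloads the weights on first call):
#   tokenizer, model = load_nb_bert()
#   inputs = tokenizer("Nasjonalbiblioteket ligger i [MASK].", return_tensors="pt")
#   outputs = model(**inputs)
```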

Intended use & limitations

The 1.0 version of the model is general-purpose and should be fine-tuned for any specific downstream task. Some fine-tuning sets may be found on GitHub; see https://github.com/NBAiLab/notram.
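As a hypothetical illustration (not taken from this model card), fine-tuning for sequence classification could be set up with the standard `transformers` Trainer API; the dataset arguments and hyperparameters below are placeholders, not recommendations:

```python
# Hypothetical fine-tuning sketch (sequence classification).
# Assumes `transformers` and `torch` are installed; hyperparameters
# are illustrative placeholders.
MODEL_ID = "NbAiLab/nb-bert-large"

def build_trainer(train_dataset, eval_dataset, num_labels=2):
    """Return a Trainer wired up for NB-BERT-large classification."""
    # Imported inside the function so the sketch is readable without
    # downloading any weights.
    from transformers import (
        AutoModelForSequenceClassification,
        Trainer,
        TrainingArguments,
    )
    model = AutoModelForSequenceClassification.from_pretrained(
        MODEL_ID, num_labels=num_labels
    )
    args = TrainingArguments(
        output_dir="nb-bert-finetuned",      # placeholder path
        num_train_epochs=3,                  # placeholder value
        per_device_train_batch_size=16,      # placeholder value
    )
    return Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        eval_dataset=eval_dataset,
    )

# usage: trainer = build_trainer(train_ds, eval_ds); trainer.train()
```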

Training data

The model is trained on a wide variety of text. The training set is described at https://github.com/NBAiLab/notram.

More information

For more information on the model, see https://github.com/NBAiLab/notram.

Model size: 356M params (Safetensors; tensor types I64, F32)
