---
language: 'no'
license: cc-by-4.0
tags:
  - norwegian
  - bert
pipeline_tag: fill-mask
widget:
  - text: På biblioteket kan du [MASK] en bok.
  - text: Dette er et [MASK] eksempel.
  - text: Av og til kan en språkmodell gi et [MASK] resultat.
  - text: >-
      Som ansat får du [MASK] for at bidrage til borgernes adgang til dansk
      kulturarv, til forskning og til samfundets demokratiske udvikling.
---

- Release 1.1 (March 11, 2021)
- Release 1.0 (January 13, 2021)

# NB-BERT-base

## Description

NB-BERT-base is a general BERT-base model built on the large digital collection at the National Library of Norway.

This model uses the same architecture as the multilingual BERT cased model, and is trained on a wide variety of Norwegian text (both Bokmål and Nynorsk) from the last 200 years.

## Intended use & limitations

The 1.1 version of the model is general, and should be fine-tuned for any particular downstream use. Some fine-tuning datasets can be found on GitHub; see https://github.com/NBAiLab/notram.
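Since the model is published as a standard BERT checkpoint with a fill-mask pipeline tag, it can also be tried out directly for masked-token prediction before any fine-tuning. A minimal sketch using the Hugging Face `transformers` pipeline, assuming the checkpoint is hosted on the Hub under the id `NbAiLab/nb-bert-base`:

```python
from transformers import pipeline

# Load a masked-language-model pipeline.
# "NbAiLab/nb-bert-base" is assumed to be this model's Hub id.
fill_mask = pipeline("fill-mask", model="NbAiLab/nb-bert-base")

# One of the widget examples from this card.
sentence = "Av og til kan en språkmodell gi et [MASK] resultat."

# Print the top candidate tokens with their scores.
for pred in fill_mask(sentence):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```

For tasks such as classification or named-entity recognition, the checkpoint should instead be fine-tuned as described above.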

## Training data

The model is trained on a wide variety of text. The training set is described at https://github.com/NBAiLab/notram.

## More information

For more information on the model, see

- https://github.com/NBAiLab/notram