---
language: no
license: cc-by-4.0
pipeline_tag: fill-mask
tags:
- norwegian
- bert
thumbnail: nblogo_3.png
---
**Release 1.0** (January 13, 2021)
# NB-BERT
## Description
NB-BERT is a general BERT-base model built on the large digital collection at the National Library of Norway.
It uses the same architecture as the [BERT Cased multilingual model](https://github.com/google-research/bert/blob/master/multilingual.md) and is trained on a wide variety of Norwegian text (both bokmål and nynorsk) from the last 200 years.
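Since this is a masked-language model, it can be tried directly with the `fill-mask` pipeline from the `transformers` library. A minimal sketch, assuming the model is published on the Hugging Face Hub under the repo id `NBAiLab/nb-bert-base` (the repo id is an assumption, as is the example sentence):

```python
from transformers import pipeline

# Load a fill-mask pipeline backed by NB-BERT.
# Repo id "NBAiLab/nb-bert-base" is assumed here.
unmasker = pipeline("fill-mask", model="NBAiLab/nb-bert-base")

# BERT-style models use the [MASK] token; the sentence is illustrative.
predictions = unmasker("Nasjonalbiblioteket ligger i [MASK].")

for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict containing the filled-in token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).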
## Intended use & limitations
The 1.0 version of the model is general and should be fine-tuned for any particular downstream task. Some fine-tuning sets can be found on GitHub, see
* https://github.com/NBAiLab/notram
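As a starting point for fine-tuning, the pretrained weights can be loaded with a task-specific head. A minimal sketch for sequence classification, again assuming the Hub repo id `NBAiLab/nb-bert-base` and an illustrative two-label task:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id "NBAiLab/nb-bert-base" and num_labels=2 are illustrative assumptions.
tokenizer = AutoTokenizer.from_pretrained("NBAiLab/nb-bert-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "NBAiLab/nb-bert-base",
    num_labels=2,
)

# The classification head is randomly initialized and must be trained
# on labeled data (e.g. with the transformers Trainer API).
```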
## Training data
The model is trained on a wide variety of text. The training set is described at
* https://github.com/NBAiLab/notram
## More information
For more information on the model, see
https://github.com/NBAiLab/notram