---
license: mit
language:
  - nl
---

# GysBERT v1

GysBERT is a historical language model for Dutch, developed as part of the MacBERTh project.

The architecture follows BERT base (uncased), and the model was pre-trained with the original BERT pre-training codebase. The training material comes mostly from the DBNL and the Delpher newspaper dump. Details can be found in the accompanying publication: *Non-Parametric Word Sense Disambiguation for Historical Languages*.

The model has been successfully tested on Word Sense Disambiguation tasks, as discussed in the paper referenced above.
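Since GysBERT is a BERT-style masked language model, it can be queried through the standard `transformers` fill-mask pipeline. The sketch below is illustrative only: the model identifier `emanjavacas/GysBERT` is an assumption based on the author's Hub account name, and the example sentence is an arbitrary Dutch phrase; check the model page for the exact identifier.

```python
# Hedged usage sketch: masked-token prediction with GysBERT via the
# Hugging Face `transformers` fill-mask pipeline.
MODEL_ID = "emanjavacas/GysBERT"  # assumed identifier; verify on the model page


def top_predictions(sentence: str, k: int = 5):
    """Return the top-k (token, score) fill-mask predictions for `sentence`,
    which must contain exactly one [MASK] token."""
    from transformers import pipeline  # requires `pip install transformers`

    fill = pipeline("fill-mask", model=MODEL_ID, top_k=k)
    return [(p["token_str"], p["score"]) for p in fill(sentence)]


if __name__ == "__main__":
    try:
        # [MASK] is BERT's mask token; the sentence is a made-up example.
        for token, score in top_predictions("De schepen voeren naar [MASK]."):
            print(f"{token}\t{score:.3f}")
    except Exception as exc:  # e.g. transformers missing or no network access
        print("skipping demo:", exc)
```

The `transformers` import is deferred into the function so the module can be inspected without downloading model weights.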

An updated version with an enlarged pre-training dataset is due soon.