Danish ELECTRA small (cased)

An ELECTRA model pretrained on a custom Danish corpus (~17.5 GB). For details on the data sources and training procedure, along with benchmarks on downstream tasks, see: https://github.com/sarnikowski/danish_transformers/tree/main/electra

Usage

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("sarnikowski/electra-small-generator-da-256-cased")
model = AutoModel.from_pretrained("sarnikowski/electra-small-generator-da-256-cased")
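
Since this is the generator component of ELECTRA, it can also be used for masked-token prediction (the card's fill-mask task, with [MASK] as the mask token). A minimal sketch using the transformers fill-mask pipeline; the Danish example sentence is an illustrative assumption, not from the model card:

from transformers import pipeline

# Build a fill-mask pipeline around the generator checkpoint.
fill_mask = pipeline(
    "fill-mask",
    model="sarnikowski/electra-small-generator-da-256-cased",
)

# Illustrative sentence: "Denmark is a [MASK] in Scandinavia."
for prediction in fill_mask("Danmark er et [MASK] i Skandinavien."):
    print(prediction["token_str"], prediction["score"])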

Questions?

If you have any questions, feel free to open an issue in the danish_transformers repository, or send an email to p.sarnikowski@gmail.com
