
Danish ELECTRA small (cased)

An ELECTRA model pretrained on a custom Danish corpus (~17.5 GB). For details on the data sources and training procedure, along with benchmarks on downstream tasks, see: https://github.com/sarnikowski/danish_transformers/tree/main/electra

Usage

from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("sarnikowski/electra-small-generator-da-256-cased")
model = AutoModel.from_pretrained("sarnikowski/electra-small-generator-da-256-cased")
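
Since this checkpoint is the ELECTRA generator, it can also be loaded as a masked language model. The sketch below uses the transformers fill-mask pipeline; the Danish example sentence is only an illustration and is not taken from the model card.

from transformers import pipeline

# Minimal sketch (assumption): use the generator as a masked LM via the fill-mask pipeline
fill_mask = pipeline(
    "fill-mask",
    model="sarnikowski/electra-small-generator-da-256-cased",
)

# Build a masked sentence using the tokenizer's own mask token
masked_sentence = f"København er hovedstaden i {fill_mask.tokenizer.mask_token}."

# Print the top predictions for the masked position
print(fill_mask(masked_sentence))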

Questions?

If you have any questions, feel free to open an issue in the danish_transformers repository, or send an email to p.sarnikowski@gmail.com.
