hubert-medium-wiki

This model was trained from scratch on the Wikipedia subset of the Hungarian Webcorpus 2.0 with masked language modeling (MLM) and sentence order prediction (SOP) objectives.
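The two objectives can be sketched in plain Python. This is a minimal illustration, not the actual training code: the function names are invented here, and the 80/10/10 masking split follows the standard BERT recipe, which this card does not spell out.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", vocab=None, mlm_prob=0.15, seed=0):
    """MLM sketch: ~15% of tokens are selected; each selected token is
    replaced by [MASK] 80% of the time, by a random token 10% of the
    time, or kept unchanged 10% of the time (standard BERT recipe)."""
    rng = random.Random(seed)
    vocab = vocab or tokens
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mlm_prob:
            labels.append(tok)  # the model must predict the original token
            r = rng.random()
            if r < 0.8:
                masked.append(mask_token)
            elif r < 0.9:
                masked.append(rng.choice(vocab))
            else:
                masked.append(tok)
        else:
            labels.append(None)  # excluded from the MLM loss
            masked.append(tok)
    return masked, labels

def sop_example(segment_a, segment_b, rng):
    """SOP sketch: a positive example keeps two consecutive segments in
    order (label 1); a negative example swaps them (label 0)."""
    if rng.random() < 0.5:
        return (segment_a, segment_b), 1
    return (segment_b, segment_a), 0
```

In real pre-training both objectives operate on subword IDs from the model's tokenizer rather than raw strings, and their losses are summed.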

Pre-Training Parameters:

First phase:

  • Training steps: 500,000
  • Sequence length: 128
  • Batch size: 1024

Second phase:

  • Training steps: 100,000
  • Sequence length: 512
  • Batch size: 384
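Multiplying steps by batch size by sequence length gives the approximate number of tokens processed in each phase. This is a back-of-the-envelope calculation; it ignores padding and sequences shorter than the maximum length.

```python
# Approximate tokens seen per phase: steps * batch_size * sequence_length.
phase1 = 500_000 * 1024 * 128  # short-sequence phase
phase2 = 100_000 * 384 * 512   # long-sequence phase

print(f"phase 1: {phase1:,} tokens")  # 65,536,000,000 (~65.5B)
print(f"phase 2: {phase2:,} tokens")  # 19,660,800,000 (~19.7B)
```

As is typical for BERT-style pre-training, most of the token budget is spent in the cheaper short-sequence phase, with a shorter 512-token phase to adapt the position embeddings.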

Framework versions

  • Transformers 4.21.3
  • TensorFlow 2.10.0
  • Datasets 2.4.0
  • Tokenizers 0.12.1

Acknowledgement

Artificial Intelligence National Laboratory, Hungary
