
Training parameters

  • learning_rate: 1e-05
  • train_batch_size: 20
  • eval_batch_size: 32
  • optimizer: paged_adamw_32bit
  • num_epochs: 1
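
The hyperparameters above map naturally onto the Hugging Face Trainer API. The sketch below is an assumption about how they would be passed (the actual training script is not published; `output_dir` is a placeholder):

```python
# Hypothetical sketch: the model card's hyperparameters expressed as
# keyword arguments for transformers.TrainingArguments. Names follow the
# HF Trainer API; the original training script is not published.
training_args_kwargs = dict(
    learning_rate=1e-5,         # learning_rate from the card
    per_device_train_batch_size=20,   # train_batch_size
    per_device_eval_batch_size=32,    # eval_batch_size
    optim="paged_adamw_32bit",  # bitsandbytes paged AdamW optimizer
    num_train_epochs=1,         # num_epochs
)

# With transformers installed, these would be used as:
# from transformers import TrainingArguments, Trainer
# args = TrainingArguments(output_dir="out", **training_args_kwargs)
print(training_args_kwargs["learning_rate"])  # prints 1e-05
```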

Framework versions

  • Transformers 4.35.2
  • PyTorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0
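
To reproduce this environment, the versions above can be pinned in a `requirements.txt` (a sketch; the PyTorch `+cu118` build additionally requires installing from the CUDA 11.8 wheel index):

```
transformers==4.35.2
torch==2.1.0
datasets==2.15.0
tokenizers==0.15.0
```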
Model details

  • Format: Safetensors
  • Model size: 124M params
  • Tensor type: F32

Dataset used to train NatureUniverse/alpaca-gpt4
