
Introduction

This project focuses on training a multi-label classification model and a sequence-to-sequence model on South Korean lottery number data. The goal is to predict future lottery numbers based on historical draws. I use Python, PyTorch, and the Hugging Face Transformers library for this purpose.
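As a rough sketch of how the multi-label framing can work (the exact encoding used in this project lives in the repository linked below; this is an illustrative assumption), each draw of six numbers from 1–45 can be represented as a 45-dimensional multi-hot target vector, one label per possible ball:

```python
import torch

NUM_BALLS = 45  # South Korean Lotto 6/45 draws six numbers from 1 to 45

def draw_to_multi_hot(draw: list[int]) -> torch.Tensor:
    """Convert a list of drawn numbers (1-45) into a multi-hot label vector."""
    target = torch.zeros(NUM_BALLS)
    for number in draw:
        target[number - 1] = 1.0
    return target

# Example: the draw (3, 11, 14, 22, 35, 41) maps to a vector with six ones.
labels = draw_to_multi_hot([3, 11, 14, 22, 35, 41])
print((labels.nonzero(as_tuple=True)[0] + 1).tolist())  # [3, 11, 14, 22, 35, 41]
```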

Disclaimer: This project is intended purely for entertainment purposes. Lottery draws are independent events, and the outcomes of previous draws have no bearing on future ones. This project should not be taken as a serious attempt to predict lottery numbers. Users are advised to view this as a reference and not to rely on it for gambling decisions.

Additional Note: Decisions to purchase lottery tickets based on this project's output are solely the responsibility of the viewer. The creator of this project bears no responsibility for any gambling decisions made based on the information provided here.

For more information, please visit https://github.com/l-yohai/lotto.

bert_base_lotto

This model is a fine-tuned version of bert-base-cased on South Korean lottery draw data (no dataset is linked on the Hub). It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 0.3487
  • Accuracy: 0.4133
  • Precision: 0.4133
  • Recall: 0.4133
  • F1: 0.4133
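
A minimal inference sketch, assuming the checkpoint is loaded as a multi-label sequence classifier with one label per ball. The model id, the text serialization of a past draw, and the label-to-number mapping are all assumptions for illustration, not documented behavior:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "bert_base_lotto"  # hypothetical id; substitute the actual Hub repo

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id, problem_type="multi_label_classification"
)

# Assumed input format: a recent draw serialized as plain text.
inputs = tokenizer("3 11 14 22 35 41", return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Sigmoid per label, then take the six highest-scoring labels as the
# predicted numbers (assuming label index i corresponds to ball i + 1).
probs = torch.sigmoid(logits)[0]
predicted = sorted((torch.topk(probs, k=6).indices + 1).tolist())
print(predicted)
```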

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: constant
  • num_epochs: 10
  • mixed_precision_training: Native AMP
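
For reference, these settings correspond roughly to the following Hugging Face TrainingArguments; this is a sketch, and output_dir is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert_base_lotto",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=10,
    fp16=True,  # Native AMP mixed-precision training
)
```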

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.3663        | 1.0   | 18   | 0.3894          | 0.175    | 0.175     | 0.175  | 0.175  |
| 0.3938        | 2.0   | 36   | 0.3837          | 0.2417   | 0.2417    | 0.2417 | 0.2417 |
| 0.3925        | 3.0   | 54   | 0.3812          | 0.2683   | 0.2683    | 0.2683 | 0.2683 |
| 0.3892        | 4.0   | 72   | 0.3767          | 0.295    | 0.295     | 0.295  | 0.295  |
| 0.3768        | 5.0   | 90   | 0.3742          | 0.305    | 0.305     | 0.305  | 0.305  |
| 0.3852        | 6.0   | 108  | 0.3682          | 0.3317   | 0.3317    | 0.3317 | 0.3317 |
| 0.3747        | 7.0   | 126  | 0.3636          | 0.3583   | 0.3583    | 0.3583 | 0.3583 |
| 0.3881        | 8.0   | 144  | 0.3594          | 0.4      | 0.4       | 0.4    | 0.4000 |
| 0.3794        | 9.0   | 162  | 0.3550          | 0.4183   | 0.4183    | 0.4183 | 0.4183 |
| 0.3764        | 10.0  | 180  | 0.3487          | 0.4133   | 0.4133    | 0.4133 | 0.4133 |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1

License

This project is licensed under the CC BY-NC 4.0 license.
