
mario-semantic-1

This model is a fine-tuned version of nvidia/mit-b0 on the Custom Mario dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.0721
  • Mean Iou: 0.0
  • Mean Accuracy: 0.0
  • Overall Accuracy: 0.0
  • Accuracy Unlabeled: nan
  • Accuracy Mario: 0.0
  • Accuracy Ground: 0.0
  • Accuracy Enemy: 0.0
  • Accuracy Bricks: 0.0
  • Accuracy Question: 0.0
  • Iou Unlabeled: 0.0
  • Iou Mario: 0.0
  • Iou Ground: 0.0
  • Iou Enemy: 0.0
  • Iou Bricks: 0.0
  • Iou Question: 0.0
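
The card does not include usage code. The sketch below shows how a SegFormer checkpoint like this one is typically loaded for inference with transformers; the repository id, image path, and output handling are illustrative assumptions, not details confirmed by this card.

```python
# Minimal inference sketch for a fine-tuned SegFormer (mit-b0) checkpoint.
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repo id; replace with wherever this checkpoint is actually hosted.
checkpoint = "your-username/mario-semantic-1"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("mario_frame.png").convert("RGB")  # assumed input screenshot
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```

Each pixel of the predicted mask would receive one of the six classes tracked in the metrics above (unlabeled, mario, ground, enemy, bricks, question); the id-to-label mapping comes from the checkpoint's config.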

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 6e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
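
The card does not state how training was launched. Assuming the Hugging Face Trainer API was used (an assumption, not something the card confirms), the listed hyperparameters would map onto TrainingArguments roughly as follows; the output directory is a placeholder.

```python
# Hedged sketch: mapping the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mario-semantic-1",   # assumed output path
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```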

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Mario | Accuracy Ground | Accuracy Enemy | Accuracy Bricks | Accuracy Question | Iou Unlabeled | Iou Mario | Iou Ground | Iou Enemy | Iou Bricks | Iou Question |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.1471 | 0.2222 | 10 | 1.3150 | 0.0054 | 0.0409 | 0.0429 | nan | 0.0587 | 0.0 | 0.0305 | 0.0481 | 0.0674 | 0.0 | 0.0141 | 0.0 | 0.0110 | 0.0010 | 0.0063 |
| 1.0399 | 0.4444 | 20 | 1.1597 | 0.0042 | 0.0247 | 0.0335 | nan | 0.0687 | 0.0 | 0.0054 | 0.0098 | 0.0397 | 0.0 | 0.0136 | 0.0 | 0.0029 | 0.0005 | 0.0081 |
| 0.8368 | 0.6667 | 30 | 0.9484 | 0.0018 | 0.0052 | 0.0054 | nan | 0.0024 | 0.0 | 0.0098 | 0.0018 | 0.0121 | 0.0 | 0.0012 | 0.0 | 0.0049 | 0.0002 | 0.0046 |
| 0.9264 | 0.8889 | 40 | 0.7115 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7753 | 1.1111 | 50 | 0.7572 | 0.0010 | 0.0023 | 0.0038 | nan | 0.0 | 0.0 | 0.0113 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.0 | 0.0 |
| 0.6295 | 1.3333 | 60 | 0.5617 | 0.0001 | 0.0002 | 0.0003 | nan | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 |
| 0.5956 | 1.5556 | 70 | 0.4135 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5756 | 1.7778 | 80 | 0.2028 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5318 | 2.0 | 90 | 0.1185 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5351 | 2.2222 | 100 | 0.3064 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.5706 | 2.4444 | 110 | 0.1378 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4863 | 2.6667 | 120 | 0.1121 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3226 | 2.8889 | 130 | 0.2038 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.4139 | 3.1111 | 140 | 0.1520 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3983 | 3.3333 | 150 | 0.1070 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3672 | 3.5556 | 160 | 0.1282 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3324 | 3.7778 | 170 | 0.1075 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.2806 | 4.0 | 180 | 0.2677 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.2854 | 4.2222 | 190 | 0.1020 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3463 | 4.4444 | 200 | 0.0551 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1957 | 4.6667 | 210 | 0.1982 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.3063 | 4.8889 | 220 | 0.0962 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1933 | 5.1111 | 230 | 0.1172 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1833 | 5.3333 | 240 | 0.0600 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.231 | 5.5556 | 250 | 0.0519 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1516 | 5.7778 | 260 | 0.0575 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.172 | 6.0 | 270 | 0.1182 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1307 | 6.2222 | 280 | 0.0989 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1454 | 6.4444 | 290 | 0.1045 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1319 | 6.6667 | 300 | 0.0793 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1154 | 6.8889 | 310 | 0.0567 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1241 | 7.1111 | 320 | 0.0562 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1379 | 7.3333 | 330 | 0.0700 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1183 | 7.5556 | 340 | 0.0616 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.108 | 7.7778 | 350 | 0.0823 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1204 | 8.0 | 360 | 0.0661 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1391 | 8.2222 | 370 | 0.0578 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1554 | 8.4444 | 380 | 0.0643 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1338 | 8.6667 | 390 | 0.0822 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1358 | 8.8889 | 400 | 0.0997 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1704 | 9.1111 | 410 | 0.0503 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1242 | 9.3333 | 420 | 0.0692 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1153 | 9.5556 | 430 | 0.1003 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0999 | 9.7778 | 440 | 0.0909 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0968 | 10.0 | 450 | 0.0721 | 0.0 | 0.0 | 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |

Framework versions

  • Transformers 4.40.1
  • Pytorch 2.3.0
  • Datasets 2.19.0
  • Tokenizers 0.19.1