Model Card for amtsal/resnet18_finetunewithcifar10

This model is a fine-tuned version of microsoft/resnet-18 on the cifar10 dataset. It achieves the following results on the evaluation set:

  • Accuracy: 0.8061
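
The snippet below is a minimal inference sketch, not documented usage: it assumes the uploaded weights are a plain torchvision-style resnet18 state dict with a 10-class head, and the checkpoint filename, 224x224 input size, and ImageNet normalization are assumptions rather than facts recorded in this repository.

```python
# Hedged inference sketch. Assumptions: the checkpoint is a torchvision
# resnet18 state dict with a 10-class head; "pytorch_model.bin" is a
# hypothetical filename -- adjust to the actual file in this repo.
import torch
import torch.nn as nn
from PIL import Image
from torchvision import models, transforms

# Rebuild the architecture: ResNet-18 with the final layer resized to CIFAR-10's 10 classes.
model = models.resnet18()
model.fc = nn.Linear(model.fc.in_features, 10)
model.load_state_dict(torch.load("pytorch_model.bin", map_location="cpu"))
model.eval()

# ImageNet-style preprocessing; the exact transform used for fine-tuning is not documented here.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# CIFAR-10 label order.
classes = ["airplane", "automobile", "bird", "cat", "deer",
           "dog", "frog", "horse", "ship", "truck"]

image = Image.open("example.jpg").convert("RGB")
with torch.no_grad():
    logits = model(preprocess(image).unsqueeze(0))
print(classes[logits.argmax(dim=1).item()])
```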

Model description

A ResNet-18 image classifier initialized from the ImageNet-pretrained microsoft/resnet-18 checkpoint, with the classification head replaced by a 10-way layer and fine-tuned on CIFAR-10 (classes: airplane, automobile, bird, cat, deer, dog, frog, horse, ship, truck).

Intended uses & limitations

The model is intended for 10-class image classification over the CIFAR-10 label set. It has only been evaluated on the CIFAR-10 test split (accuracy 0.8061); its behavior on higher-resolution images or other domains has not been assessed.

Training and evaluation data

The model was fine-tuned on the CIFAR-10 training split (50,000 32x32 color images across 10 classes) and evaluated on the CIFAR-10 test split (10,000 images).

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a sketch reconstructing the training loop from them appears after the list:

  • learning_rate: 0.001
  • train_batch_size: 32
  • optimizer: SGD with momentum = 0.9
  • num_epochs: 7
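
The sketch below reconstructs a training loop consistent with these hyperparameters. It is not the author's script: torchvision's ImageNet-pretrained ResNet-18 stands in for the microsoft/resnet-18 initialization, and the input size, normalization, and logging cadence are assumptions.

```python
# Reconstruction sketch of the fine-tuning loop implied by the hyperparameters above.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),   # assumed: upscale 32x32 CIFAR-10 images to the ImageNet input size
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
train_set = datasets.CIFAR10(root="./data", train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)   # train_batch_size: 32

# Start from an ImageNet-pretrained ResNet-18 (stand-in for microsoft/resnet-18)
# and replace the classifier head with a 10-class layer.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 10)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)  # learning_rate: 0.001

for epoch in range(7):               # num_epochs: 7
    model.train()
    running_loss = 0.0
    for i, (images, labels) in enumerate(train_loader, start=1):
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item()
        if i % 100 == 0:             # matches the logging cadence in the results below
            print(f"Epoch [{epoch + 1}/7], Batch [{i}/{len(train_loader)}], Loss: {running_loss / 100:.4f}")
            running_loss = 0.0
```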

Training results

The raw log reports the running training loss every 100 batches and the CIFAR-10 test accuracy after each epoch. Per-epoch summary (training loss at batch 1500/1563 and end-of-epoch test accuracy):

  • Epoch 1: loss 0.8174, test accuracy 73.55%
  • Epoch 2: loss 0.6062, test accuracy 76.96%
  • Epoch 3: loss 0.5025, test accuracy 78.17%
  • Epoch 4: loss 0.4076, test accuracy 79.47%
  • Epoch 5: loss 0.3478, test accuracy 78.57%
  • Epoch 6: loss 0.2926, test accuracy 80.09%
  • Epoch 7: loss 0.2518, test accuracy 79.59%
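
The per-epoch test accuracies above correspond to a standard top-1 evaluation over the 10,000-image CIFAR-10 test split. Below is a sketch of that evaluation, continuing from the training sketch above (it reuses model, device, and transform) and assuming the same preprocessing is applied at test time.

```python
# Evaluation sketch: top-1 accuracy on the CIFAR-10 test split.
# Continues from the training sketch above (reuses model, device, transform,
# and the datasets/DataLoader imports).
test_set = datasets.CIFAR10(root="./data", train=False, download=True, transform=transform)
test_loader = DataLoader(test_set, batch_size=32, shuffle=False)

model.eval()
correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        images, labels = images.to(device), labels.to(device)
        predictions = model(images).argmax(dim=1)
        correct += (predictions == labels).sum().item()
        total += labels.size(0)
print(f"Test Accuracy: {100.0 * correct / total:.2f}%")
```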

Framework versions

  • PyTorch 2.0.1+cu118
  • Datasets 2.14.5
