---
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
  - generated_from_trainer
model-index:
  - name: detr-V8
    results: []
---

# detr-V8

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.2139
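
Since the card does not yet include usage instructions, here is a minimal inference sketch using the standard DETR classes from 🤗 Transformers. The repo id `pallavJha/detr-V8` and the image path are illustrative placeholders, not values confirmed by this card:

```python
import torch
from PIL import Image
from transformers import DetrForObjectDetection, DetrImageProcessor

# Hypothetical repo id and image path -- replace with the real ones.
checkpoint = "pallavJha/detr-V8"
processor = DetrImageProcessor.from_pretrained(checkpoint)
model = DetrForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Rescale predictions to the original image size and keep confident detections.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, target_sizes=target_sizes, threshold=0.9
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```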

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
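
The optimizer line matches the 🤗 Transformers Adam defaults, so the listed values map onto a `Trainer` setup roughly as below. This is an illustrative reconstruction, not the original training script: `output_dir` is a placeholder, and the evaluation cadence is inferred from the 1000-step spacing of the results table.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters listed above;
# not the original training script.
training_args = TrainingArguments(
    output_dir="detr-V8",         # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,               # library defaults, matching the
    adam_beta2=0.999,             # optimizer line above
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="steps",  # assumption: validation loss is reported
    eval_steps=1000,              # every 1000 steps in the table below
)
```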

### Training results

| Training Loss | Epoch | Step   | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| No log        | 0.48  | 1000   | 0.3770          |
| No log        | 0.96  | 2000   | 0.3967          |
| 0.4391        | 1.43  | 3000   | 0.3822          |
| 0.4391        | 1.91  | 4000   | 0.4163          |
| 0.4434        | 2.39  | 5000   | 0.3888          |
| 0.4434        | 2.87  | 6000   | 0.3867          |
| 0.4509        | 3.35  | 7000   | 0.4205          |
| 0.4509        | 3.83  | 8000   | 0.4014          |
| 0.455         | 4.3   | 9000   | 0.4117          |
| 0.455         | 4.78  | 10000  | 0.3964          |
| 0.4476        | 5.26  | 11000  | 0.3915          |
| 0.4476        | 5.74  | 12000  | 0.3919          |
| 0.444         | 6.22  | 13000  | 0.4026          |
| 0.444         | 6.7   | 14000  | 0.3832          |
| 0.443         | 7.17  | 15000  | 0.4057          |
| 0.443         | 7.65  | 16000  | 0.3677          |
| 0.4232        | 8.13  | 17000  | 0.3746          |
| 0.4232        | 8.61  | 18000  | 0.3672          |
| 0.4202        | 9.09  | 19000  | 0.3629          |
| 0.4202        | 9.56  | 20000  | 0.3739          |
| 0.4131        | 10.04 | 21000  | 0.3712          |
| 0.4131        | 10.52 | 22000  | 0.3470          |
| 0.4131        | 11.0  | 23000  | 0.3632          |
| 0.4024        | 11.48 | 24000  | 0.3561          |
| 0.4024        | 11.96 | 25000  | 0.3562          |
| 0.4013        | 12.43 | 26000  | 0.3253          |
| 0.4013        | 12.91 | 27000  | 0.3390          |
| 0.3925        | 13.39 | 28000  | 0.3398          |
| 0.3925        | 13.87 | 29000  | 0.3460          |
| 0.3804        | 14.35 | 30000  | 0.3338          |
| 0.3804        | 14.83 | 31000  | 0.3201          |
| 0.3757        | 15.3  | 32000  | 0.3119          |
| 0.3757        | 15.78 | 33000  | 0.3106          |
| 0.3663        | 16.26 | 34000  | 0.3164          |
| 0.3663        | 16.74 | 35000  | 0.3190          |
| 0.3588        | 17.22 | 36000  | 0.3141          |
| 0.3588        | 17.69 | 37000  | 0.3262          |
| 0.3515        | 18.17 | 38000  | 0.3027          |
| 0.3515        | 18.65 | 39000  | 0.3178          |
| 0.3557        | 19.13 | 40000  | 0.3053          |
| 0.3557        | 19.61 | 41000  | 0.3032          |
| 0.3478        | 20.09 | 42000  | 0.3147          |
| 0.3478        | 20.56 | 43000  | 0.3069          |
| 0.3451        | 21.04 | 44000  | 0.3070          |
| 0.3451        | 21.52 | 45000  | 0.3055          |
| 0.3451        | 22.0  | 46000  | 0.2883          |
| 0.3367        | 22.48 | 47000  | 0.3090          |
| 0.3367        | 22.96 | 48000  | 0.2906          |
| 0.3348        | 23.43 | 49000  | 0.2805          |
| 0.3348        | 23.91 | 50000  | 0.2920          |
| 0.3298        | 24.39 | 51000  | 0.2854          |
| 0.3298        | 24.87 | 52000  | 0.2841          |
| 0.3254        | 25.35 | 53000  | 0.2822          |
| 0.3254        | 25.82 | 54000  | 0.2716          |
| 0.3169        | 26.3  | 55000  | 0.2825          |
| 0.3169        | 26.78 | 56000  | 0.2700          |
| 0.314         | 27.26 | 57000  | 0.2640          |
| 0.314         | 27.74 | 58000  | 0.2728          |
| 0.3047        | 28.22 | 59000  | 0.2654          |
| 0.3047        | 28.69 | 60000  | 0.2691          |
| 0.2999        | 29.17 | 61000  | 0.2601          |
| 0.2999        | 29.65 | 62000  | 0.2607          |
| 0.297         | 30.13 | 63000  | 0.2581          |
| 0.297         | 30.61 | 64000  | 0.2511          |
| 0.2946        | 31.09 | 65000  | 0.2557          |
| 0.2946        | 31.56 | 66000  | 0.2568          |
| 0.2912        | 32.04 | 67000  | 0.2569          |
| 0.2912        | 32.52 | 68000  | 0.2594          |
| 0.2912        | 33.0  | 69000  | 0.2553          |
| 0.2906        | 33.48 | 70000  | 0.2425          |
| 0.2906        | 33.96 | 71000  | 0.2475          |
| 0.2833        | 34.43 | 72000  | 0.2394          |
| 0.2833        | 34.91 | 73000  | 0.2422          |
| 0.278         | 35.39 | 74000  | 0.2403          |
| 0.278         | 35.87 | 75000  | 0.2349          |
| 0.2738        | 36.35 | 76000  | 0.2300          |
| 0.2738        | 36.82 | 77000  | 0.2332          |
| 0.2701        | 37.3  | 78000  | 0.2309          |
| 0.2701        | 37.78 | 79000  | 0.2298          |
| 0.2659        | 38.26 | 80000  | 0.2343          |
| 0.2659        | 38.74 | 81000  | 0.2265          |
| 0.2626        | 39.22 | 82000  | 0.2310          |
| 0.2626        | 39.69 | 83000  | 0.2255          |
| 0.259         | 40.17 | 84000  | 0.2263          |
| 0.259         | 40.65 | 85000  | 0.2282          |
| 0.2563        | 41.13 | 86000  | 0.2309          |
| 0.2563        | 41.61 | 87000  | 0.2270          |
| 0.2548        | 42.09 | 88000  | 0.2237          |
| 0.2548        | 42.56 | 89000  | 0.2203          |
| 0.254         | 43.04 | 90000  | 0.2204          |
| 0.254         | 43.52 | 91000  | 0.2218          |
| 0.254         | 44.0  | 92000  | 0.2207          |
| 0.2484        | 44.48 | 93000  | 0.2144          |
| 0.2484        | 44.95 | 94000  | 0.2194          |
| 0.2475        | 45.43 | 95000  | 0.2165          |
| 0.2475        | 45.91 | 96000  | 0.2162          |
| 0.2453        | 46.39 | 97000  | 0.2136          |
| 0.2453        | 46.87 | 98000  | 0.2152          |
| 0.2441        | 47.35 | 99000  | 0.2162          |
| 0.2441        | 47.82 | 100000 | 0.2171          |
| 0.2408        | 48.3  | 101000 | 0.2119          |
| 0.2408        | 48.78 | 102000 | 0.2131          |
| 0.2389        | 49.26 | 103000 | 0.2109          |
| 0.2389        | 49.74 | 104000 | 0.2139          |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
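
One way to reproduce this environment is to pin the listed versions; the CUDA 11.8 wheel index shown for PyTorch is the standard one, but verify it matches your hardware:

```bash
# PyTorch 2.0.1 built against CUDA 11.8, as listed above
pip install torch==2.0.1 --index-url https://download.pytorch.org/whl/cu118
pip install transformers==4.31.0 datasets==2.13.1 tokenizers==0.13.3
```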