---
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: detr-resnet-50-base-coco
  results: []
---

# detr-resnet-50-base-coco

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 6388.8809

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50.0
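The linear scheduler above decays the learning rate from its initial value toward zero over the course of training. A minimal sketch of that schedule in plain Python, assuming zero warmup steps (the card does not list a warmup setting) and the 200 total optimizer steps shown in the results table:

```python
def linear_lr(step, base_lr=2e-05, total_steps=200, warmup_steps=0):
    """Linear schedule: ramp up over warmup, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, midpoint, and end of the 200 training steps.
print(linear_lr(0), linear_lr(100), linear_lr(200))  # 2e-05 1e-05 0.0
```

This mirrors the shape of a linear decay schedule; the exact scheduler used in training came from the `transformers` Trainer, so per-step values may differ slightly depending on warmup configuration.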

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 4    | 6398.4209       |
| No log        | 2.0   | 8    | 6397.8584       |
| 6370.2805     | 3.0   | 12   | 6397.4790       |
| 6370.2805     | 4.0   | 16   | 6396.8320       |
| 6424.3547     | 5.0   | 20   | 6396.2930       |
| 6424.3547     | 6.0   | 24   | 6395.7231       |
| 6424.3547     | 7.0   | 28   | 6395.1377       |
| 6477.4051     | 8.0   | 32   | 6394.8188       |
| 6477.4051     | 9.0   | 36   | 6394.4785       |
| 6381.9848     | 10.0  | 40   | 6394.2212       |
| 6381.9848     | 11.0  | 44   | 6394.0664       |
| 6381.9848     | 12.0  | 48   | 6393.9253       |
| 6343.784      | 13.0  | 52   | 6393.7344       |
| 6343.784      | 14.0  | 56   | 6393.6128       |
| 6458.8668     | 15.0  | 60   | 6393.4380       |
| 6458.8668     | 16.0  | 64   | 6393.2393       |
| 6458.8668     | 17.0  | 68   | 6393.1040       |
| 6414.077      | 18.0  | 72   | 6392.9575       |
| 6414.077      | 19.0  | 76   | 6392.8301       |
| 6417.8516     | 20.0  | 80   | 6392.3057       |
| 6417.8516     | 21.0  | 84   | 6391.8311       |
| 6417.8516     | 22.0  | 88   | 6391.5532       |
| 6333.3547     | 23.0  | 92   | 6391.3403       |
| 6333.3547     | 24.0  | 96   | 6391.1934       |
| 6455.1539     | 25.0  | 100  | 6390.9741       |
| 6455.1539     | 26.0  | 104  | 6390.8228       |
| 6455.1539     | 27.0  | 108  | 6390.7607       |
| 6399.7898     | 28.0  | 112  | 6390.6655       |
| 6399.7898     | 29.0  | 116  | 6390.5859       |
| 6410.9336     | 30.0  | 120  | 6390.4907       |
| 6410.9336     | 31.0  | 124  | 6390.3389       |
| 6410.9336     | 32.0  | 128  | 6390.1978       |
| 6409.2        | 33.0  | 132  | 6390.0342       |
| 6409.2        | 34.0  | 136  | 6389.9624       |
| 6406.6211     | 35.0  | 140  | 6389.9111       |
| 6406.6211     | 36.0  | 144  | 6389.6875       |
| 6406.6211     | 37.0  | 148  | 6389.4756       |
| 6371.1539     | 38.0  | 152  | 6389.3516       |
| 6371.1539     | 39.0  | 156  | 6389.2695       |
| 6409.1055     | 40.0  | 160  | 6389.2495       |
| 6409.1055     | 41.0  | 164  | 6389.2090       |
| 6409.1055     | 42.0  | 168  | 6389.1099       |
| 6453.5285     | 43.0  | 172  | 6389.0405       |
| 6453.5285     | 44.0  | 176  | 6388.9937       |
| 6391.1004     | 45.0  | 180  | 6388.9541       |
| 6391.1004     | 46.0  | 184  | 6388.9238       |
| 6391.1004     | 47.0  | 188  | 6388.9077       |
| 6416.6641     | 48.0  | 192  | 6388.8911       |
| 6416.6641     | 49.0  | 196  | 6388.8828       |
| 6397.6828     | 50.0  | 200  | 6388.8809       |
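A quick sanity check on the numbers above: the step counter advances by 4 per epoch, which together with the batch size bounds the training-set size (the dataset size itself is not stated in this card). A minimal sketch:

```python
import math

train_batch_size = 8
steps_per_epoch = 4   # the Step column advances by 4 each epoch
num_epochs = 50

# Total optimizer steps, matching the final row of the table.
total_steps = steps_per_epoch * num_epochs
print(total_steps)  # 200

# ceil(n / 8) == 4 implies the training set had between 25 and 32 examples.
candidates = [n for n in range(1, 100)
              if math.ceil(n / train_batch_size) == steps_per_epoch]
print(min(candidates), max(candidates))  # 25 32
```

A training set this small, together with the nearly flat validation loss, suggests this run was a smoke test rather than a full training.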

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.0+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3