---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - biglam/nls_chapbook_illustrations
model-index:
  - name: detr-resnet-50_fine_tuned_nls_chapbooks
    results: []
widget:
  - src: >-
      https://huggingface.co/davanstrien/detr-resnet-50_fine_tuned_nls_chapbooks/resolve/main/Chapbook_Jack_the_Giant_Killer.jpg
    example_title: Jack the Giant Killer
  - src: >-
      https://huggingface.co/davanstrien/detr-resnet-50_fine_tuned_nls_chapbooks/resolve/main/PN970_G6_V3_1846_DUP_0011.jpg
    example_title: History of Valentine and Orson
---

# detr-resnet-50_fine_tuned_nls_chapbooks

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the [biglam/nls_chapbook_illustrations](https://huggingface.co/datasets/biglam/nls_chapbook_illustrations) dataset. The dataset contains images of chapbook pages with bounding boxes for the illustrations that appear on some of those pages.
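To explore the training data, the dataset can be loaded with the `datasets` library. A minimal sketch (the split name and column names printed here are assumptions; check the dataset card for the exact schema):

```python
from datasets import load_dataset

# Load the chapbook illustrations dataset from the Hugging Face Hub
dataset = load_dataset("biglam/nls_chapbook_illustrations")

# Inspect a single example; the available columns depend on the dataset
# schema -- see the dataset card for details
example = dataset["train"][0]
print(example.keys())
```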

## Model description

More information needed

## Intended uses & limitations

### Using in a Transformers pipeline

The easiest way to use this model is via a Transformers pipeline. To do this, first load the model and feature extractor:

```python
from transformers import AutoFeatureExtractor, AutoModelForObjectDetection

extractor = AutoFeatureExtractor.from_pretrained("davanstrien/detr-resnet-50_fine_tuned_nls_chapbooks")
model = AutoModelForObjectDetection.from_pretrained("davanstrien/detr-resnet-50_fine_tuned_nls_chapbooks")
```

Then create an object-detection pipeline using the model and feature extractor:

```python
from transformers import pipeline

pipe = pipeline("object-detection", model=model, feature_extractor=extractor)
```

To make predictions, pass in an image (or a file path or URL pointing to an image):

```python
>>> pipe("https://huggingface.co/davanstrien/detr-resnet-50_fine_tuned_nls_chapbooks/resolve/main/Chapbook_Jack_the_Giant_Killer.jpg")
[{'box': {'xmax': 290, 'xmin': 70, 'ymax': 510, 'ymin': 261},
  'label': 'early_printed_illustration',
  'score': 0.998455286026001}]
```
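To check the detections visually, you can draw the predicted boxes back onto the page image. A minimal sketch using Pillow and `requests` (both assumed to be installed; the URL is the example image above):

```python
import requests
from PIL import Image, ImageDraw

url = (
    "https://huggingface.co/davanstrien/detr-resnet-50_fine_tuned_nls_chapbooks"
    "/resolve/main/Chapbook_Jack_the_Giant_Killer.jpg"
)
image = Image.open(requests.get(url, stream=True).raw)

# Draw each predicted bounding box onto the page image
draw = ImageDraw.Draw(image)
for prediction in pipe(image):
    box = prediction["box"]
    draw.rectangle(
        [box["xmin"], box["ymin"], box["xmax"], box["ymax"]],
        outline="red",
        width=3,
    )
image.save("detections.jpg")
```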

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
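The `generated_from_trainer` tag indicates the model was trained with the Transformers `Trainer`. As a minimal sketch, the hyperparameters above map onto a `TrainingArguments` configuration roughly like this (the `output_dir` is a placeholder, not the author's actual path; the Adam betas and epsilon listed above are the `Trainer` defaults, so they are not set explicitly):

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the reported hyperparameters,
# not the exact training script used for this model
training_args = TrainingArguments(
    output_dir="detr-resnet-50_fine_tuned_nls_chapbooks",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```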

### Training results

### Framework versions

- Transformers 4.20.1
- Pytorch 1.12.0+cu113
- Datasets 2.3.2
- Tokenizers 0.12.1

## Example image credits

- https://commons.wikimedia.org/wiki/File:Chapbook_Jack_the_Giant_Killer.jpg
- https://archive.org/details/McGillLibrary-PN970_G6_V3_1846-1180/