Model Purpose

Classifies whether cancer is present in CT-scan images of the lungs.

Model Details

This model was trained on 224x224 grayscale images that were originally CT scans converted to JPG format. It is a fine-tuned version of the Swin Transformer (tiny-sized model, about 27.5M parameters).
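
The exact conversion pipeline is not documented here, but a minimal sketch of one way to turn a DICOM CT slice into a 224x224 grayscale JPG, assuming the pydicom library and a hypothetical input file scan.dcm, might look like this:

import numpy as np
import pydicom
from PIL import Image

# Read the DICOM slice and extract its raw pixel data (hypothetical path)
ct_slice = pydicom.dcmread("scan.dcm")
pixels = ct_slice.pixel_array.astype(np.float32)

# Rescale intensities to the 0-255 range of an 8-bit grayscale image
pixels -= pixels.min()
pixels /= max(pixels.max(), 1.0)
img = Image.fromarray((pixels * 255).astype(np.uint8), mode="L")

# Resize to the 224x224 input size the model expects and save as JPG
img.resize((224, 224)).save("scan.jpg")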

Uses

The model can be used to classify JPG images of CT scans as either cancer positive or cancer negative. With further fine-tuning, it could also serve as a starting point for other image classification tasks.
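
For a quick test, the generic transformers image-classification pipeline can load this checkpoint directly. A minimal sketch, assuming a local CT slice saved as ct_slice.jpg (hypothetical filename):

from transformers import pipeline

# Load the checkpoint behind the generic image-classification pipeline
classifier = pipeline(
    "image-classification",
    model="oohtmeel/swin-tiny-patch4-finetuned-lung-cancer-ct-scans",
)

# Returns a list of {'label': ..., 'score': ...} dicts, best score first
print(classifier("ct_slice.jpg"))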

Training Data

The model was trained on data originally obtained from the National Cancer Institute Imaging Data Commons (https://portal.imaging.datacommons.cancer.gov/explore/), specifically from the National Lung Screening Trial. The dataset consisted of about 11,000 images converted from CT scans, some containing cancerous nodules and some not.

How to Use

Upload a grayscale JPG in the model's inference section and it will return a prediction. Sample images are included in this repo: if a file name contains an X, the image is cancer negative; if it contains a Y, it is cancer positive.
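
To check a prediction against this naming convention programmatically, you can list the repo's files and read the label off each filename (a sketch, assuming the X/Y convention described above holds for every sample image):

from huggingface_hub import list_repo_files

repo_id = "oohtmeel/swin-tiny-patch4-finetuned-lung-cancer-ct-scans"

# Map each sample JPG to its label using the X/Y filename convention
for name in list_repo_files(repo_id):
    if name.endswith(".jpg"):
        label = "negative" if "_X" in name else "positive" if "_Y" in name else "unknown"
        print(name, "->", label)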

Additionally, you can download the model and run it locally:


from huggingface_hub import hf_hub_download
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Download one of the sample images included in this repo
image_path = hf_hub_download(
    repo_id="oohtmeel/swin-tiny-patch4-finetuned-lung-cancer-ct-scans",
    filename="_X000a109d-56da-4c3f-8680-55afa04d6ae0.dcm.jpg.jpg",
)
image = Image.open(image_path)

# Load the image processor and the fine-tuned model
processor = AutoImageProcessor.from_pretrained("oohtmeel/swin-tiny-patch4-finetuned-lung-cancer-ct-scans")
model = AutoModelForImageClassification.from_pretrained("oohtmeel/swin-tiny-patch4-finetuned-lung-cancer-ct-scans")

# Preprocess the image and run a forward pass
inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
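
The forward pass returns raw logits. To turn them into a class label, take the argmax and look it up in the model's label mapping (a minimal continuation of the snippet above, assuming the checkpoint's config defines id2label):

# Highest-scoring class index, mapped to its human-readable label
predicted_id = outputs.logits.argmax(-1).item()
print(model.config.id2label[predicted_id])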

Results

Test accuracy: 0.8852