---
base_model: openai/clip-vit-large-patch14
language: en
tags:
- vision
- zero-shot-classification
- plant-disease
- agriculture
- fine-tuned
datasets:
- custom
model-index:
- name: clip-vit-large-patch14-finetuned-disease
  results: []
---
# clip-vit-large-patch14-finetuned-disease

This model is a fine-tuned version of [openai/clip-vit-large-patch14](https://huggingface.co/openai/clip-vit-large-patch14) on a custom dataset for plant disease captioning. It is designed to classify images of plant leaves and generate captions describing the disease or health condition of the leaves.
## Model Description

The `clip-vit-large-patch14-finetuned-disease` model has been fine-tuned on a dataset specifically curated to identify various diseases affecting plant leaves. It uses the CLIP architecture to match images of leaves against descriptive captions, aiding the diagnosis and classification of plant diseases.
## Labels and Descriptions

The model is trained to classify the following 42 plant diseases and conditions:
```python
{
0: "Apple leaf with Apple scab",
1: "Apple leaf with Black rot",
2: "Apple leaf with Cedar apple rust",
3: "Healthy Apple leaf",
4: "Corn leaf with Cercospora leaf spot (Gray leaf spot)",
5: "Corn leaf with Common rust",
6: "Corn leaf with Northern Leaf Blight",
7: "Healthy Corn leaf",
8: "Durian leaf with Algal Leaf Spot",
9: "Durian leaf with Leaf Blight",
10: "Durian leaf with Leaf Spot",
11: "Healthy Durian leaf",
12: "Grape leaf with Black rot",
13: "Grape leaf with Esca (Black Measles)",
14: "Grape leaf with Leaf blight (Isariopsis Leaf Spot)",
15: "Healthy Grape leaf",
16: "Oil Palm leaf with brown spots",
17: "Healthy Oil Palm leaf",
18: "Oil Palm leaf with white scale",
19: "Orange leaf with Huanglongbing (Citrus greening)",
20: "Pepper bell leaf with Bacterial spot",
21: "Healthy Pepper bell leaf",
22: "Potato leaf with Early blight",
23: "Potato leaf with Late blight",
24: "Healthy Potato leaf",
25: "Rice leaf with Bacterial blight",
26: "Rice leaf with Blast",
27: "Rice leaf with Brown spot",
28: "Rice leaf with Tungro",
29: "Healthy Soybean leaf",
30: "Strawberry leaf with Leaf scorch",
31: "Healthy Strawberry leaf",
32: "Tomato leaf with Bacterial spot",
33: "Tomato leaf with Early blight",
34: "Tomato leaf with Late blight",
35: "Tomato leaf with Leaf Mold",
36: "Tomato leaf with Septoria leaf spot",
37: "Tomato leaf with Spider mites (Two-spotted spider mite)",
38: "Tomato leaf with Target Spot",
39: "Tomato leaf with Tomato Yellow Leaf Curl Virus",
40: "Tomato leaf with Tomato mosaic virus",
41: "Healthy Tomato leaf"
}
```
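Since CLIP scores an image against a list of text prompts, it is handy to derive an ordered caption list (and an inverse lookup) from this mapping. A minimal sketch, showing only three of the 42 entries for brevity:

```python
# id -> caption mapping from the table above (three entries shown here;
# the full model card defines ids 0 through 41)
id2label = {
    0: "Apple leaf with Apple scab",
    1: "Apple leaf with Black rot",
    41: "Healthy Tomato leaf",
}

# Ordered caption list for CLIP prompting; with the full dict, sorting
# the ids 0..41 yields a list whose index equals the class id
labels = [id2label[i] for i in sorted(id2label)]

# Inverse mapping, useful for recovering a class id from a predicted caption
label2id = {caption: i for i, caption in id2label.items()}
```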
## Usage

You can use the `clip-vit-large-patch14-finetuned-disease` model to classify images of plant leaves and generate captions describing their health condition or any disease present. Below is an example of how to use this model in Python with the Hugging Face Transformers library:
```python
from transformers import CLIPProcessor, CLIPModel
from PIL import Image
import requests

# Load the fine-tuned model and processor
model = CLIPModel.from_pretrained("Keetawan/clip-vit-large-patch14-plant-disease-finetuned")
processor = CLIPProcessor.from_pretrained("Keetawan/clip-vit-large-patch14-plant-disease-finetuned")

# Candidate captions, ordered by class id (use the full 42-label list above)
labels = [
    "Apple leaf with Apple scab",
    "Apple leaf with Black rot",
    ...
    "Healthy Tomato leaf",
]

# Load an image of a plant leaf
image_url = "https://example.com/path_to_your_image.jpg"
image = Image.open(requests.get(image_url, stream=True).raw)

# Prepare the inputs: one text prompt per label plus the image
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)

# Get predictions
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image  # image-text similarity scores
probs = logits_per_image.softmax(dim=1)      # convert logits to probabilities

# Print the most likely label
predicted_idx = probs.argmax(dim=1).item()
print(f"Predicted label: {labels[predicted_idx]}")
```
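Beyond the single best match, it can be useful to inspect the top-ranked candidates. The softmax/top-k step can be illustrated standalone with `torch.topk`; the logits below are dummy values standing in for `model(**inputs).logits_per_image`, and only four captions from the label list are used:

```python
import torch

# Dummy image-text similarity logits for one image against four captions
# (hypothetical values, not real model output)
logits_per_image = torch.tensor([[24.1, 19.7, 22.3, 18.0]])
captions = [
    "Apple leaf with Apple scab",
    "Apple leaf with Black rot",
    "Healthy Apple leaf",
    "Healthy Tomato leaf",
]

# Convert similarity logits to probabilities over the candidate captions
probs = logits_per_image.softmax(dim=1)

# Rank the candidates and show the two best matches with their scores
top = torch.topk(probs, k=2, dim=1)
for score, idx in zip(top.values[0], top.indices[0]):
    print(f"{captions[idx]}: {score.item():.3f}")
```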