EduFalcao committed
Commit
9917636
·
verified ·
1 Parent(s): 4881209

Delete .ipynb_checkpoints

.ipynb_checkpoints/README-checkpoint.md DELETED
@@ -1,122 +0,0 @@
---
base_model: openai/clip-vit-large-patch14
language: en
tags:
- vision
- zero-shot-classification
- plant-disease
- agriculture
- fine-tuned
datasets:
- custom
model-index:
- name: clip-vit-large-patch14-finetuned-disease
  results: []
---

# clip-vit-large-patch14-finetuned-disease

This model is a fine-tuned version of [openai/clip-vit-large-patch14](https://huggingface.co/openai/clip-vit-large-patch14), trained on a custom dataset of captioned plant leaf images. It is designed to classify images of plant leaves by matching them against captions describing the disease or health condition of the leaves.

## Model Description

The `clip-vit-large-patch14-finetuned-disease` model has been fine-tuned on a dataset specifically curated to identify various diseases affecting plant leaves. The model uses the CLIP architecture to map leaf images and descriptive captions into a shared embedding space, helping in the diagnosis and classification of plant diseases.
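
A minimal sketch of this image-caption mapping, assuming a local leaf photo saved as `leaf.jpg` (a placeholder path) and two example captions; it embeds the image and the captions with the fine-tuned checkpoint and compares them by cosine similarity:

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("Keetawan/clip-vit-large-patch14-plant-disease-finetuned")
processor = CLIPProcessor.from_pretrained("Keetawan/clip-vit-large-patch14-plant-disease-finetuned")

image = Image.open("leaf.jpg")  # placeholder path to a local leaf photo
captions = ["Potato leaf with Early blight", "Healthy Potato leaf"]  # example captions

with torch.no_grad():
    image_emb = model.get_image_features(**processor(images=image, return_tensors="pt"))
    text_emb = model.get_text_features(**processor(text=captions, return_tensors="pt", padding=True))

# Cosine similarity between the image embedding and each caption embedding
image_emb = image_emb / image_emb.norm(dim=-1, keepdim=True)
text_emb = text_emb / text_emb.norm(dim=-1, keepdim=True)
print(image_emb @ text_emb.T)  # higher score = better match
```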

### Labels and Descriptions

The model is trained to classify the following plant diseases and conditions:

```python
{
    0: "Apple leaf with Apple scab",
    1: "Apple leaf with Black rot",
    2: "Apple leaf with Cedar apple rust",
    3: "Healthy Apple leaf",
    4: "Corn leaf with Cercospora leaf spot (Gray leaf spot)",
    5: "Corn leaf with Common rust",
    6: "Corn leaf with Northern Leaf Blight",
    7: "Healthy Corn leaf",
    8: "Durian leaf with Algal Leaf Spot",
    9: "Durian leaf with Leaf Blight",
    10: "Durian leaf with Leaf Spot",
    11: "Healthy Durian leaf",
    12: "Grape leaf with Black rot",
    13: "Grape leaf with Esca (Black Measles)",
    14: "Grape leaf with Leaf blight (Isariopsis Leaf Spot)",
    15: "Healthy Grape leaf",
    16: "Oil Palm leaf with brown spots",
    17: "Healthy Oil Palm leaf",
    18: "Oil Palm leaf with white scale",
    19: "Orange leaf with Huanglongbing (Citrus greening)",
    20: "Pepper bell leaf with Bacterial spot",
    21: "Healthy Pepper bell leaf",
    22: "Potato leaf with Early blight",
    23: "Potato leaf with Late blight",
    24: "Healthy Potato leaf",
    25: "Rice leaf with Bacterial blight",
    26: "Rice leaf with Blast",
    27: "Rice leaf with Brown spot",
    28: "Rice leaf with Tungro",
    29: "Healthy Soybean leaf",
    30: "Strawberry leaf with Leaf scorch",
    31: "Healthy Strawberry leaf",
    32: "Tomato leaf with Bacterial spot",
    33: "Tomato leaf with Early blight",
    34: "Tomato leaf with Late blight",
    35: "Tomato leaf with Leaf Mold",
    36: "Tomato leaf with Septoria leaf spot",
    37: "Tomato leaf with Spider mites (Two-spotted spider mite)",
    38: "Tomato leaf with Target Spot",
    39: "Tomato leaf with Tomato Yellow Leaf Curl Virus",
    40: "Tomato leaf with Tomato mosaic virus",
    41: "Healthy Tomato leaf"
}
```
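
For zero-shot classification, this mapping is typically flattened into an ordered list of text prompts whose positions match the class ids. A minimal sketch, assuming the mapping above has been copied into a Python dict named `id2label` (an illustrative name, not part of the model's files):

```python
# id -> label mapping copied from the table above (only a few entries shown here)
id2label = {
    0: "Apple leaf with Apple scab",
    1: "Apple leaf with Black rot",
    # ... remaining entries ...
    41: "Healthy Tomato leaf",
}

# Sort by class id so each prompt's position matches its label index
labels = [id2label[i] for i in sorted(id2label)]
```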

## Usage

You can use the `clip-vit-large-patch14-finetuned-disease` model to classify images of plant leaves by scoring them against captions describing their health condition or any disease present. Below is an example of how to use this model in Python with the Hugging Face Transformers library:

```python
import requests
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Load the fine-tuned model and processor
model = CLIPModel.from_pretrained("Keetawan/clip-vit-large-patch14-plant-disease-finetuned")
processor = CLIPProcessor.from_pretrained("Keetawan/clip-vit-large-patch14-plant-disease-finetuned")

# Load an image of a plant leaf
image_url = "https://example.com/path_to_your_image.jpg"
image = Image.open(requests.get(image_url, stream=True).raw)

# Candidate labels: use the full 42-label list shown above (only a few entries shown here)
labels = [
    "Apple leaf with Apple scab",
    "Apple leaf with Black rot",
    # ...
    "Healthy Tomato leaf",
]

# Score the image against every candidate label
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)
logits_per_image = outputs.logits_per_image  # image-text similarity scores
probs = logits_per_image.softmax(dim=1)      # convert logits to probabilities

# Print the most likely label
predicted_index = probs.argmax().item()
print(f"Predicted label: {labels[predicted_index]}")
```
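
To see a ranked view rather than a single prediction, a short follow-up sketch reusing the `probs` and `labels` variables from the example above could print the three most likely conditions:

```python
# Optional: inspect the three most likely conditions with their probabilities
top_probs, top_idx = probs[0].topk(3)
for p, i in zip(top_probs.tolist(), top_idx.tolist()):
    print(f"{labels[i]}: {p:.3f}")
```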

## Citation

If you use this model in your research or applications, please cite it as follows:

```bibtex
@misc{keetawan2024plantdisease,
  author = {Keetawan Limaroon},
  title = {clip-vit-large-patch14-finetuned-disease: A fine-tuned model for plant disease classification and captioning},
  year = {2024},
  publisher = {Hugging Face},
  url = {https://huggingface.co/Keetawan/clip-vit-large-patch14-plant-disease-finetuned},
}
```
 
.ipynb_checkpoints/model-checkpoint.safetensors DELETED
@@ -1,3 +0,0 @@
version https://git-lfs.github.com/spec/v1
oid sha256:57ec4390f469f947df3ccb2d4eed9ef55a8d69d8319854d3673f42788ea3b376
size 135