AdityaAdaki committed
Commit b7729a0 · 1 Parent(s): f93bf21

added ollama library link

Files changed (1)
  1. README.md +16 -16
README.md CHANGED
@@ -1,6 +1,6 @@
 ---
-title: "AgriAssist-LLM: Plant Disease Information Assistant"
-description: "AgriAssist-LLM is a fine-tuned large language model based on Llama3.2:1B, designed to assist Indian farmers with plant disease identification and management."
+title: "AgriLlama: Plant Disease Information Assistant"
+description: "AgriLlama is a fine-tuned large language model based on Llama3.2:1B, designed to assist Indian farmers with plant disease identification and management."
 version: "1.0"
 author: "Sike Aditya"
 repository: "https://huggingface.co/sikeaditya/agri_assist_llm"
@@ -21,7 +21,7 @@ usage:
 installation:
   ollama: "curl -fsSL https://ollama.ai/install.sh | sh"
 usage_examples:
-  - command: "ollama run AgriAssist-LLM 'Explain Red Rot in sugarcane in simple terms for Indian farmers.'"
+  - command: "ollama run AgriLlama 'Explain Red Rot in sugarcane in simple terms for Indian farmers.'"
     description: "Provides an easy-to-understand explanation of Red Rot disease in sugarcane."
 dataset:
   crops:
@@ -52,9 +52,9 @@ contact:
   email: "[email protected]"
   issues: "https://github.com/sikeaditya/agri_assist_llm/issues"
 ---
-# AgriAssist-LLM: Plant Disease Information Assistant
+# AgriLlama: Plant Disease Information Assistant
 
-AgriAssist-LLM is a fine-tuned large language model based on Llama3.2:1B, specifically designed to provide detailed, actionable information about plant diseases to Indian farmers. It offers clear, concise, and locally relevant guidance on disease identification, symptoms, causes, severity, and treatment measures across major crops such as Sugarcane, Maize, Cotton, Rice, and Wheat.
+AgriLlama is a fine-tuned large language model based on Llama3.2:1B, specifically designed to provide detailed, actionable information about plant diseases to Indian farmers. It offers clear, concise, and locally relevant guidance on disease identification, symptoms, causes, severity, and treatment measures across major crops such as Sugarcane, Maize, Cotton, Rice, and Wheat.
 
 ## Features
 
@@ -71,7 +71,7 @@ AgriAssist-LLM is a fine-tuned large language model based on Llama3.2:1B, specif
 
 ## Installation
 
-To use AgriAssist-LLM, install the required libraries:
+To use AgriLlama, install the required libraries:
 
 ```bash
 pip install transformers torch
@@ -81,13 +81,13 @@ pip install transformers torch
 
 ### Using Hugging Face Transformers
 
-Here’s an example of how to use AgriAssist-LLM with the Hugging Face Transformers library:
+Here’s an example of how to use AgriLlama with the Hugging Face Transformers library:
 
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM
 # Load the tokenizer and model from the Hugging Face Hub
-tokenizer = AutoTokenizer.from_pretrained("your-username/AgriAssist-LLM")
-model = AutoModelForCausalLM.from_pretrained("your-username/AgriAssist-LLM")
+tokenizer = AutoTokenizer.from_pretrained("your-username/AgriLlama")
+model = AutoModelForCausalLM.from_pretrained("your-username/AgriLlama")
 # Define a prompt
 prompt = "Explain Red Rot in sugarcane in simple terms for Indian farmers."
 # Tokenize and generate a response
@@ -97,11 +97,11 @@ outputs = model.generate(**inputs, max_new_tokens=256)
 print(tokenizer.decode(outputs[0], skip_special_tokens=True))
 ```
 
-*Note:* Replace `your-username/AgriAssist-LLM` with the actual path of your repository.
+*Note:* Replace `your-username/AgriLlama` with the actual path of your repository.
 
 ### Using Ollama
 
-You can also use AgriAssist-LLM with [Ollama](https://ollama.ai), a simple way to run large language models locally.
+You can also use AgriLlama with [Ollama](https://ollama.ai), a simple way to run large language models locally.
 
 1. Install Ollama if you haven't already:
 
@@ -109,22 +109,22 @@ You can also use AgriAssist-LLM with [Ollama](https://ollama.ai), a simple way t
 curl -fsSL https://ollama.ai/install.sh | sh
 ```
 
-2. Pull the model from Hugging Face and convert it to an Ollama-compatible format.
+2. Pull the model from the Ollama [Library](https://ollama.com/sike_aditya/AgriLlama):
 ```bash
-ollama run hf.co/sikeaditya/agri_assist_llm
+ollama pull sike_aditya/AgriLlama
 ```
 
 3. Run the model using Ollama:
 
 ```bash
-ollama run AgriAssist-LLM "Explain Red Rot in sugarcane in simple terms for Indian farmers."
+ollama run AgriLlama "Explain Red Rot in sugarcane in simple terms for Indian farmers."
 ```
 
 This will generate a response based on the model’s fine-tuned dataset.
 
 ## Fine-Tuning and Training
 
-AgriAssist-LLM was fine-tuned using a custom dataset created in the Alpaca Instruct Format. The dataset covers detailed plant disease information tailored to the Indian context and includes samples for:
+AgriLlama was fine-tuned using a custom dataset created in the Alpaca Instruct Format. The dataset covers detailed plant disease information tailored to the Indian context and includes samples for:
 
 - **Sugarcane:** Bacterial Blight, Healthy, Red Rot
 - **Maize:** Blight, Common Rust, Gray Leaf Spot, Healthy
@@ -143,5 +143,5 @@ For questions or suggestions, please open an issue in the repository or contact
 
 ---
 
-Happy farming with AgriAssist-LLM!
+Happy farming with AgriLlama!
 
 
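The Transformers example in the diff drives the tokenizer and model by hand. An equivalent sketch using the higher-level `pipeline` helper is shown below; it keeps the same placeholder repository id as the README, so replace it with the real path (the metadata points to `sikeaditya/agri_assist_llm`):

```python
from transformers import pipeline

# The text-generation pipeline wraps tokenizer and model loading in one call.
# "your-username/AgriLlama" is the README's placeholder repo id, not a real path.
generator = pipeline("text-generation", model="your-username/AgriLlama")

prompt = "Explain Red Rot in sugarcane in simple terms for Indian farmers."
result = generator(prompt, max_new_tokens=256)
print(result[0]["generated_text"])
```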
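Beyond `ollama run`, the pulled model can also be queried from code through Ollama's local REST API. A minimal sketch, assuming a default local Ollama install listening on port 11434 and the `sike_aditya/AgriLlama` name used in the pull step of the diff; adjust the model name if yours differs:

```python
import json
import urllib.request

# Query the locally pulled model via Ollama's /api/generate endpoint
# (a default Ollama install listens on http://localhost:11434).
payload = {
    "model": "sike_aditya/AgriLlama",  # name assumed from the pull step above
    "prompt": "Explain Red Rot in sugarcane in simple terms for Indian farmers.",
    "stream": False,  # ask for one complete JSON response instead of a stream
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

The same request works for any model name that `ollama list` reports on the machine.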
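The fine-tuning section states that the dataset uses the Alpaca Instruct Format but does not show a record. For orientation, a hypothetical record in that layout; the instruction/input/output keys follow the standard Alpaca convention, and the disease text here is illustrative, not taken from the actual training data:

```python
# Hypothetical Alpaca-format training record; the field names are the standard
# instruction/input/output layout, the content below is illustrative only.
sample = {
    "instruction": "Explain Red Rot in sugarcane in simple terms for Indian farmers.",
    "input": "",
    "output": (
        "Red Rot is a fungal disease of sugarcane. Leaves dry from the tip "
        "downwards and the inside of the stalk turns red with white patches. "
        "Remove and burn infected canes, avoid waterlogging, and plant "
        "disease-free setts of resistant varieties."
    ),
}
print(sample["output"])
```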