Rice Leaf Disease Classification using Resource-Efficient Deep Learning Models via Response-Based Knowledge Distillation
To view our LiteCNN4Rice Streamlit app, visit https://litecnn4rice.streamlit.app/
Welcome to the LiteCNNRice repository, where we demonstrate an advanced approach to rice leaf disease classification using resource-efficient deep learning models via Response-Based Knowledge Distillation (KD). This method enables the transfer of knowledge from a complex teacher model to a smaller student model, ensuring both high classification accuracy and a low computational footprint.
"Empowering precision agriculture through lightweight AI models."
This repository contains:
- A lightweight model trained using KD.
- Model analysis and performance evaluation.
- The dataset used for training, validation, and testing.
The associated paper titled "Rice Leaf Disease Classification using Resource-Efficient Deep Learning Models via Response-Based Knowledge Distillation" outlines the methodology and results.
- `model.h5`: The trained student model after distillation (112 KB).
- `model.json`: The model architecture in JSON format.
- `Model_Analysis.ipynb`: Jupyter notebook detailing the model's performance evaluation and metrics, such as accuracy and the confusion matrix.
- `train/`: Training dataset folder with labeled images.
- `val/`: Validation dataset folder.
- `test/`: Test dataset folder.
By utilizing Response-Based Knowledge Distillation, this project demonstrates how to train a smaller model while preserving or even enhancing classification performance. This approach significantly reduces computational requirements, making it suitable for deployment in resource-constrained environments like mobile devices or embedded systems.
The model is designed to classify various types of rice leaf diseases based on image inputs. Early disease detection can help farmers take preventative action and improve crop yields.
Ideal for edge devices and mobile apps.
This project explores resource-efficient deep learning by leveraging Response-Based Knowledge Distillation (KD) to classify rice leaf diseases. The aim is to create lightweight yet accurate models suitable for resource-constrained environments like mobile or IoT devices.
Knowledge distillation involves training a smaller student model to replicate the performance of a larger teacher model by learning from its predictions.
- Response-Based KD: Aligns teacher and student model outputs using metrics like Kullback-Leibler divergence.
- Feature-Based KD: Transfers intermediate representations from teacher to student.
- Relation-Based KD: Preserves pairwise relationships between data samples.
For this project, Response-Based KD is used due to its simplicity and effectiveness in CNN models.
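As a rough illustration (not the repository's actual training code), the response-based objective can be sketched in plain NumPy: teacher and student logits are softened with a temperature `T`, and the student is penalized by the Kullback-Leibler divergence between the two softened distributions. The logits and temperature below are made-up values chosen purely for demonstration.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature yields softer probabilities."""
    z = logits / temperature
    z = z - z.max()          # subtract max for numerical stability
    exp_z = np.exp(z)
    return exp_z / exp_z.sum()

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions."""
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Hypothetical logits over the four disease classes
teacher_logits = np.array([4.0, 1.0, 0.5, 0.2])
student_logits = np.array([3.0, 1.5, 0.7, 0.1])

T = 4.0  # assumed distillation temperature
kd_loss = kl_divergence(softmax(teacher_logits, T), softmax(student_logits, T))
```

Softening with a higher temperature exposes the teacher's relative confidence across the wrong classes ("dark knowledge"), which is the signal the student learns from.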
The dataset used is the Rice Leaf Disease Dataset from Mendeley Data, containing 5,932 images across four rice leaf diseases:
- Bacterial Blight
- Blast
- Brown Spot
- Tungro
| Disease Type | Percentage (%) | Training Set | Validation Set | Test Set |
|---|---|---|---|---|
| Bacterial Blight | 26.7% | 1267 | 158 | 159 |
| Blast | 24.3% | 1152 | 144 | 144 |
| Brown Spot | 27.0% | 1280 | 160 | 160 |
| Tungro | 22.0% | 1046 | 130 | 132 |
| Total | 100% | 4745 | 592 | 595 |
Sample images for each class (Bacterial Blight, Blast, Brown Spot, Tungro) are included in the dataset folders.
Response-Based KD enhances model efficiency by transferring soft predictions from a teacher model to a student model. The student uses these probabilistic outputs, alongside true labels, to generalize better.
- Efficiency: Reduces model size for faster inference.
- Deployment: Ideal for edge devices, IoT, and mobile platforms.
To run the code and evaluate the model, you will need the following:
- Python 3.10
- TensorFlow (version 2.15.0) / Keras
- NumPy, Pandas, Matplotlib, Seaborn (for data manipulation and visualization)
- OpenCV (for image processing)
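A minimal `requirements.txt` covering the dependencies above might look like the following; the TensorFlow pin comes from the list, while leaving the other packages unpinned is an assumption:

```
tensorflow==2.15.0
numpy
pandas
matplotlib
seaborn
opencv-python
```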
- Lightweight Models: Optimized for resource-constrained environments.
- High Accuracy: Maintains predictive performance.
- Flexible Deployment: Designed for mobile and IoT devices.
- Hinton et al. (2015): Distilling the Knowledge in a Neural Network
We would like to acknowledge the following contributors to this project:
| Contributor Name | Affiliation |
|---|---|
| Maimunul Karim Jisan | Department of Computer Science and Engineering, East Delta University, Chittagong, Bangladesh |
| Kazi Ekramul Hoque | School of Information and Communication Technology, Griffith University, Brisbane, Australia (Google Scholar) |
| Tanvir Azhar | Department of Computer Science and Engineering, East Delta University, Chittagong, Bangladesh |
| M.A. Hakim Newton | School of Information and Physical Sciences, The University of Newcastle, Australia (Google Scholar) |
You can install the required dependencies using the following:

```shell
pip install -r requirements.txt
```