
LiteCNN4Rice

Rice Leaf Disease Classification using Resource-Efficient Deep Learning Models via Response-Based Knowledge Distillation

To try our LiteCNN4Rice Streamlit app, visit https://litecnn4rice.streamlit.app/


🌾 Overview 🌾

Welcome to the LiteCNN4Rice repository, which demonstrates rice leaf disease classification using resource-efficient deep learning models trained via Response-Based Knowledge Distillation (KD). This method transfers knowledge from a complex teacher model to a smaller student model, retaining high classification accuracy at a low computational footprint.

"Empowering precision agriculture through lightweight AI models."

πŸ” This repository contains:

  • A lightweight model trained using KD.
  • Model analysis and performance evaluation.
  • The dataset used for training, validation, and testing.

The associated paper titled "Rice Leaf Disease Classification using Resource-Efficient Deep Learning Models via Response-Based Knowledge Distillation" outlines the methodology and results.


πŸ“ Contents πŸ“

1. Model Files 🧠

  • model.h5: The trained student model after distillation (112KB).
  • model.json: Architecture of the model in JSON format.
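The two files are combined to rebuild the student model in Keras: the JSON supplies the architecture, the H5 file the trained weights. The sketch below uses a tiny stand-in network (the layer sizes are illustrative, not the paper's architecture) so it runs without the repo files; with the real files you would read model.json from disk and call `restored.load_weights("model.h5")` instead of copying weights in memory.

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in for the distilled student (illustrative layer sizes only).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="softmax"),
])

# model.json stores the architecture; rebuild the network from that JSON.
restored = tf.keras.models.model_from_json(model.to_json())

# model.h5 stores the trained weights; here we copy them in memory.
# With the real files: restored.load_weights("model.h5")
restored.set_weights(model.get_weights())

probs = restored.predict(np.zeros((1, 8)), verbose=0)
print(probs.shape)  # (1, 4): one sample, four class probabilities
```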

2. Model Analysis πŸ“Š

  • Model_Analysis.ipynb: Jupyter notebook detailing the model's performance evaluation and metrics, such as accuracy and the confusion matrix.

3. Dataset 🌾

  • train/: Training dataset folder with labeled images.
  • val/: Validation dataset folder.
  • test/: Test dataset folder.

πŸ”‘ Key Features ✨

1. Resource Efficiency 🌍

By utilizing Response-Based Knowledge Distillation, this project demonstrates how to train a smaller model while preserving or even enhancing classification performance. This approach significantly reduces computational requirements, making it suitable for deployment in resource-constrained environments like mobile devices or embedded systems.

2. Classification Task 🦠

The model is designed to classify various types of rice leaf diseases based on image inputs. Early disease detection can help farmers take preventative action and improve crop yields.

3. Easy Deployment 🦠

Ideal for edge devices and mobile apps.


πŸ“– Introduction

This project explores resource-efficient deep learning by leveraging Response-Based Knowledge Distillation (KD) to classify rice leaf diseases. The aim is to create lightweight yet accurate models suitable for resource-constrained environments like mobile or IoT devices.


What is Knowledge Distillation?

Knowledge distillation involves training a smaller student model to replicate the performance of a larger teacher model by learning from its predictions.

Categories of KD:

  1. Response-Based KD: Aligns teacher and student model outputs using metrics like Kullback-Leibler divergence.
  2. Feature-Based KD: Transfers intermediate representations from teacher to student.
  3. Relation-Based KD: Preserves pairwise relationships between data samples.

For this project, Response-Based KD is used due to its simplicity and effectiveness in CNN models.
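The core of Response-Based KD is comparing temperature-softened output distributions from teacher and student. A minimal NumPy sketch of the softened softmax and the KL term (the logits and temperature below are made-up values for illustration):

```python
import numpy as np

def soften(logits, T):
    """Temperature-softened softmax: larger T spreads probability mass."""
    z = logits / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two categorical distributions."""
    return float(np.sum(p * np.log((p + eps) / (q + eps)), axis=-1))

# Made-up logits over the four disease classes.
teacher_logits = np.array([4.0, 1.0, 0.5, 0.2])
student_logits = np.array([3.0, 1.5, 0.4, 0.1])

T = 4.0  # distillation temperature (illustrative)
kd_term = kl_divergence(soften(teacher_logits, T), soften(student_logits, T))
print(round(kd_term, 6))
```

Minimizing this KL term pulls the student's softened outputs toward the teacher's.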


πŸ“ Proposed Architecture

[Figure: Proposed architecture diagram]


🌾 Dataset

The dataset used is the Rice Leaf Disease Dataset from Mendeley Data, containing 5,932 images across four rice leaf diseases:

  • Bacterial Blight
  • Blast
  • Brown Spot
  • Tungro

πŸ“Š Dataset Overview

| Disease Type | Percentage (%) | Training Set | Validation Set | Test Set |
| --- | --- | --- | --- | --- |
| Bacterial Blight | 26.7 | 1267 | 158 | 159 |
| Blast | 24.3 | 1152 | 144 | 144 |
| Brown Spot | 27.0 | 1280 | 160 | 160 |
| Tungro | 22.0 | 1046 | 130 | 132 |
| **Total** | 100 | 4745 | 592 | 595 |
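The class shares in the table follow directly from the split counts; a quick sanity check in plain Python:

```python
# Split counts (train, val, test) copied from the table above.
splits = {
    "Bacterial Blight": (1267, 158, 159),
    "Blast":            (1152, 144, 144),
    "Brown Spot":       (1280, 160, 160),
    "Tungro":           (1046, 130, 132),
}

total = sum(sum(counts) for counts in splits.values())
print(total)  # 5932, the full dataset size

for name, counts in splits.items():
    print(f"{name}: {100 * sum(counts) / total:.1f}%")
```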

πŸ“· Dataset Sample

Below are sample images from the dataset:

[Sample images: Bacterial Blight | Blast | Brown Spot | Tungro]

πŸš€ Response-Based Knowledge Distillation

Response-Based KD enhances model efficiency by transferring soft predictions from a teacher model to a student model. The student uses these probabilistic outputs, alongside true labels, to generalize better.
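Concretely, the student is typically trained on a weighted sum of a cross-entropy term on the hard labels and a temperature-scaled KL term on the teacher's soft predictions. A NumPy sketch, where the logits, temperature T, and weight alpha are illustrative values rather than the paper's settings:

```python
import numpy as np

def soften(logits, T):
    """Temperature-softened softmax over a 1-D logit vector."""
    z = logits / T
    e = np.exp(z - z.max())
    return e / e.sum()

# Made-up logits for one image over the four disease classes.
teacher_logits = np.array([5.0, 1.0, 0.5, 0.2])
student_logits = np.array([3.5, 1.2, 0.8, 0.3])
y_true = np.array([1.0, 0.0, 0.0, 0.0])  # one-hot hard label

T, alpha, eps = 4.0, 0.1, 1e-12  # illustrative hyperparameters

p_teacher = soften(teacher_logits, T)
p_student = soften(student_logits, T)

# Soft-target term: KL between softened teacher and student outputs.
kd_loss = float(np.sum(p_teacher * np.log((p_teacher + eps) / (p_student + eps))))
# Hard-label term: ordinary cross-entropy at temperature 1.
ce_loss = float(-np.sum(y_true * np.log(soften(student_logits, 1.0) + eps)))

# T**2 rescales the soft-target gradients, following the standard KD formulation.
total_loss = alpha * ce_loss + (1 - alpha) * T**2 * kd_loss
print(round(total_loss, 6))
```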

Why KD?

  • Efficiency: Reduces model size for faster inference.
  • Deployment: Ideal for edge devices, IoT, and mobile platforms.

πŸš€ Getting Started πŸš€

Prerequisites πŸ› οΈ

To run the code and evaluate the model, you will need the following:

  • Python 3.10
  • TensorFlow 2.15.0 / Keras
  • NumPy, Pandas, Matplotlib, Seaborn (for data manipulation and visualization)
  • OpenCV (for image processing)

πŸ“Œ Key Features

  • Lightweight Models: Optimized for resource-constrained environments.
  • High Accuracy: Maintains predictive performance.
  • Flexible Deployment: Designed for mobile and IoT devices.

πŸ“š References


Contributors

We would like to acknowledge the following contributors to this project:

Contributor Name Affiliation Photo
Maimunul Karim Jisan Department of Computer Science and Engineering, East Delta University, Chittagong, Bangladesh
Kazi Ekramul Hoque School of Information and Communication Technology, Griffith University, Brisbane, Australia Google Scholar
Tanvir Azhar Department of Computer Science and Engineering, East Delta University, Chittagong, Bangladesh
M.A. Hakim Newton School of Information and Physical Sciences, The University of Newcastle, Australia Google Scholar

You can install the required dependencies using the following:

pip install -r requirements.txt

About

This repository contains only the paper's best-performing model (the .h5 and .json files) for result visualization and analysis.
