Butterfly Classification Using Deep Learning
Introduction:
Embarking on the enchanting exploration of "Butterfly Classification using Deep Learning," our project harnesses the MobileNetV2 architecture within the TensorFlow framework to build an image-classification model. The network, trained on a diverse dataset of butterfly images, achieves 98% accuracy on the training set and a robust 95% accuracy on the test set. Beyond the model itself, we extend the project's accessibility by developing a Gradio-based machine learning app. This intuitive application lets users upload butterfly images and receive swift, accurate species predictions. In this convergence of technology and nature, our project not only showcases the capabilities of deep learning in image recognition but also opens avenues for broader applications in ecological research and education.
Dataset
For this project, I collected the dataset from "https://www.kaggle.com/datasets/gpiosenka/butterfly-images40-species", which contains Train, Test, and Validation sets for 100 butterfly and moth species. All images are 224 x 224 x 3 in JPG format.
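As a minimal sketch of how such a folder-per-species dataset can be loaded in TensorFlow, the snippet below uses `tf.keras.utils.image_dataset_from_directory`. The directory layout and the two species names are illustrative stand-ins for the real Kaggle download (which has 100 class folders), and the random images exist only to keep the example self-contained:

```python
import os
import numpy as np
import tensorflow as tf

# Hypothetical layout mirroring the Kaggle dataset: one folder per species.
# We fabricate two tiny classes of random 224x224 JPGs just for the demo.
root = "butterfly_demo/train"
for species in ["ADONIS", "MONARCH"]:
    os.makedirs(os.path.join(root, species), exist_ok=True)
    for i in range(2):
        img = (np.random.rand(224, 224, 3) * 255).astype("uint8")
        tf.keras.utils.save_img(os.path.join(root, species, f"{i}.jpg"), img)

# Build a tf.data pipeline; labels are inferred from the folder names.
train_ds = tf.keras.utils.image_dataset_from_directory(
    root,
    image_size=(224, 224),
    batch_size=2,
)

print(train_ds.class_names)  # folder names become the class labels
```

With the real dataset, pointing `root` at the downloaded `train` directory gives a ready-to-use pipeline, and the same call on the `test` and `valid` folders yields the evaluation splits.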
Model Architecture
I applied MobileNetV2 to train my model, employing the Adam optimizer with a conservative learning rate of 0.0001. This is the model summary:
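A minimal sketch of how such a model might be assembled is shown below. Only MobileNetV2 and Adam at a 0.0001 learning rate are stated above; the single dense softmax head, the categorical cross-entropy loss, and `weights=None` (use `"imagenet"` in practice) are my assumptions for illustration:

```python
import tensorflow as tf

NUM_CLASSES = 100  # 100 butterfly/moth species in the dataset

# MobileNetV2 backbone without its original classifier head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3),
    include_top=False,
    weights=None,  # "imagenet" in practice; None keeps this sketch offline
    pooling="avg",
)

# New classification head for the butterfly species.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.0001),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

The low learning rate is a common choice when fine-tuning a pre-trained backbone, since large updates would quickly destroy the transferred features.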
The following shows the model performance:
I have launched a Gradio-based ML app for this model; check it out here: "https://github.com/Nishat5349/ML-app-for-Classification"
ML-app-for-Classification
The project involves creating a Gradio app for butterfly classification. It utilizes a pre-trained butterfly classification model (model.h5) and class labels stored in a CSV file (classLabel1.csv). Users can upload images of butterflies to the app, and it provides real-time predictions for the butterfly's class label. The code integrates Gradio, TensorFlow, and pandas to create an interactive and user-friendly interface for butterfly classification.
You can check it here "https://github.com/Nishat5349/butterflyClassification" for a clear concept of my trained model "model.h5".
The source code combines the Gradio library, TensorFlow, and pandas to create a user-friendly interface for butterfly classification. The provided Gradio app enables users to interact with the pre-trained model and obtain predictions with ease.
Here's a step-by-step description of how the code sets up a Gradio app for butterfly classification:
- Importing Libraries:
  - The code starts by importing the necessary libraries:
    - `gradio`: used for creating user interfaces for machine learning models.
    - `tensorflow`: used for loading the pre-trained model.
    - `pandas`: used for reading class labels from a CSV file.
    - `numpy`: used for numerical operations.
```python
import gradio as gr
import tensorflow as tf
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing import image
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input
import pandas as pd
import numpy as np
```
- Loading the Trained Model:
  - The code loads the pre-trained butterfly classification model (`model.h5`) using TensorFlow's `load_model` function.

```python
model_path = '/content/model.h5'
model = load_model(model_path)
```
- Loading Class Labels from CSV:
  - The code reads class labels from a CSV file (`classLabel1.csv`) using the `pd.read_csv` function and converts them to a Python list.

```python
csv_file_path = '/content/classLabel1.csv'
df_class_labels = pd.read_csv(csv_file_path)
class_labels = df_class_labels['ClassLabel'].tolist()
```
- Defining the Prediction Function:
  - The `predict_butterfly` function is defined to take an image as input, preprocess it, make predictions using the loaded model, and return the predicted class label.

```python
def predict_butterfly(img):
    # Preprocess the image
    img_array = image.img_to_array(img)
    img_array = np.expand_dims(img_array, axis=0)
    img_array = preprocess_input(img_array)

    # Make predictions using the loaded model
    predictions = model.predict(img_array)

    # Custom decoding based on my model's classes
    top_prediction_index = np.argmax(predictions)
    top_prediction = class_labels[top_prediction_index]
    return top_prediction
```
- Creating Gradio Interface:
  - The `gr.Interface` class is used to create the Gradio interface:
    - `fn`: specifies the prediction function.
    - `inputs`: set to `gr.Image()` to accept an image as input.
    - `outputs`: set to `gr.Textbox()` to display the predicted class label as text.
    - `live`: set to `True` for live updates.
    - `title`: sets the title of the Gradio interface.
    - `description`: provides a description of the interface.

```python
iface = gr.Interface(
    fn=predict_butterfly,
    inputs=gr.Image(),
    outputs=gr.Textbox(),
    live=True,
    title='Butterfly Classification App',
    description='Upload an image of a butterfly to get the predicted class label.'
)
```
- Launching the Gradio App:
  - The `iface.launch()` method is called to launch the Gradio app, providing a link to the app.

```python
iface.launch()
```
- Running the Code:
- After running the code, users can click on the provided link to open the Gradio app in a new tab.
- The app allows users to upload an image of a butterfly, and it provides real-time predictions for the butterfly's class label.
This is the interface:
Contributor
NISHAT TASNIM (nishattasnim296318@gmail.com)