import json
import os

import gradio as gr
from fastai.vision.all import *

# Load the category-to-name mapping
with open('cat_to_name.json', 'r') as f:
    cat_to_name = json.load(f)

learn = load_learner('flower_classifier.pkl')
labels = learn.dls.vocab


def predict(img):
    """Classify a flower image and return {species name: probability}."""
    img = PILImage.create(img)
    _, _, probs = learn.predict(img)
    predictions = {labels[i]: float(probs[i]) for i in range(len(labels))}
    predictions_with_names = {
        cat_to_name[str(label)]: prob for label, prob in predictions.items()
    }
    return predictions_with_names


title = "<h1>Flower Classifier</h1>"

description = """
<p>An introductory project that uses fastai for transfer learning on an image
classification task, Gradio to demo it as a web app, and HuggingFace Spaces for
deployment. I used the ResNet34 architecture on the Oxford Flowers 102 dataset,
with a random 80%/20% train/test split, input resizing to 224x224x3, batch data
augmentation, a learning rate found by <code>lr_find()</code>, only 2 training
epochs, and the rest of the hyperparameters as fastai defaults. As someone who
learned neural networks from the bottom up with a strong theoretical foundation,
I found it fun to see how "easy" ML can be for simpler tasks: the model achieves
91% test accuracy, while a random guess would yield 1% accuracy!</p>

<p>Feel free to browse the example images below (10 are from the test set, and
2 are my own out-of-distribution images) or upload your own image of a flower.
The model may have overfit to the training distribution: it doesn't generalize
well to images with cluttered backgrounds (see my dahlia photo and my tulip
photo), and it reports 100% certainty on some correct guesses from the test
set.</p>

<p>The Oxford Flowers 102 dataset, created by the University of Oxford's Visual
Geometry Group, consists of 8,189 images spanning 102 flower species, designed
to challenge fine-grained image classification models. With varying lighting,
backgrounds, and an uneven class distribution, it serves as a benchmark for
testing model robustness and optimizing classification accuracy, making it
popular for transfer learning experiments with models like VGG16, ResNet, and
EfficientNet.</p>
"""

labels_table = """
<h3>Classes included in training:</h3>
<table>
  <tr><td>alpine sea holly</td><td>anthurium</td><td>artichoke</td><td>azalea</td><td>ball moss</td><td>balloon flower</td></tr>
  <tr><td>barbeton daisy</td><td>bearded iris</td><td>bee balm</td><td>bird of paradise</td><td>bishop of llandaff</td><td>black-eyed susan</td></tr>
  <tr><td>blackberry lily</td><td>blanket flower</td><td>bolero deep blue</td><td>bougainvillea</td><td>bromelia</td><td>buttercup</td></tr>
  <tr><td>californian poppy</td><td>camellia</td><td>canna lily</td><td>canterbury bells</td><td>cape flower</td><td>carnation</td></tr>
  <tr><td>cautleya spicata</td><td>clematis</td><td>columbine</td><td>colt's foot</td><td>common dandelion</td><td>corn poppy</td></tr>
  <tr><td>cyclamen</td><td>daffodil</td><td>desert-rose</td><td>english marigold</td><td>fire lily</td><td>foxglove</td></tr>
  <tr><td>frangipani</td><td>fritillary</td><td>garden phlox</td><td>gaura</td><td>gazania</td><td>geranium</td></tr>
  <tr><td>giant white arum lily</td><td>globe thistle</td><td>globe-flower</td><td>grape hyacinth</td><td>great masterwort</td><td>hard-leaved pocket orchid</td></tr>
  <tr><td>hibiscus</td><td>hippeastrum</td><td>japanese anemone</td><td>king protea</td><td>lenten rose</td><td>lotus</td></tr>
  <tr><td>love in the mist</td><td>magnolia</td><td>mallow</td><td>marigold</td><td>mexican aster</td><td>mexican petunia</td></tr>
  <tr><td>monkshood</td><td>moon orchid</td><td>morning glory</td><td>orange dahlia</td><td>osteospermum</td><td>oxeye daisy</td></tr>
  <tr><td>passion flower</td><td>pelargonium</td><td>peruvian lily</td><td>petunia</td><td>pincushion flower</td><td>pink primrose</td></tr>
  <tr><td>pink-yellow dahlia</td><td>poinsettia</td><td>primula</td><td>prince of wales feathers</td><td>purple coneflower</td><td>red ginger</td></tr>
  <tr><td>rose</td><td>ruby-lipped cattleya</td><td>siam tulip</td><td>silverbush</td><td>snapdragon</td><td>spear thistle</td></tr>
  <tr><td>spring crocus</td><td>stemless gentian</td><td>sunflower</td><td>sweet pea</td><td>sweet william</td><td>sword lily</td></tr>
  <tr><td>thorn apple</td><td>tiger lily</td><td>toad lily</td><td>tree mallow</td><td>tree poppy</td><td>trumpet creeper</td></tr>
  <tr><td>wallflower</td><td>water lily</td><td>watercress</td><td>wild pansy</td><td>windflower</td><td>yellow iris</td></tr>
</table>
"""

# Make examples a list of all image filenames in the examples folder
examples = ["examples/" + filename for filename in os.listdir("examples")]

with gr.Blocks() as demo:
    gr.HTML(title)
    gr.HTML(description)
    gr.Interface(
        fn=predict,
        inputs=gr.Image(type="pil"),
        outputs=gr.Label(num_top_classes=3),
        examples=examples,
    )
    gr.HTML(labels_table)

if __name__ == "__main__":
    demo.launch()