---
title: iris-backend
emoji: 🌸
colorFrom: green
colorTo: blue
sdk: docker
pinned: false
license: mit
---

# 🌸 Iris Flower Classifier (Flask API)

A lightweight **Flask API** hosted on [Hugging Face Spaces](https://huggingface.co/spaces) that predicts the species of an Iris flower using a **Scikit-learn Logistic Regression model**.  

The **frontend UI** is hosted separately on GitHub Pages:  
👉 [Try it here](https://lovnishverma.github.io/iris-front/)

---

## 🚀 How it works

1. User enters flower measurements (sepal & petal length/width) on the frontend.  
2. Frontend sends data as JSON to this backend API (`/predict` endpoint).  
3. The Flask app loads a pickled Scikit-learn model and predicts the species.  
4. Response is returned as JSON and displayed on the frontend.  

---

## 🔧 Tech Stack

- **Python 3.9+**
- **Flask** – Web framework
- **Flask-CORS** – Enables cross-origin requests from the GitHub Pages frontend
- **Scikit-learn** – ML model (Logistic Regression)
- **Pickle** – Model persistence
- **Hugging Face Spaces (Docker SDK)** – Deployment
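A matching `requirements.txt` might look like the following (illustrative and unpinned, not the Space's actual file; in practice the scikit-learn version should match the one used to train `model.pkl`, since pickles are not guaranteed portable across versions):

```text
flask
flask-cors
scikit-learn
```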

---

## 📂 File Structure

```
├── app.py            # Flask API
├── model.pkl         # Pre-trained ML model
├── requirements.txt  # Python dependencies
├── Dockerfile        # Custom Space runtime
└── README.md         # Project documentation
```
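For a Docker-SDK Space, the `Dockerfile` could be as small as this sketch (hypothetical, not the repo's actual file; it assumes `app.py` starts the server itself on port 7860, the port Spaces exposes):

```dockerfile
# Hypothetical Dockerfile for the Space; assumes app.py calls
# app.run(host="0.0.0.0", port=7860) when executed directly.
FROM python:3.9-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

EXPOSE 7860
CMD ["python", "app.py"]
```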

---

## 📡 API Usage

### Endpoint
`POST /predict`

### Request (JSON)
```json
{
  "sepal_length": 5.1,
  "sepal_width": 3.5,
  "petal_length": 1.4,
  "petal_width": 0.2
}
```

### Response (JSON)

```json
{
  "prediction": "setosa"
}
```
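As a quick check, the endpoint can be called from Python with only the standard library. The localhost URL assumes the local Docker setup described below; substitute your Space's URL once deployed.

```python
# Minimal client for the /predict endpoint, standard library only.
import json
import urllib.error
import urllib.request

def predict(url, measurements):
    """POST flower measurements as JSON and return the decoded response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(measurements).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

sample = {
    "sepal_length": 5.1,
    "sepal_width": 3.5,
    "petal_length": 1.4,
    "petal_width": 0.2,
}
try:
    print(predict("http://localhost:7860/predict", sample))
except urllib.error.URLError:
    print("Backend not reachable -- is it running locally?")
```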

---

## πŸ› οΈ Run Locally

Clone and run with Docker:

```bash
git clone https://huggingface.co/spaces/<your-username>/iris-backend
cd iris-backend
docker build -t iris-backend .
docker run -p 7860:7860 iris-backend
```

The API now listens at [http://localhost:7860](http://localhost:7860). Note that `/predict` accepts only `POST` requests, so opening it directly in a browser returns a 405; send JSON with a client such as `curl` instead.

---

## 🌐 Frontend

The frontend is hosted on GitHub Pages:
👉 [Iris Classifier Frontend](https://lovnishverma.github.io/iris-front/)

Frontend calls the backend API hosted here on Hugging Face Spaces.

---

## 📸 Demo Screenshot

![Demo](https://cdn-uploads.huggingface.co/production/uploads/6474405f90330355db146c76/w60iyWrc6vmicBW8e0SEJ.png)

---

## 📜 License

MIT License – free to use and modify.

---