---
title: iris-backend
emoji: 🌸
colorFrom: green
colorTo: blue
sdk: docker
pinned: false
license: mit
---
# 🌸 Iris Flower Classifier (Flask API)
A lightweight **Flask API** hosted on [Hugging Face Spaces](https://huggingface.co/spaces) that predicts the species of an Iris flower using a **Scikit-learn Logistic Regression model**.
The **frontend UI** is hosted separately on GitHub Pages:
[Try it here](https://lovnishverma.github.io/iris-front/)
---
## How it works
1. User enters flower measurements (sepal & petal length/width) on the frontend.
2. Frontend sends data as JSON to this backend API (`/predict` endpoint).
3. The Flask app loads a pickled Scikit-learn model and predicts the species.
4. Response is returned as JSON and displayed on the frontend.
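The flow above can be sketched as a minimal Flask app. The real `app.py` loads a pre-trained model from `model.pkl`; to keep this sketch self-contained, it trains a small Logistic Regression on scikit-learn's bundled Iris dataset instead, so treat it as an illustration rather than the Space's actual code:

```python
# Minimal sketch of the backend (illustrative; the real app loads model.pkl).
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

try:
    from flask_cors import CORS  # lets the GitHub Pages frontend call this API
except ImportError:              # flask-cors may be absent in a local sandbox
    CORS = lambda app: app

app = Flask(__name__)
CORS(app)

# Stand-in for pickle.load(open("model.pkl", "rb")): train on bundled data.
iris = load_iris()
model = LogisticRegression(max_iter=200).fit(iris.data, iris.target)

@app.route("/predict", methods=["POST"])
def predict():
    data = request.get_json()
    features = [[
        data["sepal_length"], data["sepal_width"],
        data["petal_length"], data["petal_width"],
    ]]
    species = str(iris.target_names[model.predict(features)[0]])
    return jsonify({"prediction": species})

# On a Space the server would listen on port 7860:
# app.run(host="0.0.0.0", port=7860)
```

The feature order must match what the model was trained on: sepal length, sepal width, petal length, petal width.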
---
## 🔧 Tech Stack
- **Python 3.9+**
- **Flask** – Web framework
- **Flask-CORS** – Allow frontend/backend communication
- **Scikit-learn** – ML model (Logistic Regression)
- **Pickle** – Model persistence
- **Hugging Face Spaces (Docker SDK)** – Deployment
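A `requirements.txt` matching this stack might look like the following (a sketch, not the Space's actual file; `pickle` ships with Python, so it needs no entry):

```text
flask
flask-cors
scikit-learn
```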
---
## File Structure
```
├── app.py              # Flask API
├── model.pkl           # Pre-trained ML model
├── requirements.txt    # Python dependencies
├── Dockerfile          # Custom Space runtime
└── README.md           # Project documentation
```
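For this layout, a minimal Dockerfile might look like the sketch below (illustrative; the Space's actual Dockerfile may differ, e.g. in base image or start command):

```dockerfile
FROM python:3.9-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Hugging Face Spaces routes traffic to port 7860
EXPOSE 7860
CMD ["python", "app.py"]
```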
---
## 📡 API Usage
### Endpoint
`POST /predict`
### Request (JSON)
```json
{
"sepal_length": 5.1,
"sepal_width": 3.5,
"petal_length": 1.4,
"petal_width": 0.2
}
```
### Response (JSON)
```json
{
"prediction": "setosa"
}
```
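With the backend running (locally or on the Space), the endpoint can be exercised with `curl`; the localhost URL below assumes a local container and should be swapped for your Space's URL when deployed:

```bash
curl -X POST http://localhost:7860/predict \
  -H "Content-Type: application/json" \
  -d '{"sepal_length": 5.1, "sepal_width": 3.5, "petal_length": 1.4, "petal_width": 0.2}'
```

For these measurements the API responds with the JSON shown above, e.g. `{"prediction": "setosa"}`.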
---
## 🛠️ Run Locally
Clone and run with Docker:
```bash
git clone https://huggingface.co/spaces/<your-username>/iris-backend
cd iris-backend
docker build -t iris-backend .
docker run -p 7860:7860 iris-backend
```
The API now listens on `http://localhost:7860`. Note that `/predict` only accepts `POST` requests, so opening it directly in a browser returns *405 Method Not Allowed*; use the frontend or a tool like `curl` instead.
---
## Frontend
The frontend is hosted on GitHub Pages:
[Iris Classifier Frontend](https://lovnishverma.github.io/iris-front/)
The frontend calls the backend API hosted here on Hugging Face Spaces.
---
## 📸 Demo Screenshot

---
## License
MIT License – free to use and modify.
---