devrajsinh2012 committed
Commit 0e88913 · 1 Parent(s): bfd6172

docs: rewrite README with full project docs, architecture, team & contributors


- Full architecture diagram (3-pipeline ensemble)
- Complete project structure, tech stack, setup guide
- Deployment instructions for HF Spaces + Vercel
- Team section: Devrajsinh Gohil, Jay Nasit, Dr. Om Prakash Suthar

fix: calibration overlay is now transparent so camera feed stays visible
- UI moved to compact bottom card so user can see their hand during calibration

Files changed (2)
  1. README.md +210 -96
  2. frontend/src/components/Calibration.tsx +67 -55
README.md CHANGED
@@ -1,154 +1,268 @@
 
- # SanketSetu
-
- A real-time sign language recognition system using machine learning and computer vision.
-
- ## Project Structure
-
  ```
- ├── backend/                    # FastAPI backend server
- │   ├── app/                    # Main application code
- │   │   ├── inference/          # ML inference pipelines
- │   │   └── models/             # Model loading and management
- │   └── tests/                  # Backend tests
- ├── frontend/                   # React + TypeScript frontend
- │   └── src/
- │       ├── components/         # React components
- │       ├── hooks/              # Custom React hooks
- │       └── lib/                # Utility libraries
- ├── CNN_Autoencoder_LightGBM/   # CNN Autoencoder + LightGBM model
- ├── CNN_PreTrained/             # CNN + SVM model
- └── Mediapipe_XGBoost/          # MediaPipe + XGBoost model
  ```

- ## Features
-
- - Real-time sign language gesture recognition
- - Multiple ML model ensemble approach
- - WebSocket-based real-time communication
- - MediaPipe hand landmark tracking
- - Interactive webcam feed with visual feedback
- - Prediction confidence display
-
- ## Tech Stack
-
- ### Backend
- - FastAPI
- - Python 3.x
- - PyTorch
- - LightGBM
- - XGBoost
- - MediaPipe
-
- ### Frontend
- - React
- - TypeScript
- - Vite
- - TailwindCSS
-
- ## Getting Started
-
  ### Prerequisites
- - Python 3.8+
- - Node.js 16+
  - npm or yarn

- ### Backend Setup

  ```bash
  cd backend
  pip install -r requirements.txt
  python -m app.main
  ```

- ### Frontend Setup

  ```bash
  cd frontend
  npm install
  npm run dev
  ```

- ## Development
-
- Run the development servers:
-
- ```bash
- # Start both frontend and backend
  .\start.ps1
  ```

- ## Deployment
-
- ### Backend — Hugging Face Spaces (Docker SDK)
-
- The backend is deployed as a [Hugging Face Space](https://huggingface.co/spaces) using the Docker SDK.
-
- **Steps to create a new Space and push:**
-
- 1. **Create the Space** on [huggingface.co/new-space](https://huggingface.co/new-space)
-    - SDK: **Docker**
-    - Visibility: Public (or Private)
-    - Note your `username` and `space-name`
-
- 2. **Clone the Space repo and push your code:**
-    ```bash
-    # Add HF Space as a remote (from repo root)
-    git remote add space https://huggingface.co/spaces/devrajsinh2012/Sanket-Setu
-
-    git push space main
-    ```
-    HF Spaces will automatically build the Docker image and start the container.
-
- 3. **Set Space Secrets** (via HF Space → Settings → Repository secrets):
-    | Secret | Example value |
-    |--------|---------------|
-    | `CORS_ORIGINS` | `https://sanketsetu.vercel.app,http://localhost:5173` |
-    | `PIPELINE_MODE` | `ensemble` |
-    | `CONFIDENCE_THRESHOLD` | `0.70` |
-
- 4. **Update the frontend** — set the `VITE_WS_URL` Vercel environment variable:
-    ```
-    wss://devrajsinh2012-sanket-setu.hf.space
-    ```
-    In Vercel dashboard: **Settings → Environment Variables → VITE_WS_URL**
-
- **Space URL format:**
- - HTTPS API: `https://devrajsinh2012-sanket-setu.hf.space`
- - WebSocket: `wss://devrajsinh2012-sanket-setu.hf.space/ws/landmarks`
- - Health: `https://devrajsinh2012-sanket-setu.hf.space/health`
-
- ### Frontend — Vercel

  ```bash
- cd frontend
- # deploy via Vercel CLI or connect the GitHub repo in Vercel dashboard
  ```

- Set the `VITE_WS_URL` environment variable in Vercel to the HF Space WebSocket URL above.
-
- ## Docker (local)
-
- Build and run using Docker locally:
-
- ```bash
- docker build -t sanketsetu .
- docker run -p 7860:7860 sanketsetu
- ```
-
- ## Testing
-
- Run backend tests:
-
- ```bash
- cd backend
- pytest
- ```
-
- ## License
-
- All rights reserved.
-
- ## Author
-
- Devrajsinh Gohil (devrajsinh2012)
 
+ <div align="center">
+
+ # 🤟 SanketSetu | સંકેત-સેતુ
+
+ **Real-time Gujarati Sign Language Recognition System**
+
+ [![Live Demo](https://img.shields.io/badge/Live%20Demo-Vercel-black?style=for-the-badge&logo=vercel)](https://sanket-setu.vercel.app)
+ [![Backend](https://img.shields.io/badge/Backend-HuggingFace%20Spaces-yellow?style=for-the-badge&logo=huggingface)](https://huggingface.co/spaces/devrajsinh2012/Sanket-Setu)
+ [![License](https://img.shields.io/badge/License-All%20Rights%20Reserved-red?style=for-the-badge)](#license)
+
+ </div>
+
+ ---
+
+ ## 📖 About
+
+ **SanketSetu** (Sanskrit: *Sanket* = gesture/sign, *Setu* = bridge) is a production-grade, real-time **Gujarati Sign Language (GSL)** recognition system. It bridges the communication gap between the hearing-impaired community and the broader public by translating hand gestures corresponding to **34 Gujarati consonants** (ક–જ્ઞ) into text — all in real time, directly in the browser.
+
+ The system uses your device camera, processes hand landmarks locally via **MediaPipe**, and sends them over a **WebSocket** connection to a machine-learning backend that classifies the gesture using a **3-pipeline ensemble** of models.
+
+ ---
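To make the "21 landmarks → 63 floats" framing above concrete, here is a minimal, hypothetical sketch of flattening one MediaPipe hand frame into the vector the backend consumes. The field names (`landmarks`, `ts`) and the JSON wire format are illustrative assumptions, not the project's actual schema.

```python
import json

def flatten_landmarks(landmarks: list[tuple[float, float, float]]) -> list[float]:
    """Flatten 21 (x, y, z) MediaPipe hand landmarks into a 63-float vector."""
    if len(landmarks) != 21:
        raise ValueError("expected exactly 21 hand landmarks")
    return [coord for point in landmarks for coord in point]

# Dummy frame standing in for one webcam frame's landmarks
frame = [(0.1 * i, 0.2 * i, 0.0) for i in range(21)]
vector = flatten_landmarks(frame)

# Hypothetical WebSocket payload shape ("landmarks"/"ts" are illustrative names)
payload = json.dumps({"landmarks": vector, "ts": 0})
```

Sending landmarks rather than raw video keeps the per-frame payload to roughly 63 floats, which is what makes the real-time WebSocket approach lightweight.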
+
+ ## ✨ Features
+
+ - **Real-time gesture detection** — sub-100 ms end-to-end latency
+ - **3-pipeline ensemble inference** — XGBoost → Autoencoder+LightGBM → CNN+SVM, invoked in order of confidence
+ - **34 Gujarati sign classes** — complete consonant alphabet (ક, ખ, ગ … ક્ષ, જ્ઞ)
+ - **WebSocket streaming** — live bidirectional communication between browser and backend
+ - **MediaPipe hand tracking** — 21 landmark coordinates extracted client-side (no raw video sent to server)
+ - **Onboarding wizard** — animated step-by-step guide for first-time users
+ - **Calibration screen** — transparent overlay keeps camera feed fully visible while detecting hand readiness
+ - **Landmark canvas overlay** — live 21-point skeleton drawn over the webcam feed
+ - **Prediction HUD** — displays recognised sign, confidence bar, latency, and prediction history
+ - **Low-bandwidth mode** — auto-throttles to 5 fps when latency is high
+ - **Docker-ready backend** — deployable on Hugging Face Spaces in one push
+
+ ---
+
+ ## 🏗️ System Architecture

  ```
+ Browser (React + TypeScript)
+   │
+   ├─ MediaPipe Hands (WASM)        ← extracts 21 hand landmarks (63 floats) locally
+   ├─ WebcamFeed + LandmarkCanvas
+   ├─ Calibration / Onboarding UI
+   │
+   └─ WebSocket (wss://)
+          │
+          ▼
+ FastAPI Backend (Python)
+   │
+   ├─ Pipeline A — XGBoost (63 landmarks → 34 classes)         ← primary, fastest
+   │     └─ if confidence < 0.70 ↓
+   ├─ Pipeline B — Autoencoder (63→16) + LightGBM              ← secondary
+   │     └─ if confidence < 0.60 ↓
+   └─ Pipeline C — ResNet50 CNN (128×128 image) + SVM          ← tertiary
+         └─ Ensemble weighted average → final prediction
  ```
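The confidence cascade shown in the diagram can be expressed in a few lines. This is an illustrative sketch with stubbed pipelines, not the project's `ensemble.py`: the thresholds (0.70, 0.60) mirror the diagram, and the final weighted-average step is omitted for brevity.

```python
from typing import Callable

# Each pipeline is stubbed as a callable returning (label, confidence).
Pipeline = Callable[[list[float]], tuple[str, float]]

def cascade(
    features: list[float],
    pipeline_a: Pipeline,
    pipeline_b: Pipeline,
    pipeline_c: Pipeline,
    primary_threshold: float = 0.70,
    secondary_threshold: float = 0.60,
) -> tuple[str, float]:
    label, conf = pipeline_a(features)   # fastest model tried first
    if conf >= primary_threshold:
        return label, conf
    label, conf = pipeline_b(features)   # fall back to Autoencoder + LightGBM
    if conf >= secondary_threshold:
        return label, conf
    return pipeline_c(features)          # last resort: CNN + SVM

# Example with stubbed pipelines:
result = cascade(
    [0.0] * 63,
    pipeline_a=lambda f: ("ક", 0.55),   # below 0.70 → falls through
    pipeline_b=lambda f: ("ખ", 0.65),   # above 0.60 → accepted
    pipeline_c=lambda f: ("ગ", 0.90),
)
```

Trying the cheapest model first and only escalating on low confidence is what keeps the common case fast while still covering hard frames.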

+ ---

+ ## 📁 Project Structure
+
+ ```
+ SanketSetu/
+ ├── backend/
+ │   ├── app/
+ │   │   ├── main.py              # FastAPI entry-point, WebSocket + REST
+ │   │   ├── config.py            # Settings (thresholds, model paths, env vars)
+ │   │   ├── schemas.py           # Pydantic request/response models
+ │   │   ├── inference/
+ │   │   │   ├── pipeline_a.py    # XGBoost inference (63 MediaPipe landmarks)
+ │   │   │   ├── pipeline_b.py    # Autoencoder encoder + LightGBM
+ │   │   │   ├── pipeline_c.py    # ResNet CNN + SVM (image-based)
+ │   │   │   └── ensemble.py      # Confidence-weighted ensemble logic
+ │   │   └── models/
+ │   │       ├── loader.py        # Singleton model loader
+ │   │       └── label_map.py     # Index 0–33 → Gujarati character
+ │   ├── tests/                   # Pytest test suite
+ │   ├── requirements.txt
+ │   └── requirements-dev.txt
+ │
+ ├── frontend/
+ │   ├── src/
+ │   │   ├── App.tsx                  # App shell, stage machine (onboarding → calibration → running)
+ │   │   ├── components/
+ │   │   │   ├── WebcamFeed.tsx       # Webcam stream + canvas overlay
+ │   │   │   ├── LandmarkCanvas.tsx   # Draws 21-point hand skeleton
+ │   │   │   ├── PredictionHUD.tsx    # Live sign, confidence bar, latency, history
+ │   │   │   ├── OnboardingGuide.tsx  # Animated intro wizard
+ │   │   │   └── Calibration.tsx      # Transparent hand-detection calibration card
+ │   │   ├── hooks/
+ │   │   │   ├── useWebSocket.ts      # WS connection, send/receive
+ │   │   │   ├── useMediaPipe.ts      # MediaPipe Hands JS integration
+ │   │   │   └── useWebcam.ts         # Camera permissions + stream
+ │   │   └── lib/
+ │   │       └── landmarkUtils.ts     # Landmark normalisation helpers
+ │   ├── .env.production              # VITE_WS_URL for Vercel build
+ │   ├── vite.config.ts
+ │   └── package.json
+ │
+ ├── CNN_Autoencoder_LightGBM/   # Autoencoder + LightGBM model weights
+ ├── CNN_PreTrained/             # ResNet CNN + SVM model weights
+ ├── Mediapipe_XGBoost/          # XGBoost model weights
+ ├── Dockerfile                  # Multi-stage Docker build for HF Spaces
+ └── start.ps1                   # One-command local dev launcher (Windows)
+ ```

+ ---

+ ## 🛠️ Tech Stack
+
+ | Layer | Technology |
+ |---|---|
+ | **Frontend** | React 18, TypeScript, Vite, TailwindCSS, Framer Motion |
+ | **Hand Tracking** | MediaPipe Hands (browser WASM) |
+ | **Real-time Comm.** | WebSocket (native browser API) |
+ | **Backend** | FastAPI, Python 3.10+ |
+ | **ML — Pipeline A** | XGBoost (scikit-learn API) |
+ | **ML — Pipeline B** | Keras/TensorFlow Autoencoder + LightGBM |
+ | **ML — Pipeline C** | PyTorch ResNet50 CNN + scikit-learn SVM |
+ | **Deployment** | Hugging Face Spaces (Docker SDK) + Vercel |

+ ---
+
+ ## 🚀 Getting Started

  ### Prerequisites
+
+ - Python 3.10+
+ - Node.js 18+
  - npm or yarn

+ ### 1. Clone the repository
+
+ ```bash
+ git clone https://github.com/devrajsinh2012/Sanket-Setu.git
+ cd Sanket-Setu
+ ```
+
+ ### 2. Backend Setup

  ```bash
  cd backend
  pip install -r requirements.txt
+ # Starts FastAPI server on http://localhost:8000
  python -m app.main
  ```

+ ### 3. Frontend Setup

  ```bash
  cd frontend
  npm install
+ # Starts Vite dev server on http://localhost:5173
  npm run dev
  ```

+ ### 4. One-Command Start (Windows)

+ ```powershell
+ # From the repo root — starts both backend and frontend
  .\start.ps1
  ```

+ ---
+
+ ## 🧪 Testing

+ ```bash
+ cd backend
+ pytest tests/ -v
+ ```

+ ---

+ ## 🐳 Docker (Local)

+ ```bash
+ # Build the image
+ docker build -t sanketsetu .
+
+ # Run on port 7860 (matches HF Spaces)
+ docker run -p 7860:7860 sanketsetu
+ ```

+ ---

+ ## ☁️ Deployment

+ ### Backend — Hugging Face Spaces

+ The backend runs as a [Hugging Face Space](https://huggingface.co/spaces/devrajsinh2012/Sanket-Setu) using the **Docker SDK**.

+ **Push to the Space:**

  ```bash
+ # From repo root
+ git push space main
  ```

+ HF Spaces automatically builds the Docker image and serves the container on port 7860.

+ **Space Secrets** (HF Space → Settings → Repository secrets):

+ | Secret | Example value |
+ |--------|---------------|
+ | `CORS_ORIGINS` | `https://sanket-setu.vercel.app,http://localhost:5173` |
+ | `PIPELINE_MODE` | `ensemble` |
+ | `CONFIDENCE_THRESHOLD` | `0.70` |

+ **Live URLs:**

+ | Endpoint | URL |
+ |---|---|
+ | Health check | `https://devrajsinh2012-sanket-setu.hf.space/health` |
+ | WebSocket | `wss://devrajsinh2012-sanket-setu.hf.space/ws/landmarks` |

+ ### Frontend — Vercel

+ Connect the GitHub repository in the [Vercel dashboard](https://vercel.com) and add the **environment variable**:
+
+ | Variable | Value |
+ |---|---|
+ | `VITE_WS_URL` | `wss://devrajsinh2012-sanket-setu.hf.space` |
+
+ Vercel auto-deploys on every push to `main`.
+
+ ---
+
+ ## 🔧 Environment Variables
+
+ | Variable | Scope | Default | Description |
+ |---|---|---|---|
+ | `VITE_WS_URL` | Frontend (build-time) | — | WebSocket URL of the backend |
+ | `CORS_ORIGINS` | Backend (runtime) | `*` | Comma-separated allowed origins |
+ | `PIPELINE_MODE` | Backend (runtime) | `ensemble` | `ensemble` / `A` / `B` / `C` |
+ | `CONFIDENCE_THRESHOLD` | Backend (runtime) | `0.70` | Primary confidence cutoff |
+ | `SECONDARY_THRESHOLD` | Backend (runtime) | `0.60` | Secondary confidence cutoff |
+ | `WEIGHTS_DIR` | Backend (runtime) | repo root | Override path to model weight files |
+
+ ---
+
245
+ ## 🤝 Contributing
246
+
247
+ Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change. Please make sure to update tests as appropriate.
248
+
249
+ ---
250
+
251
+ ## 👥 Team & Acknowledgements
252
+
253
+ This project was developed by:
254
+
255
+ | Name | Contribution |
256
+ |---|---|
257
+ | **Devrajsinh Gohil** | Full-stack development, ML integration, deployment |
258
+ | **Jay Nasit** | Machine learning models, dataset preparation, testing |
259
+
260
+ **Guided by:** Dr. Om Prakash Suthar
261
 
262
+ > We express our sincere gratitude to **Dr. Om Prakash Suthar** for his invaluable guidance, encouragement, and technical insights throughout the development of SanketSetu. His mentorship was instrumental in shaping both the research direction and the system architecture of this project.
263
 
264
+ ---
265
 
266
+ ## 📄 License
267
 
268
+ © 2026 Devrajsinh Gohil & Jay Nasit. All Rights Reserved.
frontend/src/components/Calibration.tsx CHANGED
@@ -39,67 +39,79 @@ export function Calibration({ handDetected, onReady }: Props) {
  const isChecked = done;

  return (
-   <div
-     className="fixed inset-0 z-40 flex flex-col items-center justify-center gap-8"
-     style={{ background: 'rgba(5,8,22,0.92)', backdropFilter: 'blur(12px)' }}
-   >
-     <motion.h1
-       initial={{ opacity: 0, y: -20 }}
        animate={{ opacity: 1, y: 0 }}
-       className="text-3xl font-bold glow-text"
      >
-       Ready?
-     </motion.h1>

-     <motion.p
-       initial={{ opacity: 0 }}
-       animate={{ opacity: 1 }}
-       transition={{ delay: 0.2 }}
-       className="text-slate-400 text-center max-w-xs"
-     >
-       {handDetected
-         ? 'Hand detected — hold steady…'
-         : 'Please show your hand to the camera.'}
-     </motion.p>

-     {/* Circular progress / check */}
-     <div className="relative w-24 h-24 flex items-center justify-center">
-       <svg className="absolute inset-0 w-full h-full -rotate-90" viewBox="0 0 96 96">
-         <circle cx="48" cy="48" r="40" fill="none" stroke="rgba(255,255,255,0.06)" strokeWidth="6" />
-         <motion.circle
-           cx="48" cy="48" r="40"
-           fill="none"
-           stroke="#00f5d4"
-           strokeWidth="6"
-           strokeLinecap="round"
-           strokeDasharray={`${2 * Math.PI * 40}`}
-           animate={{ strokeDashoffset: 2 * Math.PI * 40 * (1 - progress / 100) }}
-           transition={{ duration: 0.1 }}
-           style={{ filter: 'drop-shadow(0 0 6px #00f5d4)' }}
-         />
-       </svg>

-       <AnimatePresence mode="wait">
-         {isChecked ? (
-           <motion.div
-             key="check"
-             initial={{ scale: 0, opacity: 0 }}
-             animate={{ scale: 1, opacity: 1 }}
-             transition={{ type: 'spring', stiffness: 400, damping: 20 }}
-           >
-             <CheckCircle size={40} style={{ color: '#00f5d4' }} />
-           </motion.div>
-         ) : (
-           <motion.div key="spinner" animate={{ rotate: 360 }} transition={{ duration: 1.5, repeat: Infinity, ease: 'linear' }}>
-             <Loader2 size={32} strokeWidth={1.5} style={{ color: handDetected ? '#00f5d4' : '#4b5563' }} />
-           </motion.div>
-         )}
-       </AnimatePresence>
-     </div>

-     <p className="text-xs text-slate-600">
-       {isChecked ? 'All set!' : 'Keep your hand visible for 1 second'}
-     </p>
    </div>
  );
}
 
  const isChecked = done;

  return (
+   /* Transparent outer overlay — camera feed stays fully visible behind */
+   <div className="fixed inset-0 z-40 flex flex-col items-center justify-end pb-10 pointer-events-none">
+
+     {/* Compact card anchored at the bottom so the camera is unobstructed */}
+     <motion.div
+       initial={{ opacity: 0, y: 30 }}
        animate={{ opacity: 1, y: 0 }}
+       className="pointer-events-auto flex flex-col items-center gap-5 px-8 py-6 rounded-2xl"
+       style={{
+         background: 'rgba(5,8,22,0.88)',
+         backdropFilter: 'blur(14px)',
+         border: '1px solid rgba(255,255,255,0.08)',
+         boxShadow: '0 8px 32px rgba(0,0,0,0.5)',
+       }}
      >
+       <motion.h2
+         initial={{ opacity: 0, y: -10 }}
+         animate={{ opacity: 1, y: 0 }}
+         className="text-2xl font-bold glow-text"
+       >
+         Ready?
+       </motion.h2>

+       <motion.p
+         initial={{ opacity: 0 }}
+         animate={{ opacity: 1 }}
+         transition={{ delay: 0.2 }}
+         className="text-slate-400 text-center text-sm max-w-xs"
+       >
+         {handDetected
+           ? 'Hand detected — hold steady…'
+           : 'Show your hand to the camera above.'}
+       </motion.p>

+       {/* Circular progress / check */}
+       <div className="relative w-20 h-20 flex items-center justify-center">
+         <svg className="absolute inset-0 w-full h-full -rotate-90" viewBox="0 0 96 96">
+           <circle cx="48" cy="48" r="40" fill="none" stroke="rgba(255,255,255,0.06)" strokeWidth="6" />
+           <motion.circle
+             cx="48" cy="48" r="40"
+             fill="none"
+             stroke="#00f5d4"
+             strokeWidth="6"
+             strokeLinecap="round"
+             strokeDasharray={`${2 * Math.PI * 40}`}
+             animate={{ strokeDashoffset: 2 * Math.PI * 40 * (1 - progress / 100) }}
+             transition={{ duration: 0.1 }}
+             style={{ filter: 'drop-shadow(0 0 6px #00f5d4)' }}
+           />
+         </svg>

+         <AnimatePresence mode="wait">
+           {isChecked ? (
+             <motion.div
+               key="check"
+               initial={{ scale: 0, opacity: 0 }}
+               animate={{ scale: 1, opacity: 1 }}
+               transition={{ type: 'spring', stiffness: 400, damping: 20 }}
+             >
+               <CheckCircle size={36} style={{ color: '#00f5d4' }} />
+             </motion.div>
+           ) : (
+             <motion.div key="spinner" animate={{ rotate: 360 }} transition={{ duration: 1.5, repeat: Infinity, ease: 'linear' }}>
+               <Loader2 size={28} strokeWidth={1.5} style={{ color: handDetected ? '#00f5d4' : '#4b5563' }} />
+             </motion.div>
+           )}
+         </AnimatePresence>
+       </div>

+       <p className="text-xs text-slate-500">
+         {isChecked ? 'All set!' : 'Keep your hand visible for 1 second'}
+       </p>
+     </motion.div>
    </div>
  );
}