Abdelkader HASSINE committed on
Commit ff03012 · 1 Parent(s): 3c51612

Deploy CU1-X to Hugging Face Spaces


- Multi-model AI pipeline (RF-DETR, CLIP, OCR, BLIP)
- Unified API architecture
- Gradio web interface
- Full model weights included via Git LFS
- Ready for production deployment

.gitattributes CHANGED
@@ -1,35 +1,21 @@
- *.7z filter=lfs diff=lfs merge=lfs -text
- *.arrow filter=lfs diff=lfs merge=lfs -text
  *.bin filter=lfs diff=lfs merge=lfs -text
- *.bz2 filter=lfs diff=lfs merge=lfs -text
- *.ckpt filter=lfs diff=lfs merge=lfs -text
- *.ftz filter=lfs diff=lfs merge=lfs -text
- *.gz filter=lfs diff=lfs merge=lfs -text
- *.h5 filter=lfs diff=lfs merge=lfs -text
- *.joblib filter=lfs diff=lfs merge=lfs -text
- *.lfs.* filter=lfs diff=lfs merge=lfs -text
- *.mlmodel filter=lfs diff=lfs merge=lfs -text
- *.model filter=lfs diff=lfs merge=lfs -text
- *.msgpack filter=lfs diff=lfs merge=lfs -text
- *.npy filter=lfs diff=lfs merge=lfs -text
- *.npz filter=lfs diff=lfs merge=lfs -text
  *.onnx filter=lfs diff=lfs merge=lfs -text
- *.ot filter=lfs diff=lfs merge=lfs -text
- *.parquet filter=lfs diff=lfs merge=lfs -text
  *.pb filter=lfs diff=lfs merge=lfs -text
- *.pickle filter=lfs diff=lfs merge=lfs -text
  *.pkl filter=lfs diff=lfs merge=lfs -text
- *.pt filter=lfs diff=lfs merge=lfs -text
- *.pth filter=lfs diff=lfs merge=lfs -text
- *.rar filter=lfs diff=lfs merge=lfs -text
  *.safetensors filter=lfs diff=lfs merge=lfs -text
- saved_model/**/* filter=lfs diff=lfs merge=lfs -text
- *.tar.* filter=lfs diff=lfs merge=lfs -text
- *.tar filter=lfs diff=lfs merge=lfs -text
- *.tflite filter=lfs diff=lfs merge=lfs -text
- *.tgz filter=lfs diff=lfs merge=lfs -text
- *.wasm filter=lfs diff=lfs merge=lfs -text
- *.xz filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
- *.zst filter=lfs diff=lfs merge=lfs -text
- *tfevents* filter=lfs diff=lfs merge=lfs -text

+ # Git LFS configuration for large files
+ # Required for Hugging Face Spaces deployment
+ # Model weights
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
  *.bin filter=lfs diff=lfs merge=lfs -text
  *.onnx filter=lfs diff=lfs merge=lfs -text
  *.pb filter=lfs diff=lfs merge=lfs -text
+ # Pickle files (model serialization)
  *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ # Other large model formats
+ *.h5 filter=lfs diff=lfs merge=lfs -text
  *.safetensors filter=lfs diff=lfs merge=lfs -text
+ # Archives
+ *.tar.gz filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ # Large data files
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
API_USAGE.md ADDED
@@ -0,0 +1,233 @@
# 🔌 API Usage Guide - Hugging Face Spaces

On Hugging Face Spaces, **only Gradio is exposed publicly**. The FastAPI server (port 8000) is not reachable from outside.

**But Gradio automatically exposes a native REST API!** 🎉

## 📡 Accessing the API from Outside

### Option 1: Native Gradio API (Recommended)

Gradio automatically exposes a REST API at the `/api/predict` endpoint.

#### Python with `gradio_client`:

```python
from gradio_client import Client

# Replace with your Space URL
client = Client("AI-DrivenTesting/CU1-X")

# Call the API
result = client.predict(
    "screenshot.png",                   # image (filepath or PIL Image)
    0.35,                               # confidence_threshold (float)
    2,                                  # thickness (int)
    True,                               # enable_clip (bool)
    True,                               # enable_ocr (bool)
    False,                              # enable_blip (bool)
    False,                              # ocr_only (bool)
    "Only image & button",              # blip_scope (str)
    False,                              # preprocess (bool)
    "RF-DETR Optimized (Recommended)",  # preprocess_mode (str)
    "standard",                         # preprocess_preset (str)
    api_name="/predict"
)

# Result: (annotated_image, summary, detections_json)
annotated_image, summary, detections_json = result
print(detections_json)
```

#### REST API (curl):

```bash
# For a public Space; the first "data" entry is the image
# (a public URL or a base64 data URL; JSON allows no inline comments)
curl -X POST "https://AI-DrivenTesting-CU1-X.hf.space/api/predict" \
  -H "Content-Type: application/json" \
  -d '{
    "data": [
      "screenshot.png",
      0.35,
      2,
      true,
      true,
      false,
      false,
      "Only image & button",
      false,
      "RF-DETR Optimized (Recommended)",
      "standard"
    ]
  }'
```

**Note:** For images, you must either:
- Use a public URL pointing to the image
- Encode the image as base64
- Use `gradio_client`, which handles this automatically

#### REST API with Python `requests`:

```python
import requests
import base64

# Encode the image as base64
def image_to_base64(image_path):
    with open(image_path, "rb") as f:
        return base64.b64encode(f.read()).decode()

# Call the API
url = "https://AI-DrivenTesting-CU1-X.hf.space/api/predict"
image_b64 = image_to_base64("screenshot.png")

response = requests.post(
    url,
    json={
        "data": [
            f"data:image/png;base64,{image_b64}",
            0.35,
            2,
            True,
            True,
            False,
            False,
            "Only image & button",
            False,
            "RF-DETR Optimized (Recommended)",
            "standard"
        ]
    },
    timeout=120
)

result = response.json()
print(result)
```

### Option 2: FastAPI (Internal Only)

The FastAPI server on port 8000 is **NOT reachable from outside** the HF Space.

It works only:
- ✅ Locally (`python app.py`)
- ✅ Between the Space's internal processes
- ❌ **NOT from outside the Space**

## 🔑 Authentication

### Public Spaces
- No authentication required
- API directly accessible

### Private Spaces
- Requires a Hugging Face token
- Add the header: `Authorization: Bearer <HF_TOKEN>`

```python
from gradio_client import Client

client = Client(
    "AI-DrivenTesting/CU1-X",
    hf_token="your_hf_token_here"  # For private Spaces
)
```
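For raw REST calls to a private Space, the same token goes into the request headers. A minimal sketch (the endpoint and payload mirror the `requests` example above; reading the token from an `HF_TOKEN` environment variable is just a convention assumed here):

```python
import os

# Hypothetical: token read from the environment for a private Space
token = os.environ.get("HF_TOKEN", "your_hf_token_here")
headers = {"Authorization": f"Bearer {token}"}

# Same 11-element payload as the public REST example
payload = {"data": ["https://example.com/screenshot.png", 0.35, 2,
                    True, True, False, False, "Only image & button",
                    False, "RF-DETR Optimized (Recommended)", "standard"]}

# Then send it, e.g.:
# requests.post("https://AI-DrivenTesting-CU1-X.hf.space/api/predict",
#               json=payload, headers=headers, timeout=120)
```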

## 📊 API Parameters

| Parameter | Type | Description | Default |
|-----------|------|-------------|---------|
| `image` | file/str | Image to analyze | - |
| `confidence_threshold` | float | Confidence threshold (0.1-0.9) | 0.35 |
| `thickness` | int | Box thickness (1-6) | 2 |
| `enable_clip` | bool | Enable CLIP classification | False |
| `enable_ocr` | bool | Enable OCR extraction | True |
| `enable_blip` | bool | Enable BLIP descriptions | False |
| `ocr_only` | bool | OCR-only mode (skips detection) | False |
| `blip_scope` | str | BLIP scope ("Only image & button" or "All elements") | "Only image & button" |
| `preprocess` | bool | Enable preprocessing | False |
| `preprocess_mode` | str | Preprocessing mode | "RF-DETR Optimized (Recommended)" |
| `preprocess_preset` | str | Preprocessing preset | "standard" |

## 📝 Response Format

```json
{
  "annotated_image": "base64_encoded_image",
  "summary": "Markdown summary text",
  "detections_json": {
    "success": true,
    "detections": [...],
    "total_detections": 10,
    "image_size": {"width": 1080, "height": 1920},
    "parameters": {...},
    "type_distribution": {...}
  }
}
```
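The `detections_json` block is what the client examples unpack as the third return value. A small helper to pull out the headline numbers (a sketch assuming only the keys shown above; `json.loads` covers the case where the payload arrives as a string):

```python
import json

def detection_summary(detections_json):
    # Accept either a dict or a JSON-encoded string
    data = json.loads(detections_json) if isinstance(detections_json, str) else detections_json
    size = data["image_size"]
    return f"{data['total_detections']} elements in a {size['width']}x{size['height']} image"
```

For the sample values above, this returns `"10 elements in a 1080x1920 image"`.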

## 🚀 Complete Examples

### Example 1: Simple Detection

```python
from gradio_client import Client

client = Client("AI-DrivenTesting/CU1-X")

result = client.predict(
    "screenshot.png",
    0.35, 2, False, True, False, False, "Only image & button",
    False, "RF-DETR Optimized (Recommended)", "standard",
    api_name="/predict"
)

annotated_image, summary, detections = result
print(f"Found {detections['total_detections']} elements")
```

### Example 2: Full Detection with CLIP

```python
result = client.predict(
    "screenshot.png",
    0.35, 2, True, True, False, False, "Only image & button",
    False, "RF-DETR Optimized (Recommended)", "standard",
    api_name="/predict"
)
```

### Example 3: OCR Only

```python
result = client.predict(
    "screenshot.png",
    0.35, 2, False, True, False, True, "Only image & button",
    False, "RF-DETR Optimized (Recommended)", "standard",
    api_name="/predict"
)
```

## ⚠️ HF Spaces Limitations

1. **Timeout:** 60 seconds by default (can be increased in Settings)
2. **Memory:** Limited by the chosen hardware
3. **CPU/GPU:** Performance depends on the selected hardware
4. **FastAPI:** Not reachable from outside
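Because of that default timeout, long CLIP/BLIP runs can fail intermittently. One workaround is a small retry wrapper around the client call (a sketch; `predict_with_retry` and its arguments are illustrative, not part of the API):

```python
import time

def predict_with_retry(call, retries=3, delay=5.0):
    """Run a zero-argument callable (e.g. a lambda wrapping client.predict)
    and retry on transport/timeout errors."""
    for attempt in range(retries):
        try:
            return call()
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries: surface the last error
            time.sleep(delay)
```

Usage: `predict_with_retry(lambda: client.predict("screenshot.png", 0.35, 2, True, True, False, False, "Only image & button", False, "RF-DETR Optimized (Recommended)", "standard", api_name="/predict"))`.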

## 🔗 Useful Links

- [Gradio Client Docs](https://www.gradio.app/guides/getting-started-with-the-python-client)
- [HF Spaces API Docs](https://huggingface.co/docs/hub/spaces-sdks-gradio#api-tab)
- [HF Authentication](https://huggingface.co/docs/hub/security-tokens)

## 💡 Tips

- Use `gradio_client` for better image handling
- For large files, use public URLs
- Enable preprocessing for consistent results across devices
- OCR-only mode is faster if you only need the text

DEPLOYMENT.md ADDED
@@ -0,0 +1,164 @@
# 🚀 Hugging Face Spaces Deployment Guide

## 📋 Available Scripts

### 1. `check_hf_space.sh` - Pre-Deployment Check

Checks that everything is ready before deploying:

```bash
./check_hf_space.sh
```

**Checks:**
- ✅ Python version (>= 3.12)
- ✅ Required files (app.py, requirements.txt, etc.)
- ✅ Required directories (detection/, api/, ui/, rfdetr/)
- ✅ model.pth present and tracked by Git LFS
- ✅ Git LFS configuration
- ✅ README.md metadata (YAML frontmatter)
- ✅ Complete requirements.txt
- ✅ Valid Python syntax
- ✅ Git configuration and HF remote
- ✅ Hugging Face CLI login

### 2. `deploy_hf_space.sh` - Automated Deployment

Deploys automatically to Hugging Face Spaces:

```bash
./deploy_hf_space.sh
```

**Does automatically:**
- ✅ Configures Git LFS for model.pth
- ✅ Checks/configures the HF remote
- ✅ Checks the HF login
- ✅ Updates requirements.txt if needed
- ✅ Stages all files
- ✅ Commits with a descriptive message
- ✅ Pushes to HF Spaces
- ✅ Prints the Space URL

## 🎯 Recommended Workflow

### Step 1: Check

```bash
./check_hf_space.sh
```

**Expected result:**
```
✅ All checks passed! Ready to deploy! ✨
```

### Step 2: Deploy

```bash
./deploy_hf_space.sh
```

The script will:
1. Check Git LFS
2. Configure the remote if needed
3. Check the HF login
4. Commit and push
5. Print the Space URL

### Step 3: Watch the Build

The script prints your Space URL:
```
https://huggingface.co/spaces/YOUR_USERNAME/CU1-X
```

Click **"Logs"** to watch the build live.

## 📡 Accessing the API

Once deployed, your API is reachable via:

### Native Gradio API

```python
from gradio_client import Client

client = Client("AI-DrivenTesting/CU1-X")
result = client.predict(
    "screenshot.png",
    0.35, 2, True, True, False, False, "Only image & button",
    False, "RF-DETR Optimized (Recommended)", "standard",
    api_name="/predict"
)
```

**See:** `API_USAGE.md` for more details

## 🔧 Troubleshooting

### Error: "Git LFS not installed"

```bash
# macOS
brew install git-lfs
git lfs install

# Linux
sudo apt install git-lfs
git lfs install
```

### Error: "Not logged in"

```bash
hf login
# OR
huggingface-cli login
```

### Error: "model.pth not tracked by LFS"

```bash
git lfs track "*.pth"
git add .gitattributes model.pth
git commit -m "Add model with LFS"
```

### Error: "No remote configured"

The `deploy_hf_space.sh` script will offer to configure the remote automatically.

Or manually:
```bash
git remote add origin https://huggingface.co/spaces/YOUR_USERNAME/CU1-X
```

## 📊 Quick Checklist

Before deploying:

- [ ] `./check_hf_space.sh` passes all tests
- [ ] Git LFS installed and configured
- [ ] Logged in to Hugging Face (`hf login`)
- [ ] model.pth present (~510MB)
- [ ] HF remote configured

To deploy:

```bash
./deploy_hf_space.sh
```

## 🎉 After Deployment

Your Space will be reachable at:
- **Web UI:** `https://huggingface.co/spaces/YOUR_USERNAME/CU1-X`
- **API:** `https://YOUR_USERNAME-CU1-X.hf.space/api/predict`

**Build time:** 5-10 minutes (first time)

---

**Need help?** See `API_USAGE.md` to use the API!

QUICK_DEPLOY.md ADDED
@@ -0,0 +1,54 @@
# ⚡ Quick Deploy - 2 Commands

## 🚀 Deploy in 2 Steps

### 1️⃣ Check that everything is OK

```bash
./check_hf_space.sh
```

**Expected result:** ✅ All checks passed!

### 2️⃣ Deploy to HF Spaces

```bash
./deploy_hf_space.sh
```

**That's it!** 🎉

## 📡 After Deployment

Your Space will be reachable at:
- **Web UI:** https://huggingface.co/spaces/AI-DrivenTesting/CU1-X
- **API:** https://AI-DrivenTesting-CU1-X.hf.space/api/predict

## 🔌 Using the API

```python
from gradio_client import Client

client = Client("AI-DrivenTesting/CU1-X")
result = client.predict(
    "screenshot.png",
    0.35, 2, True, True, False, False, "Only image & button",
    False, "RF-DETR Optimized (Recommended)", "standard",
    api_name="/predict"
)

annotated_image, summary, detections = result
print(detections)
```

**See:** `API_USAGE.md` for more examples

## ⏱️ Build Time

- **First build:** 5-10 minutes
- **Subsequent builds:** 2-3 minutes

---

**That's it! Quick and simple! 🚀**

README.md CHANGED
@@ -1,12 +1,24 @@
  ---
- title: "CU1-X UI Element Detector"
- emoji: "🧠"
- colorFrom: "blue"
- colorTo: "purple"
- sdk: "gradio"
- sdk_version: "4.44.1"
- app_file: "app.py"
+ title: CU1-X UI Element Detector
+ emoji: 🧠
+ colorFrom: blue
+ colorTo: purple
+ sdk: gradio
+ sdk_version: 4.44.1
+ app_file: app.py
  pinned: false
+ python_version: 3.12
+ models:
+   - Roboflow/RF-DETR
+   - openai/clip-vit-base-patch32
+   - Salesforce/blip-image-captioning-base
+ tags:
+   - computer-vision
+   - object-detection
+   - ui-elements
+   - ocr
+   - transformers
+ license: mit
  ---

  # CU-1 UI Element Detector

@@ -382,26 +394,44 @@ Detection performance depends on enabled features:

  ## 🤗 Deploying to Hugging Face Spaces

- ### Quick Deploy
+ ### 🚀 Quick Deploy (2 Commands)
+
+ **Option 1: Automated Scripts (Recommended)**
+
+ ```bash
+ # 1. Check that everything is ready
+ ./check_hf_space.sh
+
+ # 2. Deploy automatically
+ ./deploy_hf_space.sh
+ ```
+
+ **Option 2: Manual**

  1. **Create a new Space** on Hugging Face
     - Choose "Gradio" as SDK
     - Select hardware (CPU or GPU)

- 2. **Upload these files:**
+ 2. **Clone and push:**
  ```bash
- app.py            # Unified entry point (API + UI)
- app_api.py        # API server (launched by app.py)
- requirements.txt  # Dependencies
- detection/        # Detection modules
- api/              # API endpoints
- ui/               # UI components
- model.pth         # Model weights
- README.md         # Documentation
+ git clone https://huggingface.co/spaces/YOUR_USERNAME/CU1-X
+ cd CU1-X
+ # Copy files from your project
+ git lfs install
+ git lfs track "*.pth"
+ git add .
+ git commit -m "Initial deployment"
+ git push origin main
  ```

  3. **Space will auto-deploy** - First run takes 5-10 minutes (model download)

+ ### 📚 Documentation
+
+ - **[QUICK_DEPLOY.md](QUICK_DEPLOY.md)** - Ultra-quick guide (2 commands)
+ - **[DEPLOYMENT.md](DEPLOYMENT.md)** - Full step-by-step guide
+ - **[API_USAGE.md](API_USAGE.md)** - How to call the API from outside
+
  ### Unified Architecture

  **NEW:** `app.py` now uses the same unified API architecture everywhere:
README_DEPLOYMENT.md ADDED
@@ -0,0 +1,81 @@
# 📦 Summary - HF Spaces Deployment Files

## ✅ Files Created

### 🚀 Deployment Scripts

1. **`check_hf_space.sh`** - Pre-deployment check script
   - Checks 10 critical points
   - Prints warnings and errors
   - Exit code 0 if OK, 1 on errors

2. **`deploy_hf_space.sh`** - Automated deployment script
   - Configures Git LFS automatically
   - Checks/configures the HF remote
   - Commits and pushes to HF Spaces
   - Prints the Space URL

### 📚 Documentation

1. **`API_USAGE.md`** - Complete API usage guide
   - How to use the native Gradio API
   - Python and REST examples
   - Parameters and response format

2. **`DEPLOYMENT.md`** - Detailed deployment guide
   - Step-by-step workflow
   - Troubleshooting
   - Checklist

3. **`QUICK_DEPLOY.md`** - Ultra-quick guide
   - 2 commands to deploy
   - Quick API example

### 📝 Configuration

1. **`requirements-full.txt`** - All dependencies
2. **`requirements.txt`** - Copy of requirements-full.txt (for HF)
3. **`.gitattributes`** - Git LFS configuration for *.pth
4. **`README.md`** - Updated with HF Spaces metadata

### 💡 Examples

1. **`examples/api_example.py`** - Python example of API usage

## 🎯 Usage

### Check before deployment:
```bash
./check_hf_space.sh
```

### Deploy:
```bash
./deploy_hf_space.sh
```

## 📊 Current Status

✅ **Everything is ready!**

- ✅ All required files present
- ✅ model.pth tracked by Git LFS
- ✅ Git LFS configured
- ✅ README.md with HF metadata
- ✅ Complete requirements.txt
- ✅ HF remote configured
- ✅ Logged in to Hugging Face

**Next step:** `./deploy_hf_space.sh`

## 🔗 Important URLs

Once deployed:
- **Space:** https://huggingface.co/spaces/AI-DrivenTesting/CU1-X
- **API:** https://AI-DrivenTesting-CU1-X.hf.space/api/predict
- **Logs:** https://huggingface.co/spaces/AI-DrivenTesting/CU1-X/logs

---

**Everything is ready for deployment! 🚀**

app.py CHANGED
@@ -141,11 +141,13 @@ def main():
      print(f"\n🎨 Starting Gradio UI on http://localhost:{UI_PORT}...\n")

      # Launch Gradio with automatic port fallback
+     # API is automatically exposed at /api/predict for HF Spaces
      try:
          demo.queue().launch(
              server_name=UI_HOST,
              server_port=UI_PORT,
-             share=False
+             share=False,
+             api_name="predict"  # Explicitly expose API endpoint
          )
      except OSError as e:
          if "Cannot find empty port" in str(e):
check_hf_space.sh ADDED
@@ -0,0 +1,286 @@
#!/bin/bash
# Pre-deployment check script for Hugging Face Spaces
# Verifies that everything is ready to deploy

set -e

# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'

print_info() { echo -e "${BLUE}ℹ️ $1${NC}"; }
print_success() { echo -e "${GREEN}✅ $1${NC}"; }
print_warning() { echo -e "${YELLOW}⚠️ $1${NC}"; }
print_error() { echo -e "${RED}❌ $1${NC}"; }

FAILURES=0
WARNINGS=0

echo ""
print_info "🔍 Hugging Face Spaces Pre-Deployment Check"
echo "================================================"
echo ""

# Test 1: Python version
print_info "Test 1: Python version..."
PYTHON_VERSION=$(python --version 2>&1 | awk '{print $2}')
PYTHON_MAJOR=$(echo $PYTHON_VERSION | cut -d. -f1)
PYTHON_MINOR=$(echo $PYTHON_VERSION | cut -d. -f2)

if [ "$PYTHON_MAJOR" -ge 3 ] && [ "$PYTHON_MINOR" -ge 12 ]; then
    print_success "Python $PYTHON_VERSION (>= 3.12)"
else
    print_warning "Python $PYTHON_VERSION (recommended: >= 3.12)"
    WARNINGS=$((WARNINGS + 1))
fi
echo ""

# Test 2: Required files
print_info "Test 2: Required files..."
REQUIRED_FILES=(
    "app.py"
    "app_api.py"
    "app_ui.py"
    "requirements.txt"
    "README.md"
    ".gitattributes"
)

for file in "${REQUIRED_FILES[@]}"; do
    if [ -f "$file" ]; then
        print_success "$file exists"
    else
        print_error "$file NOT FOUND"
        FAILURES=$((FAILURES + 1))
    fi
done
echo ""

# Test 3: Required directories
print_info "Test 3: Required directories..."
REQUIRED_DIRS=(
    "detection"
    "api"
    "ui"
    "rfdetr"
)

for dir in "${REQUIRED_DIRS[@]}"; do
    if [ -d "$dir" ]; then
        print_success "$dir/ exists"
    else
        print_error "$dir/ NOT FOUND"
        FAILURES=$((FAILURES + 1))
    fi
done
echo ""

# Test 4: model.pth
print_info "Test 4: Model weights (model.pth)..."
if [ -f "model.pth" ]; then
    SIZE=$(du -h model.pth | cut -f1)
    SIZE_BYTES=$(stat -f%z model.pth 2>/dev/null || stat -c%s model.pth)

    if [ $SIZE_BYTES -gt 100000000 ]; then # > 100MB
        print_success "model.pth exists ($SIZE)"

        # Check Git LFS
        if git lfs ls-files | grep -q "model.pth"; then
            print_success "model.pth tracked by Git LFS"
        else
            print_warning "model.pth NOT tracked by Git LFS (will fail on push)"
            WARNINGS=$((WARNINGS + 1))
        fi
    else
        print_warning "model.pth size: $SIZE (seems small, verify it's correct)"
        WARNINGS=$((WARNINGS + 1))
    fi
else
    print_error "model.pth NOT FOUND"
    FAILURES=$((FAILURES + 1))
fi
echo ""

# Test 5: Git LFS
print_info "Test 5: Git LFS configuration..."
if command -v git-lfs &> /dev/null; then
    print_success "Git LFS installed"

    if git lfs env &> /dev/null; then
        print_success "Git LFS initialized"
    else
        print_warning "Git LFS not initialized"
        WARNINGS=$((WARNINGS + 1))
    fi

    if grep -q "*.pth.*lfs" .gitattributes 2>/dev/null; then
        print_success ".gitattributes tracks *.pth"
    else
        print_error ".gitattributes doesn't track *.pth"
        FAILURES=$((FAILURES + 1))
    fi
else
    print_error "Git LFS not installed!"
    print_info "  Install: brew install git-lfs (macOS) or sudo apt install git-lfs (Linux)"
    FAILURES=$((FAILURES + 1))
fi
echo ""

# Test 6: README.md frontmatter
print_info "Test 6: README.md frontmatter (HF Spaces metadata)..."
if [ -f "README.md" ]; then
    if head -n 1 README.md | grep -q "^---$"; then
        print_success "README.md has YAML frontmatter"

        # Check key fields
        if grep -q "^sdk: gradio" README.md; then
            print_success "sdk: gradio found"
        else
            print_warning "sdk: gradio not found"
            WARNINGS=$((WARNINGS + 1))
        fi

        if grep -q "^app_file: app.py" README.md; then
            print_success "app_file: app.py found"
        else
            print_warning "app_file: app.py not found"
            WARNINGS=$((WARNINGS + 1))
        fi

        if grep -q "^python_version:" README.md; then
            print_success "python_version specified"
        else
            print_warning "python_version not specified"
            WARNINGS=$((WARNINGS + 1))
        fi
    else
        print_error "README.md missing YAML frontmatter"
        FAILURES=$((FAILURES + 1))
    fi
else
    print_error "README.md not found"
    FAILURES=$((FAILURES + 1))
fi
echo ""

# Test 7: requirements.txt
print_info "Test 7: requirements.txt..."
if [ -f "requirements.txt" ]; then
    if [ -s "requirements.txt" ]; then
        LINE_COUNT=$(wc -l < requirements.txt)
        if [ $LINE_COUNT -gt 5 ]; then
            print_success "requirements.txt looks complete ($LINE_COUNT lines)"
        else
            print_warning "requirements.txt seems minimal ($LINE_COUNT lines)"
            WARNINGS=$((WARNINGS + 1))
        fi

        # Check for critical dependencies
        if grep -q "gradio" requirements.txt; then
            print_success "gradio found in requirements.txt"
        else
            print_error "gradio NOT found in requirements.txt"
            FAILURES=$((FAILURES + 1))
        fi

        if grep -q "torch" requirements.txt; then
            print_success "torch found in requirements.txt"
        else
            print_warning "torch not found (may be needed)"
            WARNINGS=$((WARNINGS + 1))
        fi
    else
        print_error "requirements.txt is empty"
        FAILURES=$((FAILURES + 1))
    fi
else
    print_error "requirements.txt not found"
    FAILURES=$((FAILURES + 1))
fi
echo ""

# Test 8: Python syntax
print_info "Test 8: Python syntax validation..."
for pyfile in app.py app_api.py app_ui.py; do
    if [ -f "$pyfile" ]; then
        if python -m py_compile "$pyfile" 2>/dev/null; then
            print_success "$pyfile syntax valid"
        else
            print_error "$pyfile has syntax errors"
            FAILURES=$((FAILURES + 1))
        fi
    fi
done
echo ""

# Test 9: Git repository
print_info "Test 9: Git repository..."
if [ -d ".git" ]; then
    print_success "Git repository initialized"

    # Check remote
    if git remote -v | grep -q "huggingface.co"; then
        REMOTE_URL=$(git remote get-url origin 2>/dev/null || echo "unknown")
        print_success "HF Space remote configured: $REMOTE_URL"
    else
        print_warning "No Hugging Face remote configured"
        WARNINGS=$((WARNINGS + 1))
    fi

    # Check for uncommitted changes
    if [ -n "$(git status --porcelain)" ]; then
        print_warning "Uncommitted changes detected"
        WARNINGS=$((WARNINGS + 1))
    else
        print_success "All changes committed"
    fi
else
    print_warning "Not a git repository (will need git init)"
    WARNINGS=$((WARNINGS + 1))
fi
echo ""

# Test 10: Hugging Face CLI
print_info "Test 10: Hugging Face CLI..."
if command -v huggingface-cli &> /dev/null || command -v hf &> /dev/null; then
    print_success "Hugging Face CLI installed"

    # Check login
    if huggingface-cli whoami &> /dev/null 2>&1 || hf auth whoami &> /dev/null 2>&1; then
        USERNAME=$(huggingface-cli whoami 2>/dev/null || hf auth whoami 2>/dev/null | head -n1)
        print_success "Logged in as: $USERNAME"
    else
        print_warning "Not logged in to Hugging Face"
        print_info "  Run: huggingface-cli login or hf login"
        WARNINGS=$((WARNINGS + 1))
    fi
else
    print_warning "Hugging Face CLI not installed"
    print_info "  Install: pip install huggingface-hub"
    WARNINGS=$((WARNINGS + 1))
fi
echo ""

# Summary
echo "================================================"
if [ $FAILURES -eq 0 ] && [ $WARNINGS -eq 0 ]; then
    print_success "All checks passed! Ready to deploy! ✨"
    echo ""
    print_info "Next step: Run ./deploy_hf_space.sh"
    exit 0
elif [ $FAILURES -eq 0 ]; then
    print_warning "$WARNINGS warning(s) found"
    echo ""
    print_info "You can proceed, but consider fixing warnings"
    print_info "Next step: Run ./deploy_hf_space.sh"
    exit 0
else
    print_error "$FAILURES critical error(s) and $WARNINGS warning(s)"
    echo ""
    print_info "Please fix the errors before deploying"
    exit 1
fi

deploy_hf_space.sh ADDED
@@ -0,0 +1,210 @@
1
+#!/bin/bash
+# Deployment script for Hugging Face Spaces
+# Builds the Space and pushes it to Hugging Face
+
+set -e
+
+# Colors
+RED='\033[0;31m'
+GREEN='\033[0;32m'
+YELLOW='\033[1;33m'
+BLUE='\033[0;34m'
+NC='\033[0m'
+
+print_info() { echo -e "${BLUE}ℹ️ $1${NC}"; }
+print_success() { echo -e "${GREEN}✅ $1${NC}"; }
+print_warning() { echo -e "${YELLOW}⚠️ $1${NC}"; }
+print_error() { echo -e "${RED}❌ $1${NC}"; }
+
+echo ""
+print_info "🚀 Deploying CU1-X to Hugging Face Spaces"
+echo "================================================"
+echo ""
+
+# Check that we are in a git repo
+if [ ! -d ".git" ]; then
+    print_error "Not a git repository!"
+    print_info "Initializing git repository..."
+    git init
+    print_success "Git repository initialized"
+fi
+
+# Check Git LFS
+print_info "Configuring Git LFS..."
+if ! command -v git-lfs &> /dev/null; then
+    print_error "Git LFS not installed!"
+    print_info "Install with: brew install git-lfs (macOS) or sudo apt install git-lfs (Linux)"
+    exit 1
+fi
+
+git lfs install > /dev/null 2>&1 || true
+
+# Ensure model.pth is tracked
+if [ -f "model.pth" ]; then
+    if ! git lfs ls-files | grep -q "model.pth"; then
+        print_info "Adding model.pth to Git LFS..."
+        git lfs track "*.pth"
+        git add .gitattributes
+        print_success "model.pth configured for Git LFS"
+    else
+        print_success "model.pth already tracked by Git LFS"
+    fi
+else
+    print_error "model.pth not found!"
+    exit 1
+fi
+
+# Check HF remote
+print_info "Checking Hugging Face remote..."
+if git remote | grep -q "origin"; then
+    REMOTE_URL=$(git remote get-url origin 2>/dev/null || echo "")
+    if echo "$REMOTE_URL" | grep -q "huggingface.co"; then
+        print_success "HF Space remote configured: $REMOTE_URL"
+        # Extract "owner/space", dropping any trailing .git suffix
+        SPACE_URL=$(echo "$REMOTE_URL" | sed -E -e 's|.*spaces/([^/]+)/([^/]+).*|\1/\2|' -e 's|\.git$||')
+        print_info "Space URL: https://huggingface.co/spaces/$SPACE_URL"
+    else
+        print_warning "Remote exists but doesn't look like an HF Space"
+        print_info "Current remote: $REMOTE_URL"
+    fi
+else
+    print_warning "No remote configured"
+    print_info "You'll need to add a remote:"
+    print_info "  git remote add origin https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME"
+    read -p "Do you want to configure it now? (y/n) " -n 1 -r
+    echo
+    if [[ $REPLY =~ ^[Yy]$ ]]; then
+        read -p "Enter your HF username: " HF_USERNAME
+        read -p "Enter your Space name: " SPACE_NAME
+        git remote add origin "https://huggingface.co/spaces/$HF_USERNAME/$SPACE_NAME"
+        print_success "Remote configured"
+        SPACE_URL="$HF_USERNAME/$SPACE_NAME"
+    else
+        print_error "Cannot deploy without a remote"
+        exit 1
+    fi
+fi
+
+# Check login
+print_info "Checking Hugging Face login..."
+if command -v hf &> /dev/null; then
+    if hf auth whoami &> /dev/null; then
+        USERNAME=$(hf auth whoami 2>/dev/null | head -n1)
+        print_success "Logged in as: $USERNAME"
+    else
+        print_warning "Not logged in"
+        print_info "Logging in..."
+        hf login
+    fi
+elif command -v huggingface-cli &> /dev/null; then
+    if huggingface-cli whoami &> /dev/null; then
+        USERNAME=$(huggingface-cli whoami 2>/dev/null | head -n1)
+        print_success "Logged in as: $USERNAME"
+    else
+        print_warning "Not logged in"
+        print_info "Logging in..."
+        huggingface-cli login
+    fi
+else
+    print_error "Hugging Face CLI not found!"
+    print_info "Install: pip install huggingface-hub"
+    exit 1
+fi
+
+# Ensure requirements.txt is complete
+print_info "Checking requirements.txt..."
+if [ -f "requirements-full.txt" ] && [ -f "requirements.txt" ]; then
+    FULL_LINES=$(wc -l < requirements-full.txt)
+    CURRENT_LINES=$(wc -l < requirements.txt)
+
+    if [ "$CURRENT_LINES" -lt "$FULL_LINES" ]; then
+        print_warning "requirements.txt seems incomplete"
+        read -p "Use requirements-full.txt? (y/n) " -n 1 -r
+        echo
+        if [[ $REPLY =~ ^[Yy]$ ]]; then
+            cp requirements-full.txt requirements.txt
+            print_success "Updated requirements.txt from requirements-full.txt"
+        fi
+    fi
+fi
+
+# Stage all files
+print_info "Staging files..."
+git add .
+print_success "Files staged"
+
+# Check whether there are staged changes
+if [ -z "$(git status --porcelain)" ]; then
+    print_warning "No changes to commit"
+    print_info "Everything is already up to date"
+else
+    # Show what will be committed
+    print_info "Changes to commit:"
+    git status --short
+
+    # Commit
+    print_info "Creating commit..."
+    COMMIT_MSG="Deploy CU1-X to Hugging Face Spaces
+
+- Multi-model AI pipeline (RF-DETR, CLIP, OCR, BLIP)
+- Unified API architecture
+- Gradio web interface
+- Full model weights included via Git LFS
+- Ready for production deployment"
+
+    git commit -m "$COMMIT_MSG" || {
+        print_error "Commit failed"
+        exit 1
+    }
+    print_success "Changes committed"
+fi
+
+# Push to Hugging Face
+print_info "Pushing to Hugging Face Spaces..."
+print_warning "This may take several minutes (model.pth is 510MB)..."
+echo ""
+
+BRANCH=$(git branch --show-current 2>/dev/null || echo "main")
+
+# Pipe through tee to keep a log, but check the push's own exit status, not tee's
+git push -u origin "$BRANCH" 2>&1 | tee /tmp/hf_push.log
+if [ "${PIPESTATUS[0]}" -eq 0 ]; then
+    print_success "Push completed successfully!"
+    echo ""
+    echo "================================================"
+    print_success "🎉 Deployment Successful!"
+    echo "================================================"
+    echo ""
+
+    if [ -n "$SPACE_URL" ]; then
+        print_info "Your Space is deploying at:"
+        echo "  https://huggingface.co/spaces/$SPACE_URL"
+        echo ""
+        print_info "Build progress:"
+        echo "  https://huggingface.co/spaces/$SPACE_URL/logs"
+        echo ""
+        print_info "Once deployed, your app will be at:"
+        echo "  https://huggingface.co/spaces/$SPACE_URL"
+        echo ""
+        print_info "API endpoint:"
+        # The *.hf.space host is "owner-name", lowercased
+        SPACE_HOST=$(echo "$SPACE_URL" | tr '/' '-' | tr '[:upper:]' '[:lower:]')
+        echo "  https://$SPACE_HOST.hf.space/api/predict"
+        echo ""
+    fi
+
+    print_warning "First build may take 5-10 minutes"
+    print_info "HF Spaces will automatically:"
+    print_info "  - Install dependencies from requirements.txt"
+    print_info "  - Download models (CLIP, BLIP, EasyOCR)"
+    print_info "  - Start app.py"
+    print_info "  - Expose the Gradio interface and API"
+    echo ""
+    print_success "All done! 🎉"
+else
+    print_error "Push failed!"
+    echo ""
+    print_info "Common issues:"
+    print_info "1. Authentication failed: Run 'hf login' or 'huggingface-cli login'"
+    print_info "2. Git LFS error: Ensure Git LFS is installed and model.pth is tracked"
+    print_info "3. Network error: Check your internet connection"
+    echo ""
+    print_info "Check the error above for details"
+    exit 1
+fi
+
examples/api_example.py ADDED
@@ -0,0 +1,94 @@
+"""
+Example: Using the CU1-X API from a Hugging Face Space
+
+This example shows how to call the CU1-X API deployed on Hugging Face Spaces.
+"""
+
+from gradio_client import Client, handle_file
+import json
+
+# Configuration
+SPACE_URL = "AI-DrivenTesting/CU1-X"  # Replace with your own Space URL
+
+
+def detect_ui_elements(image_path: str):
+    """
+    Detect UI elements in an image via the HF Space API.
+
+    Args:
+        image_path: Path to the image to analyze
+
+    Returns:
+        Tuple (annotated_image, summary, detections_json)
+    """
+    # Create the Gradio client
+    client = Client(SPACE_URL)
+
+    # Call the API (recent gradio_client versions require handle_file() for file inputs)
+    result = client.predict(
+        handle_file(image_path),            # image
+        0.35,                               # confidence_threshold
+        2,                                  # thickness
+        True,                               # enable_clip (classification)
+        True,                               # enable_ocr (text extraction)
+        False,                              # enable_blip (descriptions)
+        False,                              # ocr_only
+        "Only image & button",              # blip_scope
+        False,                              # preprocess
+        "RF-DETR Optimized (Recommended)",  # preprocess_mode
+        "standard",                         # preprocess_preset
+        api_name="/predict"
+    )
+
+    # Unpack the results
+    annotated_image, summary, detections_json = result
+
+    return annotated_image, summary, detections_json
+
+
+def main():
+    """Usage example"""
+
+    print("🚀 CU1-X API Example")
+    print("=" * 50)
+
+    # Path to your test image
+    test_image = "screenshot.png"  # Replace with your image
+
+    try:
+        print(f"\n📤 Uploading image: {test_image}")
+        print("⏳ Processing... (this may take 30-60 seconds)")
+
+        # Call the API
+        annotated_image, summary, detections = detect_ui_elements(test_image)
+
+        # Display the results
+        print("\n✅ Detection completed!")
+        print("\n📊 Summary:")
+        print(summary)
+
+        print("\n🔍 Detections:")
+        if isinstance(detections, str):
+            detections = json.loads(detections)
+
+        print(f"  Total: {detections.get('total_detections', 0)} elements")
+
+        if 'type_distribution' in detections:
+            print("\n📈 Type Distribution:")
+            for elem_type, count in detections['type_distribution'].items():
+                print(f"  {elem_type}: {count}")
+
+        print("\n💾 Saving annotated image...")
+        # annotated_image is a temporary file; copy it if you want to keep it
+        print(f"  Annotated image saved at: {annotated_image}")
+
+    except Exception as e:
+        print(f"\n❌ Error: {e}")
+        print("\nTroubleshooting:")
+        print("1. Check that your Space is deployed and online")
+        print("2. Check that SPACE_URL is correct")
+        print("3. Make sure you've installed: pip install gradio_client")
+
+
+if __name__ == "__main__":
+    main()
+
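The detections payload returned by the example above can be post-processed with the standard library alone. The sketch below assumes the schema used in `api_example.py` (`total_detections` plus a `type_distribution` mapping); the helper name is illustrative:

```python
import json


def summarize_detections(detections_json: str) -> str:
    """Build a one-line report from the detections payload returned by the Space."""
    data = json.loads(detections_json)
    total = data.get("total_detections", 0)
    dist = data.get("type_distribution", {})
    # Sort by element type for a stable, readable summary
    parts = [f"{count} {elem_type}" for elem_type, count in sorted(dist.items())]
    return f"{total} elements ({', '.join(parts)})" if parts else f"{total} elements"


sample = json.dumps({
    "total_detections": 5,
    "type_distribution": {"button": 3, "image": 2},
})
print(summarize_detections(sample))  # → 5 elements (3 button, 2 image)
```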
requirements-full.txt ADDED
@@ -0,0 +1,40 @@
+# Full requirements for CU1-X UI Element Detector
+# Use this file for deployment to Hugging Face Spaces or production
+
+# Core dependencies
+gradio[oauth]==4.44.1
+
+# Deep Learning frameworks
+torch==2.4.1
+torchvision==0.19.1
+
+# Computer Vision & Image Processing
+opencv-python-headless==4.10.0.84
+pillow==10.4.0
+numpy==1.26.4
+supervision==0.23.0
+
+# OCR & Text Recognition
+easyocr==1.7.1
+
+# Transformers & AI Models
+transformers==4.44.2
+
+# RF-DETR Detection Model
+rfdetr==1.0.4
+
+# API Framework
+fastapi==0.115.0
+uvicorn[standard]==0.30.6
+
+# HTTP Clients
+requests==2.32.3
+aiohttp==3.10.5
+
+# Testing
+pytest==8.3.3
+
+# Utilities
+python-multipart==0.0.9  # For FastAPI file uploads
+python-dotenv==1.0.1  # For environment variables
+
requirements.txt CHANGED
@@ -1 +1,40 @@
-gradio[oauth]==4.44.1
+# Full requirements for CU1-X UI Element Detector
+# Use this file for deployment to Hugging Face Spaces or production
+
+# Core dependencies
+gradio[oauth]==4.44.1
+
+# Deep Learning frameworks
+torch==2.4.1
+torchvision==0.19.1
+
+# Computer Vision & Image Processing
+opencv-python-headless==4.10.0.84
+pillow==10.4.0
+numpy==1.26.4
+supervision==0.23.0
+
+# OCR & Text Recognition
+easyocr==1.7.1
+
+# Transformers & AI Models
+transformers==4.44.2
+
+# RF-DETR Detection Model
+rfdetr==1.0.4
+
+# API Framework
+fastapi==0.115.0
+uvicorn[standard]==0.30.6
+
+# HTTP Clients
+requests==2.32.3
+aiohttp==3.10.5
+
+# Testing
+pytest==8.3.3
+
+# Utilities
+python-multipart==0.0.9  # For FastAPI file uploads
+python-dotenv==1.0.1  # For environment variables
+