Smilesjs committed
Commit cd9c68b · verified · 1 Parent(s): c0b476e

Upload folder using huggingface_hub

This view is limited to 50 files because it contains too many changes. See raw diff.
Files changed (50)
  1. README.md +2 -0
  2. README_IMAGE_ANALYSIS.md +180 -0
  3. aaa.ipynb +608 -0
  4. check.ipynb +219 -0
  5. hug.ipynb +79 -0
  6. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_04b3c20d-b045-492a-b70e-193d5f69c01f.jpg +3 -0
  7. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_07e716b7-fb4d-46be-bd23-483f72f22573.jpg +3 -0
  8. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_0cb7855f-7a57-4558-ab62-98d3243f2e30.jpg +3 -0
  9. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_0dc7f4b6-d55f-437b-a8e3-b0e73e80fefd.jpg +3 -0
  10. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_0f3bb084-ccc2-45e3-a953-e30206e8f33f.jpg +3 -0
  11. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_104ea969-f60a-49fa-88a8-aae84291221d.jpg +3 -0
  12. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_10a57085-8e77-40b2-bcc6-25c32c5d604e.jpg +3 -0
  13. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_11958e63-89e3-4015-9b6a-2d959488db5a.jpg +3 -0
  14. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1228a5f6-a305-4e84-a2a8-f481fbc8f729.jpg +3 -0
  15. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_122d139e-bb48-48a5-bddc-64e2f60f3d8e.jpg +3 -0
  16. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1268830b-2f2b-409b-b596-de2893ea6e2c.jpg +3 -0
  17. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_141731d0-e86d-4a28-882a-e73ff48b5f9b.jpg +3 -0
  18. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_179f0cd1-5ccd-4995-9aa2-04dc8fef3768.jpg +3 -0
  19. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1a1436ab-1453-4616-9290-e97bde2b7614.jpg +3 -0
  20. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1a33e376-6c98-41cf-8216-86bca823df51.jpg +3 -0
  21. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1f485d91-7ea8-492c-8d96-45f9dfd2369a.jpg +3 -0
  22. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1f6a7ca3-18c7-4b5a-ae9b-c1b04be3cb2b.jpg +3 -0
  23. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_20528fb0-f505-4dd1-b2e3-d95eabfdf46a.jpg +3 -0
  24. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2224a3d5-9acf-4f54-b691-72bc4ed8d419.jpg +3 -0
  25. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_232ca722-a3ea-4585-b059-25e272c5d580.jpg +3 -0
  26. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_278eb6dc-6219-4ab6-b146-3ac9a3b6a6ae.jpg +3 -0
  27. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2ab7df51-f68a-4084-8169-7cc7e32996e7.jpg +3 -0
  28. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2b3c3f24-47d5-4a82-ace9-eb1613a01988.jpg +3 -0
  29. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2b6213ca-731e-4593-a3a9-74071b9d8489.jpg +3 -0
  30. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2cf13e6c-9fc0-4e5f-99d5-b45c7f4f81d5.jpg +3 -0
  31. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2d2a5a92-6e52-4583-9c94-cc0248eae8a0.jpg +3 -0
  32. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2dc0b1fd-3d9b-47e4-8ba0-ba6c07fe3b0b.jpg +3 -0
  33. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2ef3e571-0064-408e-94d8-f4812c4ccd1c.jpg +3 -0
  34. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_30cca560-b470-4c7a-97b7-ffeea1e1a9f2.jpg +3 -0
  35. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_35cbedc9-e9d9-4aa1-9854-a37c4f5059d2.jpg +3 -0
  36. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_3b08aea7-e16b-435c-a371-3014eaf3cce2.jpg +3 -0
  37. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_3e1a129e-c8ca-4d0f-a13e-23b9f6cf7a83.jpg +3 -0
  38. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_40a591be-7cfa-49ae-a0cf-d98db2024a20.jpg +3 -0
  39. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_4104b4e8-ad56-4155-a659-a157f9feb86f.jpg +3 -0
  40. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_411335c9-2e46-4a31-814a-4034d145424c.jpg +3 -0
  41. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_44ae542c-497b-4d98-9884-f54a9a6f6488.jpg +3 -0
  42. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_44bcd3cf-989e-477d-b127-9d4a417d190e.jpg +3 -0
  43. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_4d7d37d6-b68e-47da-a2a9-8b41f55d9c06.jpg +3 -0
  44. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_4e337333-1e63-4c4d-9b4a-959e6a4646c2.jpg +3 -0
  45. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_4fa380b4-88bc-48d1-a7ad-878843694a79.jpg +3 -0
  46. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_50344dd3-f5eb-4dc4-bdd1-191296b4855e.jpg +3 -0
  47. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_509db6ca-4b04-4d4c-b6e3-ee6b7a1545b1.jpg +3 -0
  48. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_519f343c-b503-49d7-8c01-579684ad01cd.jpg +3 -0
  49. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_528976d4-813b-4f9f-9010-6718471cb6ce.jpg +3 -0
  50. image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_55a8909e-9f90-4d1b-abc5-88a927b30f3d.jpg +3 -0
README.md ADDED
@@ -0,0 +1,2 @@
# Haptix_image_dataset
A dataset that links physical information with sensory information and labels the pairings.
README_IMAGE_ANALYSIS.md ADDED
@@ -0,0 +1,180 @@
# Image Dataset Status Analysis System

## Overview
This notebook (`iamge_status.ipynb`) is a comprehensive management tool that automatically analyzes and visualizes the label distribution of the Haptix image dataset.

## Key Features

### 1. **Automatic label recognition**
- Labels are extracted automatically by splitting folder names on `_` (underscore); see the sketch below
- Newly added folders are recognized automatically
- Folder name format: `LABEL1_LABEL2_LABEL3...`
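
For illustration, a minimal sketch of this folder-name rule. This is not the notebook's actual implementation; the helper name and extension set are assumptions, and the root path is the one used in the accompanying notebooks.

```python
from pathlib import Path
from collections import Counter

IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".bmp", ".webp"}

def count_images_per_label(image_root: str) -> Counter:
    """Split each label folder name on '_' and add its image count to every label."""
    counts = Counter()
    for label_dir in Path(image_root).glob("*/*"):  # e.g. EmoSet_images/P1Hard_P4Dynamic_E1N-Risky
        if not label_dir.is_dir():
            continue
        labels = label_dir.name.split("_")           # "P1Hard_P4Dynamic_E1N-Risky" -> 3 labels
        n_images = sum(1 for f in label_dir.rglob("*")
                       if f.suffix.lower() in IMAGE_EXTS)  # recurse into sub-folders, skip non-images
        for label in labels:
            counts[label] += n_images
    return counts

# counts = count_images_per_label(r"C:\Users\EL081\Desktop\local_backup\image")
```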

### 2. **Multiple source support**
- EmoSet_images
- Midjourney_images
- unsplash_images
- New source folders are processed automatically when added

### 3. **Generated files**

#### 📊 Visualization files
- **`image_distribution_overview.png`**: combined analysis of the overall label distribution
  - Image count per label (bar chart)
  - Image count comparison per data source
  - Label distribution ratio (pie chart)
  - Summary statistics

- **`image_distribution_by_source.png`**: per-source breakdown
  - EmoSet_images: distribution by label
  - Midjourney_images: distribution by label
  - unsplash_images: distribution by label

#### 📋 Data files
- **`image_label_distribution.csv`**: image counts per label (opens in Excel/Google Sheets)
  - Full label list
  - Image count per source
  - Suitable for further analysis in Excel and similar tools

## Analysis Summary

### 📈 Current status (2025-11-14)

| Item | Value |
|------|-------|
| **Distinct labels** | 22 |
| **Total images** | 14,070 |
| **Average per label** | 636.4 |
| **Maximum per label** | 1,575 (P5Static) |
| **Minimum per label** | 2 (E5St, messy) |
| **Standard deviation** | 470.0 |

### ⚠️ Warning: imbalanced labels

Labels more than **50% below the average**:
- `P1Soft`: 293 images (46.0% of the average)
- `P1Hard`: 243 images (38.2% of the average)
- `energy`: 74 images (11.6% of the average)
- `pleasure`: 55 images (8.6% of the average)
- `unpleasnat`: 14 images (2.2% of the average)
- `E5St`: 2 images (0.3% of the average)
- `messy`: 2 images (0.3% of the average)

### 💡 Recommendations

**Balance ratio: 787.5x (maximum / minimum)**
- ⚠️ **A severe class imbalance exists.**

#### Possible remedies:
1. **Data augmentation**
   - Apply rotation, scaling, color transforms, etc. to under-represented labels (see the sketch after this list)
   - Prioritize the `E5St`, `messy`, `unpleasnat`, `pleasure`, `energy` labels

2. **Class weighting during model training**
   ```python
   class_weights = {
       'P5Static': 1.0,     # sufficient data
       'E3N-Chaotic': 1.0,
       # ...
       'E5St': 318.2,       # severely under-represented (636.4 / 2)
       'messy': 318.2,
   }
   ```

3. **Sampling strategies**
   - Undersampling: use only part of the data from over-represented labels
   - Oversampling: repeat data from under-represented labels
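
A minimal augmentation sketch for the under-represented labels, assuming Pillow is available. The variant choices, output folder, and usage paths are illustrative, not the project's actual pipeline.

```python
from pathlib import Path
from PIL import Image, ImageEnhance, ImageOps

SCARCE_LABELS = {"E5St", "messy", "unpleasnat", "pleasure", "energy"}  # labels flagged above

def augment_folder(src_dir: Path, dst_dir: Path) -> None:
    """Write a few simple variants (rotation, mirror, brightness) for every JPEG in src_dir."""
    dst_dir.mkdir(parents=True, exist_ok=True)
    for img_path in src_dir.glob("*.jpg"):
        img = Image.open(img_path).convert("RGB")
        variants = {
            "rot15": img.rotate(15, expand=True),                  # small rotation
            "mirror": ImageOps.mirror(img),                        # horizontal flip
            "bright": ImageEnhance.Brightness(img).enhance(1.2),   # brighten by 20%
        }
        for tag, v in variants.items():
            v.save(dst_dir / f"{img_path.stem}_{tag}.jpg")

# Hypothetical usage: augment every label folder that contains a scarce label.
# root = Path(r"C:\Users\EL081\Desktop\local_backup\image\EmoSet_images")
# for folder in root.iterdir():
#     if folder.is_dir() and SCARCE_LABELS & set(folder.name.split("_")):
#         augment_folder(folder, root.parent / "augmented_images" / folder.name)
```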

## Usage

### 1. Run the notebook
```bash
# In VS Code, select a Jupyter Notebook kernel, then
# run all cells in order
Ctrl + Shift + Enter   (current cell)
or
Cell > Run All
```

### 2. Add a new image source
```bash
# Add a new folder under the image directory (e.g. new_source/)
c:\Users\EL081\Desktop\backup\Haptix_image_dataset\image\new_source\
├── LABEL1_LABEL2\
│   ├── image1.jpg
│   └── image2.png
└── LABEL3_LABEL4\
    └── image3.jpg

# Re-running the notebook picks the new source up automatically
```

### 3. Interpreting the results (a plotting sketch follows this list)
- **Red bars**: labels below the average (need attention)
- **Blue bars**: labels at or above the average (sufficient)
- **Green dashed line**: the average
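
A minimal matplotlib sketch of this color convention, assuming a dict of per-label counts. The example values come from the table above; this is not the notebook's exact plotting code.

```python
import matplotlib.pyplot as plt

counts = {"P5Static": 1575, "P1Soft": 293, "E5St": 2}  # sample values from the status table
mean = sum(counts.values()) / len(counts)

labels = list(counts)
values = [counts[k] for k in labels]
colors = ["tab:red" if v < mean else "tab:blue" for v in values]  # red below average, blue otherwise

plt.bar(labels, values, color=colors)
plt.axhline(mean, color="green", linestyle="--", label=f"average = {mean:.1f}")
plt.ylabel("image count")
plt.legend()
plt.show()
```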

## Technical Notes

### Supported image formats
- JPG, JPEG, PNG, GIF, BMP, WEBP

### Automatic folder-structure handling
- Sub-folders inside each label folder are searched recursively
- File names are ignored (labels are extracted from folder names only)

### Extensibility
- New sources are recognized automatically
- New labels are handled automatically
- No code changes required

## Caveats

1. **Folder naming rule**
   - Folders that contain images must separate their labels with `_`
   - Example: ✅ `P1Hard_E1N-Risky` / ❌ `P1Hard E1N-Risky`

2. **Image counts**
   - Every file inside a folder is scanned
   - Non-image files (text, documents) are excluded from the count

3. **Performance**
   - Analyzing several thousand images takes roughly 15-20 seconds
   - The initial load is the slowest step

## Troubleshooting

### Q: I added new data but it does not show up
A: Re-run **all cells** of the notebook (`Ctrl + Shift + Enter`).

### Q: Korean text in the CSV is garbled in Excel
A: Open the CSV in Notepad, save it with `UTF-8 BOM` encoding, then reopen it in Excel (or write the BOM when exporting, as in the sketch below).
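
To avoid the garbled-text issue at the source, the CSV can be written with a UTF-8 BOM when the notebook exports it. A small sketch using pandas' `utf-8-sig` encoding, which emits the BOM; the DataFrame contents here are illustrative.

```python
import pandas as pd

df = pd.DataFrame({"label": ["P5Static", "E5St"], "count": [1575, 2]})
# 'utf-8-sig' prepends a byte-order mark so Excel detects UTF-8 and Korean text renders correctly
df.to_csv("image_label_distribution.csv", index=False, encoding="utf-8-sig")
```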

### Q: I only want to analyze a specific label
A: Edit `base_path` in the first cell of the notebook.

## File List

```
backup/
├── iamge_status.ipynb                  ← main analysis notebook
├── image_distribution_overview.png     ← overall analysis chart
├── image_distribution_by_source.png    ← per-source analysis chart
├── image_label_distribution.csv        ← per-label statistics
├── README_IMAGE_ANALYSIS.md            ← this file
└── Haptix_image_dataset/
    ├── image/
    │   ├── EmoSet_images/ (3,143 images)
    │   ├── Midjourney_images/ (17 images)
    │   └── unsplash_images/ (1,910 images)
    └── ...
```

## License and Support

Created: 2025-11-14
Last updated: 2025-11-14

---

**For questions or improvements, edit the notebook code or add new analysis cells.**
aaa.ipynb ADDED
@@ -0,0 +1,608 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "markdown",
5
+ "id": "ac1c7423",
6
+ "metadata": {},
7
+ "source": [
8
+ "# Upload Image Dataset to Hugging Face (Preserve Folders)\n",
9
+ "\n",
10
+ "This minimal workflow uploads your local folder to a Hugging Face dataset repo, keeping the directory structure intact."
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "code",
15
+ "execution_count": 12,
16
+ "id": "277a6040",
17
+ "metadata": {},
18
+ "outputs": [
19
+ {
20
+ "name": "stdout",
21
+ "output_type": "stream",
22
+ "text": [
23
+ "Ready: huggingface_hub installed and imports ok.\n"
24
+ ]
25
+ }
26
+ ],
27
+ "source": [
28
+ "# Install and import\n",
29
+ "import sys, subprocess, os\n",
30
+ "subprocess.check_call([sys.executable, \"-m\", \"pip\", \"install\", \"-q\", \"huggingface_hub\", \"hf_transfer\"])\n",
31
+ "from huggingface_hub import HfApi\n",
32
+ "os.environ[\"HF_HUB_ENABLE_HF_TRANSFER\"] = \"1\"\n",
33
+ "print(\"Ready: huggingface_hub installed and imports ok.\")"
34
+ ]
35
+ },
36
+ {
37
+ "cell_type": "code",
38
+ "execution_count": 13,
39
+ "id": "32833fb5",
40
+ "metadata": {},
41
+ "outputs": [
42
+ {
43
+ "name": "stdout",
44
+ "output_type": "stream",
45
+ "text": [
46
+ "Token OK for: Smilesjs\n"
47
+ ]
48
+ }
49
+ ],
50
+ "source": [
51
+ "# Set or prompt for HF token (safe)\n",
52
+ "import os\n",
53
+ "from getpass import getpass\n",
54
+ "from huggingface_hub import HfApi\n",
55
+ "\n",
56
+ "if not os.getenv(\"HF_TOKEN\"):\n",
57
+ " token = getpass(\"Enter your HF_TOKEN (input hidden): \")\n",
58
+ " os.environ[\"HF_TOKEN\"] = token\n",
59
+ "\n",
60
+ "# Optional: validate token early\n",
61
+ "try:\n",
62
+ " _api = HfApi(token=os.environ[\"HF_TOKEN\"])\n",
63
+ " who = _api.whoami()\n",
64
+ " owner = who.get(\"name\") or who.get(\"email\") or who.get(\"id\")\n",
65
+ " print(f\"Token OK for: {owner}\")\n",
66
+ "except Exception as e:\n",
67
+ " raise RuntimeError(\"HF token seems invalid or network issue. Double-check the token.\") from e"
68
+ ]
69
+ },
70
+ {
71
+ "cell_type": "code",
72
+ "execution_count": 20,
73
+ "id": "fe562d45",
74
+ "metadata": {},
75
+ "outputs": [
76
+ {
77
+ "name": "stdout",
78
+ "output_type": "stream",
79
+ "text": [
80
+ "Repo ready: https://huggingface.co/datasets/Smilesjs/Haptix_dataset\n"
81
+ ]
82
+ }
83
+ ],
84
+ "source": [
85
+ "# Configure local path and repo\n",
86
+ "from pathlib import Path\n",
87
+ "import os\n",
88
+ "from huggingface_hub import HfApi\n",
89
+ "\n",
90
+ "LOCAL_FOLDER = Path(r\"c:\\\\Users\\\\EL081\\\\Desktop\\\\local_backup\\\\image\").resolve()\n",
91
+ "REPO_ID = \"Smilesjs/Haptix_dataset\" # change if needed\n",
92
+ "REPO_TYPE = \"dataset\"\n",
93
+ "PRIVATE = True\n",
94
+ "\n",
95
+ "HF_TOKEN = os.getenv(\"HF_TOKEN\")\n",
96
+ "if not HF_TOKEN:\n",
97
+ " raise ValueError(\"HF_TOKEN is not set. In PowerShell: $env:HF_TOKEN='hf_...'\")\n",
98
+ "\n",
99
+ "if not LOCAL_FOLDER.exists():\n",
100
+ " raise FileNotFoundError(f\"Local folder not found: {LOCAL_FOLDER}\")\n",
101
+ "\n",
102
+ "api = HfApi(token=HF_TOKEN)\n",
103
+ "api.create_repo(repo_id=REPO_ID, repo_type=REPO_TYPE, private=PRIVATE, exist_ok=True)\n",
104
+ "print(f\"Repo ready: https://huggingface.co/datasets/{REPO_ID}\")"
105
+ ]
106
+ },
107
+ {
108
+ "cell_type": "code",
109
+ "execution_count": 34,
110
+ "id": "9da1f410",
111
+ "metadata": {},
112
+ "outputs": [
113
+ {
114
+ "name": "stdout",
115
+ "output_type": "stream",
116
+ "text": [
117
+ "Starting resumable upload to Smilesjs/Haptix_dataset with upload_large_folder... (this can take time)\n",
118
+ "Local: C:\\Users\\EL081\\Desktop\\local_backup\\image\n"
119
+ ]
120
+ },
121
+ {
122
+ "data": {
123
+ "application/vnd.jupyter.widget-view+json": {
124
+ "model_id": "2d2dc4acd4ed4ffb8d0ad51d3efda3f9",
125
+ "version_major": 2,
126
+ "version_minor": 0
127
+ },
128
+ "text/plain": [
129
+ "Recovering from metadata files: 0%| | 0/6920 [00:00<?, ?it/s]"
130
+ ]
131
+ },
132
+ "metadata": {},
133
+ "output_type": "display_data"
134
+ },
135
+ {
136
+ "name": "stdout",
137
+ "output_type": "stream",
138
+ "text": [
139
+ "\n",
140
+ "\n",
141
+ "\n",
142
+ "---------- 2025-11-14 19:10:53 (0:00:00) ----------\n",
143
+ "Files: hashed 6920/6920 (1.2G/1.2G) | pre-uploaded: 6918/6918 (1.2G/1.2G) | committed: 6920/6920 (1.2G/1.2G) | ignored: 0\n",
144
+ "Workers: hashing: 0 | get upload mode: 0 | pre-uploading: 0 | committing: 0 | waiting: 0\n",
145
+ "---------------------------------------------------\n",
146
+ "Upload complete: None\n",
147
+ "Upload complete: None\n"
148
+ ]
149
+ }
150
+ ],
151
+ "source": [
152
+ "# Upload large folder preserving structure (resumable)\n",
153
+ "ignore_patterns = [\n",
154
+ " \"**/.ipynb_checkpoints/**\",\n",
155
+ " \"**/__pycache__/**\",\n",
156
+ " \"**/*.tmp\",\n",
157
+ " \"**/*.db\",\n",
158
+ " \"**/Thumbs.db\",\n",
159
+ " \"**/.DS_Store\",\n",
160
+ "]\n",
161
+ "\n",
162
+ "print(f\"Starting resumable upload to {REPO_ID} with upload_large_folder... (this can take time)\")\n",
163
+ "print(f\"Local: {LOCAL_FOLDER}\")\n",
164
+ "op = api.upload_large_folder(\n",
165
+ " repo_id=REPO_ID,\n",
166
+ " repo_type=REPO_TYPE,\n",
167
+ " folder_path=str(LOCAL_FOLDER),\n",
168
+ " # path_in_repo can be set to 'image' if you want a top-level 'image/' folder\n",
169
+ " # path_in_repo=\"\",\n",
170
+ " ignore_patterns=ignore_patterns,\n",
171
+ " # allow_duplicates=False, # uncomment to avoid duplicate files in future runs\n",
172
+ " # commit_message=\"Upload dataset (resumable)\"\n",
173
+ ")\n",
174
+ "print(\"Upload complete:\", op)"
175
+ ]
176
+ },
177
+ {
178
+ "cell_type": "code",
179
+ "execution_count": 35,
180
+ "id": "21b788d2",
181
+ "metadata": {},
182
+ "outputs": [
183
+ {
184
+ "name": "stdout",
185
+ "output_type": "stream",
186
+ "text": [
187
+ "Scanning repo: https://huggingface.co/datasets/Smilesjs/Haptix_dataset\n",
188
+ "Cache invalidated for HfFileSystem.\n",
189
+ "Cache invalidated for HfFileSystem.\n",
190
+ "Found 3 candidate paths in 0.64s\n",
191
+ "API list_repo_files reports: 3 files\n",
192
+ "Fetched metadata for 3 files (0 errors) in 0.00s\n",
193
+ "\n",
194
+ "=== Summary ===\n",
195
+ "Total files: 3\n",
196
+ "Total size : 63.74 KB\n",
197
+ "\n",
198
+ "Top extensions by size:\n",
199
+ " .png: 1 files, 61.33 KB\n",
200
+ " (no ext): 1 files, 2.40 KB\n",
201
+ " .txt: 1 files, 5.00 B\n",
202
+ "\n",
203
+ "Top-level folder counts:\n",
204
+ " datasets: 3 files\n",
205
+ "\n",
206
+ "Top 20 largest files:\n",
207
+ " 61.33 KB | datasets/Smilesjs/Haptix_dataset/augmented_images/E3P-Harmonic_P3Rough_P2Cold_P5Static/E3P+P3R+P2Co+P5St_aug_001.png\n",
208
+ " 2.40 KB | datasets/Smilesjs/Haptix_dataset/.gitattributes\n",
209
+ " 5.00 B | datasets/Smilesjs/Haptix_dataset/test_probe.txt\n",
210
+ "Found 3 candidate paths in 0.64s\n",
211
+ "API list_repo_files reports: 3 files\n",
212
+ "Fetched metadata for 3 files (0 errors) in 0.00s\n",
213
+ "\n",
214
+ "=== Summary ===\n",
215
+ "Total files: 3\n",
216
+ "Total size : 63.74 KB\n",
217
+ "\n",
218
+ "Top extensions by size:\n",
219
+ " .png: 1 files, 61.33 KB\n",
220
+ " (no ext): 1 files, 2.40 KB\n",
221
+ " .txt: 1 files, 5.00 B\n",
222
+ "\n",
223
+ "Top-level folder counts:\n",
224
+ " datasets: 3 files\n",
225
+ "\n",
226
+ "Top 20 largest files:\n",
227
+ " 61.33 KB | datasets/Smilesjs/Haptix_dataset/augmented_images/E3P-Harmonic_P3Rough_P2Cold_P5Static/E3P+P3R+P2Co+P5St_aug_001.png\n",
228
+ " 2.40 KB | datasets/Smilesjs/Haptix_dataset/.gitattributes\n",
229
+ " 5.00 B | datasets/Smilesjs/Haptix_dataset/test_probe.txt\n"
230
+ ]
231
+ }
232
+ ],
233
+ "source": [
234
+ "# Analyze remote repo file status (counts, sizes, top files)\n",
235
+ "from huggingface_hub import HfFileSystem\n",
236
+ "from concurrent.futures import ThreadPoolExecutor, as_completed\n",
237
+ "from collections import Counter, defaultdict\n",
238
+ "from pathlib import Path\n",
239
+ "import os, math, time\n",
240
+ "\n",
241
+ "if 'REPO_ID' not in globals():\n",
242
+ " raise RuntimeError(\"REPO_ID is not defined. Run the setup cells first.\")\n",
243
+ "if 'HF_TOKEN' not in globals() or not HF_TOKEN:\n",
244
+ " raise RuntimeError(\"HF_TOKEN is not available. Run the token cell first.\")\n",
245
+ "\n",
246
+ "def human_bytes(n: int) -> str:\n",
247
+ " if n is None:\n",
248
+ " return \"?\"\n",
249
+ " units = ['B','KB','MB','GB','TB']\n",
250
+ " i = 0\n",
251
+ " f = float(n)\n",
252
+ " while f >= 1024 and i < len(units)-1:\n",
253
+ " f /= 1024.0\n",
254
+ " i += 1\n",
255
+ " return f\"{f:.2f} {units[i]}\"\n",
256
+ "\n",
257
+ "fs = HfFileSystem(token=HF_TOKEN)\n",
258
+ "repo_url = f\"https://huggingface.co/datasets/{REPO_ID}\"\n",
259
+ "root = f\"hf://datasets/{REPO_ID}\"\n",
260
+ "print(f\"Scanning repo: {repo_url}\")\n",
261
+ "\n",
262
+ "# Try to invalidate any local cache to avoid stale listings\n",
263
+ "if hasattr(fs, \"invalidate_cache\"):\n",
264
+ " try:\n",
265
+ " fs.invalidate_cache()\n",
266
+ " # Also try root-specific invalidation if supported\n",
267
+ " try:\n",
268
+ " fs.invalidate_cache(root)\n",
269
+ " except Exception:\n",
270
+ " pass\n",
271
+ " print(\"Cache invalidated for HfFileSystem.\")\n",
272
+ " except Exception as _e:\n",
273
+ " print(\"Cache invalidation skipped:\", type(_e).__name__)\n",
274
+ "\n",
275
+ "# List all files recursively\n",
276
+ "t0 = time.time()\n",
277
+ "paths = fs.find(root) # returns file paths recursively\n",
278
+ "if not isinstance(paths, list):\n",
279
+ " paths = list(paths)\n",
280
+ "elapsed = time.time() - t0\n",
281
+ "print(f\"Found {len(paths):,} candidate paths in {elapsed:.2f}s\")\n",
282
+ "\n",
283
+ "# Fallback cross-check with API listing to detect propagation delays\n",
284
+ "try:\n",
285
+ " api_files = api.list_repo_files(repo_id=REPO_ID, repo_type=REPO_TYPE)\n",
286
+ " print(f\"API list_repo_files reports: {len(api_files):,} files\")\n",
287
+ " if len(paths) < len(api_files):\n",
288
+ " print(\"Note: HfFileSystem may be stale; UI/CDN can lag for ~1-2 minutes.\")\n",
289
+ "except Exception as _e:\n",
290
+ " print(\"API list_repo_files failed:\", type(_e).__name__)\n",
291
+ "\n",
292
+ "# Fetch sizes concurrently\n",
293
+ "def _info(pth: str):\n",
294
+ " try:\n",
295
+ " info = fs.info(pth)\n",
296
+ " if info.get('type') == 'file':\n",
297
+ " return pth, int(info.get('size') or 0)\n",
298
+ " except Exception:\n",
299
+ " return pth, None\n",
300
+ " return pth, None\n",
301
+ "\n",
302
+ "sizes = {}\n",
303
+ "errors = 0\n",
304
+ "t0 = time.time()\n",
305
+ "with ThreadPoolExecutor(max_workers=24) as ex:\n",
306
+ " futs = [ex.submit(_info, p) for p in paths]\n",
307
+ " for fut in as_completed(futs):\n",
308
+ " p, sz = fut.result()\n",
309
+ " if sz is None:\n",
310
+ " errors += 1\n",
311
+ " else:\n",
312
+ " sizes[p] = sz\n",
313
+ "elapsed = time.time() - t0\n",
314
+ "print(f\"Fetched metadata for {len(sizes):,} files ({errors} errors) in {elapsed:.2f}s\")\n",
315
+ "\n",
316
+ "# Summaries\n",
317
+ "total_files = len(sizes)\n",
318
+ "total_bytes = sum(sizes.values())\n",
319
+ "by_ext_count = Counter()\n",
320
+ "by_ext_bytes = defaultdict(int)\n",
321
+ "for p, sz in sizes.items():\n",
322
+ " ext = os.path.splitext(p)[1].lower() or \"\"\n",
323
+ " by_ext_count[ext] += 1\n",
324
+ " by_ext_bytes[ext] += sz\n",
325
+ "\n",
326
+ "print(\"\\n=== Summary ===\")\n",
327
+ "print(f\"Total files: {total_files:,}\")\n",
328
+ "print(f\"Total size : {human_bytes(total_bytes)}\")\n",
329
+ "\n",
330
+ "# Top extensions by bytes\n",
331
+ "top_ext = sorted(by_ext_bytes.items(), key=lambda x: x[1], reverse=True)[:10]\n",
332
+ "print(\"\\nTop extensions by size:\")\n",
333
+ "for ext, b in top_ext:\n",
334
+ " print(f\" {ext or '(no ext)'}: {by_ext_count[ext]:,} files, {human_bytes(b)}\")\n",
335
+ "\n",
336
+ "# Top-level folder distribution\n",
337
+ "top_level = Counter()\n",
338
+ "for p in sizes.keys():\n",
339
+ " rel = p.replace(root.rstrip('/'), '').lstrip('/')\n",
340
+ " parts = rel.split('/')\n",
341
+ " top = parts[0] if parts and parts[0] else '/'\n",
342
+ " top_level[top] += 1\n",
343
+ "print(\"\\nTop-level folder counts:\")\n",
344
+ "for name, cnt in top_level.most_common(15):\n",
345
+ " print(f\" {name}: {cnt:,} files\")\n",
346
+ "\n",
347
+ "# Largest files\n",
348
+ "top_k = 20\n",
349
+ "largest = sorted(sizes.items(), key=lambda x: x[1], reverse=True)[:top_k]\n",
350
+ "print(f\"\\nTop {top_k} largest files:\")\n",
351
+ "for p, sz in largest:\n",
352
+ " rel = p.replace(root.rstrip('/'), '').lstrip('/')\n",
353
+ " print(f\" {human_bytes(sz)} | {rel}\")"
354
+ ]
355
+ },
356
+ {
357
+ "cell_type": "code",
358
+ "execution_count": 30,
359
+ "id": "beb35ad2",
360
+ "metadata": {},
361
+ "outputs": [
362
+ {
363
+ "name": "stdout",
364
+ "output_type": "stream",
365
+ "text": [
366
+ "Repo: Smilesjs/Haptix_dataset type: dataset\n",
367
+ "list_repo_files count: 3\n",
368
+ "First 10: ['.gitattributes', 'augmented_images/E3P-Harmonic_P3Rough_P2Cold_P5Static/E3P+P3R+P2Co+P5St_aug_001.png', 'test_probe.txt']\n"
369
+ ]
370
+ },
371
+ {
372
+ "name": "stderr",
373
+ "output_type": "stream",
374
+ "text": [
375
+ "No files have been modified since last commit. Skipping to prevent empty commit.\n",
376
+ "WARNING:huggingface_hub.hf_api:No files have been modified since last commit. Skipping to prevent empty commit.\n"
377
+ ]
378
+ },
379
+ {
380
+ "name": "stdout",
381
+ "output_type": "stream",
382
+ "text": [
383
+ "Uploaded marker: test_probe.txt\n",
384
+ "list_repo_files after: 3\n",
385
+ "Contains marker? True\n",
386
+ "\n",
387
+ "Recent commits:\n",
388
+ "- 8e57433 sample image upload\n",
389
+ "- 6a8d3d3 probe upload\n",
390
+ "- 16f395e initial commit\n"
391
+ ]
392
+ }
393
+ ],
394
+ "source": [
395
+ "# Probe: list files and upload a tiny marker file to confirm branch\n",
396
+ "from io import BytesIO\n",
397
+ "print(\"Repo:\", REPO_ID, \"type:\", REPO_TYPE)\n",
398
+ "files = api.list_repo_files(repo_id=REPO_ID, repo_type=REPO_TYPE)\n",
399
+ "print(\"list_repo_files count:\", len(files))\n",
400
+ "print(\"First 10:\", files[:10])\n",
401
+ "\n",
402
+ "marker_path = \"test_probe.txt\"\n",
403
+ "api.upload_file(path_or_fileobj=BytesIO(b\"probe\"), path_in_repo=marker_path, repo_id=REPO_ID, repo_type=REPO_TYPE, commit_message=\"probe upload\")\n",
404
+ "print(\"Uploaded marker:\", marker_path)\n",
405
+ "files2 = api.list_repo_files(repo_id=REPO_ID, repo_type=REPO_TYPE)\n",
406
+ "print(\"list_repo_files after:\", len(files2))\n",
407
+ "print(\"Contains marker?\", marker_path in files2)\n",
408
+ "\n",
409
+ "# Show latest commits\n",
410
+ "commits = api.list_repo_commits(repo_id=REPO_ID, repo_type=REPO_TYPE)\n",
411
+ "print(\"\\nRecent commits:\")\n",
412
+ "for c in commits[:5]:\n",
413
+ " print(\"-\", c.commit_id[:7], c.title)"
414
+ ]
415
+ },
416
+ {
417
+ "cell_type": "code",
418
+ "execution_count": 28,
419
+ "id": "dbc2c9de",
420
+ "metadata": {},
421
+ "outputs": [
422
+ {
423
+ "name": "stdout",
424
+ "output_type": "stream",
425
+ "text": [
426
+ "Uploading sample: C:\\Users\\EL081\\Desktop\\local_backup\\image\\augmented_images\\E3P-Harmonic_P3Rough_P2Cold_P5Static\\E3P+P3R+P2Co+P5St_aug_001.png\n"
427
+ ]
428
+ },
429
+ {
430
+ "data": {
431
+ "application/vnd.jupyter.widget-view+json": {
432
+ "model_id": "8b0ce6dca03e4dcfbd235e3e9c1573e7",
433
+ "version_major": 2,
434
+ "version_minor": 0
435
+ },
436
+ "text/plain": [
437
+ "Processing Files (0 / 0): | | 0.00B / 0.00B "
438
+ ]
439
+ },
440
+ "metadata": {},
441
+ "output_type": "display_data"
442
+ },
443
+ {
444
+ "data": {
445
+ "application/vnd.jupyter.widget-view+json": {
446
+ "model_id": "fb3e8bf2c1b5438aaf6334a4333e0c8a",
447
+ "version_major": 2,
448
+ "version_minor": 0
449
+ },
450
+ "text/plain": [
451
+ "New Data Upload: | | 0.00B / 0.00B "
452
+ ]
453
+ },
454
+ "metadata": {},
455
+ "output_type": "display_data"
456
+ },
457
+ {
458
+ "name": "stdout",
459
+ "output_type": "stream",
460
+ "text": [
461
+ "Uploaded to: augmented_images/E3P-Harmonic_P3Rough_P2Cold_P5Static/E3P+P3R+P2Co+P5St_aug_001.png\n"
462
+ ]
463
+ }
464
+ ],
465
+ "source": [
466
+ "# Probe: upload one image file to confirm dataset accepts content\n",
467
+ "import os\n",
468
+ "from pathlib import Path\n",
469
+ "from itertools import chain\n",
470
+ "exts = {\".jpg\",\".jpeg\",\".png\"}\n",
471
+ "sample_path = None\n",
472
+ "for root_dir, dirs, files in os.walk(LOCAL_FOLDER):\n",
473
+ " for fn in files:\n",
474
+ " if Path(fn).suffix.lower() in exts:\n",
475
+ " sample_path = Path(root_dir) / fn\n",
476
+ " break\n",
477
+ " if sample_path:\n",
478
+ " break\n",
479
+ "if not sample_path:\n",
480
+ " raise RuntimeError(\"No image file found under LOCAL_FOLDER.\")\n",
481
+ "rel = sample_path.relative_to(LOCAL_FOLDER).as_posix()\n",
482
+ "dest = rel # or f\"image/{rel}\" to nest under image/\n",
483
+ "print(\"Uploading sample:\", sample_path)\n",
484
+ "api.upload_file(path_or_fileobj=str(sample_path), path_in_repo=dest, repo_id=REPO_ID, repo_type=REPO_TYPE, commit_message=\"sample image upload\")\n",
485
+ "print(\"Uploaded to:\", dest)"
486
+ ]
487
+ },
488
+ {
489
+ "cell_type": "code",
490
+ "execution_count": 31,
491
+ "id": "5a5057d4",
492
+ "metadata": {},
493
+ "outputs": [
494
+ {
495
+ "name": "stdout",
496
+ "output_type": "stream",
497
+ "text": [
498
+ "No .hf_transfer metadata directories found under LOCAL_FOLDER.\n"
499
+ ]
500
+ }
501
+ ],
502
+ "source": [
503
+ "# Optional: clear hf_transfer cached metadata under LOCAL_FOLDER to avoid stale plans\n",
504
+ "import os, shutil\n",
505
+ "found = []\n",
506
+ "for root_dir, dirs, files in os.walk(LOCAL_FOLDER):\n",
507
+ " if \".hf_transfer\" in dirs:\n",
508
+ " found.append(os.path.join(root_dir, \".hf_transfer\"))\n",
509
+ "if found:\n",
510
+ " print(\"Found hf_transfer metadata dirs:\")\n",
511
+ " for d in found:\n",
512
+ " print(\" -\", d)\n",
513
+ " # Uncomment next lines to clear the cached metadata that may point to another repo\n",
514
+ " # for d in found:\n",
515
+ " # shutil.rmtree(d, ignore_errors=True)\n",
516
+ " # print(\"Cleared hf_transfer metadata.\")\n",
517
+ "else:\n",
518
+ " print(\"No .hf_transfer metadata directories found under LOCAL_FOLDER.\")"
519
+ ]
520
+ },
521
+ {
522
+ "cell_type": "code",
523
+ "execution_count": 32,
524
+ "id": "1668f9a0",
525
+ "metadata": {},
526
+ "outputs": [
527
+ {
528
+ "name": "stdout",
529
+ "output_type": "stream",
530
+ "text": [
531
+ "upload_large_folder signature: (repo_id: 'str', folder_path: 'Union[str, Path]', *, repo_type: 'str', revision: 'Optional[str]' = None, private: 'Optional[bool]' = None, allow_patterns: 'Optional[Union[list[str], str]]' = None, ignore_patterns: 'Optional[Union[list[str], str]]' = None, num_workers: 'Optional[int]' = None, print_report: 'bool' = True, print_report_every: 'int' = 60) -> 'None'\n",
532
+ "\n",
533
+ "Doc (first 40 lines):\n",
534
+ " Upload a large folder to the Hub in the most resilient way possible.\n",
535
+ "\n",
536
+ "Several workers are started to upload files in an optimized way. Before being committed to a repo, files must be\n",
537
+ "hashed and be pre-uploaded if they are LFS files. Workers will perform these tasks for each file in the folder.\n",
538
+ "At each step, some metadata information about the upload process is saved in the folder under `.cache/.huggingface/`\n",
539
+ "to be able to resume the process if interrupted. The whole process might result in several commits.\n",
540
+ "\n",
541
+ "Args:\n",
542
+ " repo_id (`str`):\n",
543
+ " The repository to which the file will be uploaded.\n",
544
+ " E.g. `\"HuggingFaceTB/smollm-corpus\"`.\n",
545
+ " folder_path (`str` or `Path`):\n",
546
+ " Path to the folder to upload on the local file system.\n",
547
+ " repo_type (`str`):\n",
548
+ " Type of the repository. Must be one of `\"model\"`, `\"dataset\"` or `\"space\"`.\n",
549
+ " Unlike in all other `HfApi` methods, `repo_type` is explicitly required here. This is to avoid\n",
550
+ " any mistake when uploading a large folder to the Hub, and therefore prevent from having to re-upload\n",
551
+ " everything.\n",
552
+ " revision (`str`, `optional`):\n",
553
+ " The branch to commit to. If not provided, the `main` branch will be used.\n",
554
+ " private (`bool`, `optional`):\n",
555
+ " Whether the repository should be private.\n",
556
+ " If `None` (default), the repo will be public unless the organization's default is private.\n",
557
+ " allow_patterns (`list[str]` or `str`, *optional*):\n",
558
+ " If provided, only files matching at least one pattern are uploaded.\n",
559
+ " ignore_patterns (`list[str]` or `str`, *optional*):\n",
560
+ " If provided, files matching any of the patterns are not uploaded.\n",
561
+ " num_workers (`int`, *optional*):\n",
562
+ " Number of workers to start. Defaults to `os.cpu_count() - 2` (minimum 2).\n",
563
+ " A higher number of workers may speed up the process if your machine allows it. However, on machines with a\n",
564
+ " slower connection, it is recommended to keep the number of workers low to ensure better resumability.\n",
565
+ " Indeed, partially uploaded files will have to be completely re-uploaded if the process is interrupted.\n",
566
+ " print_report (`bool`, *optional*):\n",
567
+ " Whether to print a report of the upload progress. Defaults to True.\n",
568
+ " Report is printed to `sys.stdout` every X seconds (60 by defaults) and overwrites the previous report.\n",
569
+ " print_report_every (`int`, *optional*):\n",
570
+ " Frequency at which the report is printed. Defaults to 60 seconds.\n",
571
+ "\n",
572
+ "> [!TIP]\n",
573
+ "> A few things to keep in mind:\n"
574
+ ]
575
+ }
576
+ ],
577
+ "source": [
578
+ "# Inspect upload_large_folder signature\n",
579
+ "import inspect, textwrap\n",
580
+ "sig = inspect.signature(api.upload_large_folder)\n",
581
+ "print(\"upload_large_folder signature:\", sig)\n",
582
+ "doc = inspect.getdoc(api.upload_large_folder)\n",
583
+ "print(\"\\nDoc (first 40 lines):\\n\", \"\\n\".join(doc.splitlines()[:40]))"
584
+ ]
585
+ }
586
+ ],
587
+ "metadata": {
588
+ "kernelspec": {
589
+ "display_name": "Python 3",
590
+ "language": "python",
591
+ "name": "python3"
592
+ },
593
+ "language_info": {
594
+ "codemirror_mode": {
595
+ "name": "ipython",
596
+ "version": 3
597
+ },
598
+ "file_extension": ".py",
599
+ "mimetype": "text/x-python",
600
+ "name": "python",
601
+ "nbconvert_exporter": "python",
602
+ "pygments_lexer": "ipython3",
603
+ "version": "3.11.8"
604
+ }
605
+ },
606
+ "nbformat": 4,
607
+ "nbformat_minor": 5
608
+ }
check.ipynb ADDED
@@ -0,0 +1,219 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "markdown",
5
+ "id": "afc7ef8c",
6
+ "metadata": {},
7
+ "source": [
8
+ "# Image Folder Analysis\n",
9
+ "이 노트북은 `image` 폴더 내의 모든 이미지 파일 수를 분석합니다."
10
+ ]
11
+ },
12
+ {
13
+ "cell_type": "code",
14
+ "execution_count": 1,
15
+ "id": "7a5442aa",
16
+ "metadata": {},
17
+ "outputs": [
18
+ {
19
+ "name": "stdout",
20
+ "output_type": "stream",
21
+ "text": [
22
+ "============================================================\n",
23
+ "이미지 폴더 분석 결과\n",
24
+ "============================================================\n",
25
+ "\n",
26
+ "전체 파일 수: 13,841\n",
27
+ "이미지 파일 수: 6,918\n",
28
+ "기타 파일 수: 6,923\n",
29
+ "\n",
30
+ "============================================================\n",
31
+ "메인 폴더별 이미지 수\n",
32
+ "============================================================\n"
33
+ ]
34
+ }
35
+ ],
36
+ "source": [
37
+ "import os\n",
38
+ "import pandas as pd\n",
39
+ "from pathlib import Path\n",
40
+ "from collections import defaultdict\n",
41
+ "\n",
42
+ "# 이미지 폴더 경로\n",
43
+ "image_folder = r\"C:\\Users\\EL081\\Desktop\\local_backup\\image\"\n",
44
+ "\n",
45
+ "# 이미지 파일 확장자\n",
46
+ "image_extensions = {'.jpg', '.jpeg', '.png', '.gif', '.bmp', '.webp', '.tiff', '.tif', '.ico', '.svg'}\n",
47
+ "\n",
48
+ "# 전체 파일 수 계산\n",
49
+ "total_files = 0\n",
50
+ "total_images = 0\n",
51
+ "folder_stats = defaultdict(lambda: {'total': 0, 'images': 0})\n",
52
+ "\n",
53
+ "for root, dirs, files in os.walk(image_folder):\n",
54
+ " for file in files:\n",
55
+ " total_files += 1\n",
56
+ " ext = Path(file).suffix.lower()\n",
57
+ " if ext in image_extensions:\n",
58
+ " total_images += 1\n",
59
+ " \n",
60
+ " # 메인 폴더별 통계\n",
61
+ " relative_path = os.path.relpath(root, image_folder)\n",
62
+ " main_folder = relative_path.split(os.sep)[0]\n",
63
+ " folder_stats[main_folder]['total'] += 1\n",
64
+ " if ext in image_extensions:\n",
65
+ " folder_stats[main_folder]['images'] += 1\n",
66
+ "\n",
67
+ "print(\"=\" * 60)\n",
68
+ "print(\"이미지 폴더 분석 결과\")\n",
69
+ "print(\"=\" * 60)\n",
70
+ "print(f\"\\n전체 파일 수: {total_files:,}\")\n",
71
+ "print(f\"이미지 파일 수: {total_images:,}\")\n",
72
+ "print(f\"기타 파일 수: {total_files - total_images:,}\")\n",
73
+ "print(\"\\n\" + \"=\" * 60)\n",
74
+ "print(\"메인 폴더별 이미지 수\")\n",
75
+ "print(\"=\" * 60)"
76
+ ]
77
+ },
78
+ {
79
+ "cell_type": "code",
80
+ "execution_count": 2,
81
+ "id": "be675b07",
82
+ "metadata": {},
83
+ "outputs": [
84
+ {
85
+ "name": "stdout",
86
+ "output_type": "stream",
87
+ "text": [
88
+ ".cache 이미지: 0 전체: 6,921\n",
89
+ "EmoSet_images 이미지: 2,888 전체: 2,888\n",
90
+ "Midjourney_images 이미지: 17 전체: 17\n",
91
+ "augmented_images 이미지: 1,313 전체: 1,315\n",
92
+ "generated_images 이미지: 795 전체: 795\n",
93
+ "unsplash_images 이미지: 1,905 전체: 1,905\n",
94
+ "\n",
95
+ "============================================================\n",
96
+ "메인 폴더별 이미지 수 (상세)\n",
97
+ "============================================================\n",
98
+ " 폴더명 이미지수 전체파일수\n",
99
+ " EmoSet_images 2888 2888\n",
100
+ " unsplash_images 1905 1905\n",
101
+ " augmented_images 1313 1315\n",
102
+ " generated_images 795 795\n",
103
+ "Midjourney_images 17 17\n",
104
+ " .cache 0 6921\n"
105
+ ]
106
+ }
107
+ ],
108
+ "source": [
109
+ "# 메인 폴더별 상세 통계\n",
110
+ "for folder in sorted(folder_stats.keys()):\n",
111
+ " stats = folder_stats[folder]\n",
112
+ " print(f\"{folder:<40} 이미지: {stats['images']:>6,} 전체: {stats['total']:>6,}\")\n",
113
+ "\n",
114
+ "# 데이터프레임으로 정렬된 결과 표시\n",
115
+ "df_stats = pd.DataFrame([\n",
116
+ " {'폴더명': folder, '이미지수': stats['images'], '전체파일수': stats['total']}\n",
117
+ " for folder, stats in folder_stats.items()\n",
118
+ "]).sort_values('이미지수', ascending=False)\n",
119
+ "\n",
120
+ "print(\"\\n\" + \"=\" * 60)\n",
121
+ "print(\"메인 폴더별 이미지 수 (상세)\")\n",
122
+ "print(\"=\" * 60)\n",
123
+ "print(df_stats.to_string(index=False))"
124
+ ]
125
+ },
126
+ {
127
+ "cell_type": "code",
128
+ "execution_count": null,
129
+ "id": "84374e6a",
130
+ "metadata": {},
131
+ "outputs": [],
132
+ "source": [
133
+ "# 파일 타입별 분석\n",
134
+ "extension_count = defaultdict(int)\n",
135
+ "\n",
136
+ "for root, dirs, files in os.walk(image_folder):\n",
137
+ " for file in files:\n",
138
+ " ext = Path(file).suffix.lower()\n",
139
+ " if ext:\n",
140
+ " extension_count[ext] += 1\n",
141
+ "\n",
142
+ "print(\"\\n\" + \"=\" * 60)\n",
143
+ "print(\"파일 형식별 수\")\n",
144
+ "print(\"=\" * 60)\n",
145
+ "\n",
146
+ "df_ext = pd.DataFrame([\n",
147
+ " {'파일형식': ext, '수': count}\n",
148
+ " for ext, count in extension_count.items()\n",
149
+ "]).sort_values('수', ascending=False)\n",
150
+ "\n",
151
+ "print(df_ext.to_string(index=False))"
152
+ ]
153
+ },
154
+ {
155
+ "cell_type": "code",
156
+ "execution_count": null,
157
+ "id": "d37eae36",
158
+ "metadata": {},
159
+ "outputs": [],
160
+ "source": [
161
+ "# 시각화\n",
162
+ "import matplotlib.pyplot as plt\n",
163
+ "plt.figure(figsize=(12, 6))\n",
164
+ "\n",
165
+ "# 1. 메인 폴더별 이미지 수 차트\n",
166
+ "plt.subplot(1, 2, 1)\n",
167
+ "df_sorted = df_stats.sort_values('이미지수', ascending=True)\n",
168
+ "plt.barh(df_sorted['폴더명'], df_sorted['이미지수'], color='steelblue')\n",
169
+ "plt.xlabel('이미지 수')\n",
170
+ "plt.title('메인 폴더별 이미지 수')\n",
171
+ "plt.grid(axis='x', alpha=0.3)\n",
172
+ "\n",
173
+ "# 2. 상위 10개 파일 형식 차트\n",
174
+ "plt.subplot(1, 2, 2)\n",
175
+ "df_ext_top = df_ext.head(10)\n",
176
+ "plt.bar(df_ext_top['파일형식'], df_ext_top['수'], color='coral')\n",
177
+ "plt.xlabel('파일 형식')\n",
178
+ "plt.ylabel('수')\n",
179
+ "plt.title('상위 10개 파일 형식')\n",
180
+ "plt.xticks(rotation=45)\n",
181
+ "plt.grid(axis='y', alpha=0.3)\n",
182
+ "\n",
183
+ "plt.tight_layout()\n",
184
+ "plt.show()\n",
185
+ "\n",
186
+ "print(\"\\n✓ 분석 완료!\")"
187
+ ]
188
+ },
189
+ {
190
+ "cell_type": "code",
191
+ "execution_count": null,
192
+ "id": "b91d4a75",
193
+ "metadata": {},
194
+ "outputs": [],
195
+ "source": []
196
+ }
197
+ ],
198
+ "metadata": {
199
+ "kernelspec": {
200
+ "display_name": "Python 3",
201
+ "language": "python",
202
+ "name": "python3"
203
+ },
204
+ "language_info": {
205
+ "codemirror_mode": {
206
+ "name": "ipython",
207
+ "version": 3
208
+ },
209
+ "file_extension": ".py",
210
+ "mimetype": "text/x-python",
211
+ "name": "python",
212
+ "nbconvert_exporter": "python",
213
+ "pygments_lexer": "ipython3",
214
+ "version": "3.11.8"
215
+ }
216
+ },
217
+ "nbformat": 4,
218
+ "nbformat_minor": 5
219
+ }
hug.ipynb ADDED
@@ -0,0 +1,79 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "code",
5
+ "execution_count": 1,
6
+ "id": "ae64f9ce",
7
+ "metadata": {},
8
+ "outputs": [],
9
+ "source": [
10
+ "from huggingface_hub import HfApi"
11
+ ]
12
+ },
13
+ {
14
+ "cell_type": "code",
15
+ "execution_count": 3,
16
+ "id": "167062d2",
17
+ "metadata": {},
18
+ "outputs": [
19
+ {
20
+ "data": {
21
+ "application/vnd.jupyter.widget-view+json": {
22
+ "model_id": "2f64faaf886b47e483da57cac528c88f",
23
+ "version_major": 2,
24
+ "version_minor": 0
25
+ },
26
+ "text/plain": [
27
+ "Recovering from metadata files: 0%| | 0/6920 [00:00<?, ?it/s]"
28
+ ]
29
+ },
30
+ "metadata": {},
31
+ "output_type": "display_data"
32
+ },
33
+ {
34
+ "name": "stdout",
35
+ "output_type": "stream",
36
+ "text": [
37
+ "\n",
38
+ "\n",
39
+ "\n",
40
+ "---------- 2025-11-14 21:22:14 (0:00:00) ----------\n",
41
+ "Files: hashed 6920/6920 (1.2G/1.2G) | pre-uploaded: 6918/6918 (1.2G/1.2G) | committed: 6920/6920 (1.2G/1.2G) | ignored: 0\n",
42
+ "Workers: hashing: 0 | get upload mode: 0 | pre-uploading: 0 | committing: 0 | waiting: 0\n",
43
+ "---------------------------------------------------\n"
44
+ ]
45
+ }
46
+ ],
47
+ "source": [
48
+ "api = HfApi()\n",
49
+ "\n",
50
+ "HfApi().upload_large_folder(\n",
51
+ " folder_path=\"C:/Users/EL081/Desktop/local_backup/image\",\n",
52
+ " repo_id=\"Smilesjs/Haptix_dataset\",\n",
53
+ " repo_type=\"dataset\",\n",
54
+ ")"
55
+ ]
56
+ }
57
+ ],
58
+ "metadata": {
59
+ "kernelspec": {
60
+ "display_name": "Python 3",
61
+ "language": "python",
62
+ "name": "python3"
63
+ },
64
+ "language_info": {
65
+ "codemirror_mode": {
66
+ "name": "ipython",
67
+ "version": 3
68
+ },
69
+ "file_extension": ".py",
70
+ "mimetype": "text/x-python",
71
+ "name": "python",
72
+ "nbconvert_exporter": "python",
73
+ "pygments_lexer": "ipython3",
74
+ "version": "3.11.8"
75
+ }
76
+ },
77
+ "nbformat": 4,
78
+ "nbformat_minor": 5
79
+ }
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_04b3c20d-b045-492a-b70e-193d5f69c01f.jpg ADDED

Git LFS Details

  • SHA256: b4b30a2b3ecbb3eb456ab4ef974d82c8aeb6499ffe9197a4d642042aa516d398
  • Pointer size: 131 Bytes
  • Size of remote file: 128 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_07e716b7-fb4d-46be-bd23-483f72f22573.jpg ADDED

Git LFS Details

  • SHA256: a6ca8f916a7d5f922735304d1af0aeb085fadbad40f7b616bf077553237b1c7a
  • Pointer size: 130 Bytes
  • Size of remote file: 97 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_0cb7855f-7a57-4558-ab62-98d3243f2e30.jpg ADDED

Git LFS Details

  • SHA256: 5c366b8c6c0c60389ebc494d26fa6b416ae967bb6026f8132e70e2e4f09cc3f3
  • Pointer size: 131 Bytes
  • Size of remote file: 120 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_0dc7f4b6-d55f-437b-a8e3-b0e73e80fefd.jpg ADDED

Git LFS Details

  • SHA256: 4b2559282959a2c0e5c686d8d8d33bbe6b48cbf03c867e943bce4c53817d6345
  • Pointer size: 131 Bytes
  • Size of remote file: 185 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_0f3bb084-ccc2-45e3-a953-e30206e8f33f.jpg ADDED

Git LFS Details

  • SHA256: 2a84c10552ff0a244ce7480de2140a04d829278792726c573f324e77d5c8e4b4
  • Pointer size: 131 Bytes
  • Size of remote file: 103 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_104ea969-f60a-49fa-88a8-aae84291221d.jpg ADDED

Git LFS Details

  • SHA256: 3d211efb59cbecd8d07b7c578c73a06a03d8cd12fa37b017a3ca200c6e70990a
  • Pointer size: 130 Bytes
  • Size of remote file: 76.9 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_10a57085-8e77-40b2-bcc6-25c32c5d604e.jpg ADDED

Git LFS Details

  • SHA256: 4604abef502ebb8b18e4e90825aae932e736eac4991194d78d8f587eedf45a53
  • Pointer size: 131 Bytes
  • Size of remote file: 145 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_11958e63-89e3-4015-9b6a-2d959488db5a.jpg ADDED

Git LFS Details

  • SHA256: 6cb23990587378a86bee33201775ebd19a96693f9cbc17b837b76489d6d3b63b
  • Pointer size: 130 Bytes
  • Size of remote file: 35.4 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1228a5f6-a305-4e84-a2a8-f481fbc8f729.jpg ADDED

Git LFS Details

  • SHA256: d7e1daf2e31a0e9ca5c1d727d74a84cba44c44b12eb4ba27d938cb0cd3decb49
  • Pointer size: 130 Bytes
  • Size of remote file: 53.6 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_122d139e-bb48-48a5-bddc-64e2f60f3d8e.jpg ADDED

Git LFS Details

  • SHA256: 04a22e6af12904f9daade2be06e625a6e79bc4b5f1b95836d23e11bd81bbe835
  • Pointer size: 131 Bytes
  • Size of remote file: 157 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1268830b-2f2b-409b-b596-de2893ea6e2c.jpg ADDED

Git LFS Details

  • SHA256: cf72b7ad870f2d1daf4e409dba82e4ce3755b17cde6978904783a895a875920b
  • Pointer size: 131 Bytes
  • Size of remote file: 148 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_141731d0-e86d-4a28-882a-e73ff48b5f9b.jpg ADDED

Git LFS Details

  • SHA256: 2d96400a346d2f925c5e36a138e45913972640b5e8de46eee83a4f22e105a570
  • Pointer size: 130 Bytes
  • Size of remote file: 75.9 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_179f0cd1-5ccd-4995-9aa2-04dc8fef3768.jpg ADDED

Git LFS Details

  • SHA256: 2a53c29e72c0189a711a36dda3f0ecf796cfe89c1a48bad8f0c5bf02a14fc12c
  • Pointer size: 131 Bytes
  • Size of remote file: 106 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1a1436ab-1453-4616-9290-e97bde2b7614.jpg ADDED

Git LFS Details

  • SHA256: c6dc62543b014e21c254eb2eeb6e9c0f760e8c0541e0f6670748c61001958ec8
  • Pointer size: 130 Bytes
  • Size of remote file: 42.8 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1a33e376-6c98-41cf-8216-86bca823df51.jpg ADDED

Git LFS Details

  • SHA256: 59f05a99e37a727110763f5b9a91fe85cd8034198de4eb80e2e26b3912091538
  • Pointer size: 130 Bytes
  • Size of remote file: 57.3 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1f485d91-7ea8-492c-8d96-45f9dfd2369a.jpg ADDED

Git LFS Details

  • SHA256: 04acf5fef6db512fcda91cdb41a612daee8a75b05e041db1b6bd6125b6141640
  • Pointer size: 131 Bytes
  • Size of remote file: 106 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_1f6a7ca3-18c7-4b5a-ae9b-c1b04be3cb2b.jpg ADDED

Git LFS Details

  • SHA256: d6730a778b161d16fc02512b878fc5a3f1eda6ebfac18f2a2d1e649e640259ee
  • Pointer size: 130 Bytes
  • Size of remote file: 40.2 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_20528fb0-f505-4dd1-b2e3-d95eabfdf46a.jpg ADDED

Git LFS Details

  • SHA256: 2206039f682a1fd51a4451aec49a49e88834d684dded96db1d85df8a56a2c9f8
  • Pointer size: 130 Bytes
  • Size of remote file: 80.3 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2224a3d5-9acf-4f54-b691-72bc4ed8d419.jpg ADDED

Git LFS Details

  • SHA256: 4d3eb7ae50de0b5798b0b7dc1096f8307cebf6172457285a111e1c2e195af491
  • Pointer size: 130 Bytes
  • Size of remote file: 99.7 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_232ca722-a3ea-4585-b059-25e272c5d580.jpg ADDED

Git LFS Details

  • SHA256: 7e9da609f8226de7d65af17562016ab610ffff788ceefe025b535bcca108f1cd
  • Pointer size: 130 Bytes
  • Size of remote file: 24.4 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_278eb6dc-6219-4ab6-b146-3ac9a3b6a6ae.jpg ADDED

Git LFS Details

  • SHA256: 6c4e0e8102ad1cd96596037029ae562a3bf36712eef84bcc1134adc73aa64b5c
  • Pointer size: 130 Bytes
  • Size of remote file: 89 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2ab7df51-f68a-4084-8169-7cc7e32996e7.jpg ADDED

Git LFS Details

  • SHA256: 8e47ec514a0fee8e644a9623170a6e9fddae1963aecd7c4baeb88d824a5ebe2e
  • Pointer size: 130 Bytes
  • Size of remote file: 41 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2b3c3f24-47d5-4a82-ace9-eb1613a01988.jpg ADDED

Git LFS Details

  • SHA256: 8a351fddccd0f4872161d1ca5bd91840b751b5dc2181dd6d05b8d72e40ac3813
  • Pointer size: 131 Bytes
  • Size of remote file: 186 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2b6213ca-731e-4593-a3a9-74071b9d8489.jpg ADDED

Git LFS Details

  • SHA256: 1e33cf18d3ccd19f8b3ec14bad79286347e068e68d6953d19e5bd8610ef0cffd
  • Pointer size: 131 Bytes
  • Size of remote file: 129 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2cf13e6c-9fc0-4e5f-99d5-b45c7f4f81d5.jpg ADDED

Git LFS Details

  • SHA256: 163bd8e76be3e51ef49be886fde13542888ae310b8bd4f1dcb801e382d144f2b
  • Pointer size: 130 Bytes
  • Size of remote file: 99 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2d2a5a92-6e52-4583-9c94-cc0248eae8a0.jpg ADDED

Git LFS Details

  • SHA256: f19d0654b19495a85169a34bb6c4b7c6991d0b155895054bcb296763db6011bf
  • Pointer size: 131 Bytes
  • Size of remote file: 191 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2dc0b1fd-3d9b-47e4-8ba0-ba6c07fe3b0b.jpg ADDED

Git LFS Details

  • SHA256: 6180d659db541c56480ac33b2349d71f9d95cdf648a78606ce497bc37dd36bec
  • Pointer size: 130 Bytes
  • Size of remote file: 58.7 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_2ef3e571-0064-408e-94d8-f4812c4ccd1c.jpg ADDED

Git LFS Details

  • SHA256: 937c11e51982967ab2d88db4ccb7fcad239e50b24d2dbf83361d158fc76c930c
  • Pointer size: 130 Bytes
  • Size of remote file: 31.7 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_30cca560-b470-4c7a-97b7-ffeea1e1a9f2.jpg ADDED

Git LFS Details

  • SHA256: 8207549f201598e57825e24721d790d85e6305a474cc77ae246645aa8ca6ca3b
  • Pointer size: 130 Bytes
  • Size of remote file: 91.9 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_35cbedc9-e9d9-4aa1-9854-a37c4f5059d2.jpg ADDED

Git LFS Details

  • SHA256: ea2d047136fa9c7ee8a20061e78c5b122ed8b3203b7cfeea7e0278dc84687d47
  • Pointer size: 130 Bytes
  • Size of remote file: 68.5 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_3b08aea7-e16b-435c-a371-3014eaf3cce2.jpg ADDED

Git LFS Details

  • SHA256: bc57da0ca6c945f1ed3cee834e1562207d84d8573202aaf783cf41e8efab5464
  • Pointer size: 131 Bytes
  • Size of remote file: 101 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_3e1a129e-c8ca-4d0f-a13e-23b9f6cf7a83.jpg ADDED

Git LFS Details

  • SHA256: 31b137bfada34000dd608af197763dacefc8c82d576b41cb6961e1be46f44790
  • Pointer size: 131 Bytes
  • Size of remote file: 104 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_40a591be-7cfa-49ae-a0cf-d98db2024a20.jpg ADDED

Git LFS Details

  • SHA256: 8cf48e446e71ce995ccd4be7a31cc56136e2ad76203e2983a0807d1aac16cd46
  • Pointer size: 130 Bytes
  • Size of remote file: 84.9 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_4104b4e8-ad56-4155-a659-a157f9feb86f.jpg ADDED

Git LFS Details

  • SHA256: cb3ec7706f543763d178b69bd0a604c76945d9ea3518da426939b7e32b36d31f
  • Pointer size: 131 Bytes
  • Size of remote file: 103 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_411335c9-2e46-4a31-814a-4034d145424c.jpg ADDED

Git LFS Details

  • SHA256: 20970ae26fabc5af4f412bebeea7d76caa364a818da588a6bf581753cbb5303b
  • Pointer size: 130 Bytes
  • Size of remote file: 55.8 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_44ae542c-497b-4d98-9884-f54a9a6f6488.jpg ADDED

Git LFS Details

  • SHA256: 9d84d8c8435c4bab6756f6810f05196e7ca4395e6840df6c24032f6caa60352e
  • Pointer size: 131 Bytes
  • Size of remote file: 116 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_44bcd3cf-989e-477d-b127-9d4a417d190e.jpg ADDED

Git LFS Details

  • SHA256: 562dd45633a958ef8dd44914145dc5157fe2e5e2b7ebace8e4e7a20b1bd932f1
  • Pointer size: 130 Bytes
  • Size of remote file: 65.1 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_4d7d37d6-b68e-47da-a2a9-8b41f55d9c06.jpg ADDED

Git LFS Details

  • SHA256: 4000cbb1b90c1f7c46b982e0d485642eff395a2609df6fd61cb8f7ece722b056
  • Pointer size: 130 Bytes
  • Size of remote file: 82.7 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_4e337333-1e63-4c4d-9b4a-959e6a4646c2.jpg ADDED

Git LFS Details

  • SHA256: 159c0bede78169643d975d1ddaaee83f777f5bdb082d6803ab3d454d6a876e53
  • Pointer size: 131 Bytes
  • Size of remote file: 134 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_4fa380b4-88bc-48d1-a7ad-878843694a79.jpg ADDED

Git LFS Details

  • SHA256: 352becf98ca4e4a08c923013df7f63c9e6aab940363dbaeb664429024de77bb6
  • Pointer size: 131 Bytes
  • Size of remote file: 145 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_50344dd3-f5eb-4dc4-bdd1-191296b4855e.jpg ADDED

Git LFS Details

  • SHA256: bc51e4b619e017dc8d8151c9c452c549d7f8b54fa4b4f2e6289a9420de2cae05
  • Pointer size: 131 Bytes
  • Size of remote file: 140 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_509db6ca-4b04-4d4c-b6e3-ee6b7a1545b1.jpg ADDED

Git LFS Details

  • SHA256: e9684e9b159370f4b585a7ce7be1e2ac6cd387cb8760672c1f3e9fa580471bec
  • Pointer size: 130 Bytes
  • Size of remote file: 98.4 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_519f343c-b503-49d7-8c01-579684ad01cd.jpg ADDED

Git LFS Details

  • SHA256: f1cab44915ebd9c207caa9f9a87e4150a07daddfd086513658cd67ab18743f19
  • Pointer size: 130 Bytes
  • Size of remote file: 90.4 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_528976d4-813b-4f9f-9010-6718471cb6ce.jpg ADDED

Git LFS Details

  • SHA256: 5049d9eacf0eba9413f09769331b96bba49b80b61c9af624980c73d9333d534a
  • Pointer size: 130 Bytes
  • Size of remote file: 87.2 kB
image/EmoSet_images/P1Hard_P4Dynamic_E1N-Risky/image_55a8909e-9f90-4d1b-abc5-88a927b30f3d.jpg ADDED

Git LFS Details

  • SHA256: e2dc23f48e0963a92da6b8b51e094026300f82b3a663d29fe574bb104ff7dd02
  • Pointer size: 130 Bytes
  • Size of remote file: 94.8 kB