Pthahnix committed
Commit efcae34 (verified)
1 Parent(s): 3fb71ab

docs: update repo structure, fix Quick Start, add tar.gz to Data section

Files changed (1):
  1. README.md +16 -11
README.md CHANGED
@@ -89,14 +89,14 @@ This HuggingFace repo stores **checkpoints** and **processed datasets** for repr
 
 ### Data
 
-| Directory | Size | Contents |
-|-----------|------|----------|
-| `data/patches/5cat/` | ~82 MB | 5-category NPZ patch files (train/test splits) |
-| `data/patches/lvis_wide/` | ~1 GB | LVIS-Wide NPZ patches (188K train / 45K test / 12K unseen) |
+| File / Directory | Size | Contents |
+|------------------|------|----------|
+| `data/meshlex_data.tar.gz` | ~1.2 GB | All processed data in one archive (recommended) |
+| `data/patches/` | ~1.1 GB | NPZ patch files (5cat + LVIS-Wide splits) |
 | `data/meshes/` | ~931 MB | Preprocessed decimated OBJ files (5,497 meshes) |
-| `data/objaverse/` | ~2 MB | Download manifests (can recreate download pipeline) |
+| `data/objaverse/` | ~2 MB | Download manifests |
 
-The processed data can be downloaded directly; no need to re-download from Objaverse and re-run preprocessing.
+The `tar.gz` archive contains patches, meshes, and manifests; download and extract it to skip all preprocessing.
 
 ## Core Hypothesis
 
@@ -150,7 +150,7 @@ Training data sourced from [Objaverse-LVIS](https://huggingface.co/datasets/alle
 ```bash
 # Clone the code repo
 git clone https://github.com/Pthahnix/MeshLex-Research.git
-cd Meshlex-Research
+cd MeshLex-Research
 
 # Install dependencies
 pip install -r requirements.txt
@@ -161,12 +161,17 @@ pip install pyg_lib torch_scatter torch_sparse torch_cluster torch_spline_conv \
 # Download processed data from this HF repo
 pip install huggingface_hub
 python -c "
+from huggingface_hub import hf_hub_download
+hf_hub_download('Pthahnix/MeshLex-Research', 'data/meshlex_data.tar.gz', repo_type='model', local_dir='.')
+"
+tar xzf data/meshlex_data.tar.gz -C data/
+
+# Download checkpoints
+python -c "
 from huggingface_hub import snapshot_download
-snapshot_download('Pthahnix/MeshLex-Research', local_dir='hf_download', repo_type='model')
+snapshot_download('Pthahnix/MeshLex-Research', allow_patterns='checkpoints/*', repo_type='model', local_dir='.')
 "
-# Move data and checkpoints into place
-cp -r hf_download/data/ data/
-cp -r hf_download/checkpoints/ data/checkpoints/
+mv checkpoints data/checkpoints
 
 # Run evaluation on Exp4 (best model)
 PYTHONPATH=. python scripts/evaluate.py \