iiegn Claude Sonnet 4.5 committed on
Commit c8f1b06 · verified · 1 Parent(s): 287fe11

Disentangle tools/README.md and ADDING_NEW_UD_VERSION.md


**tools/README.md:**
- Rewrote as concise quick reference for experienced developers
- Focus on commands, not explanations
- Added "Quick Start" section with complete workflow
- Documented all pipeline scripts (00-05)
- Added directory structure diagram
- Included configuration examples
- Added common operations and troubleshooting
- References ADDING_NEW_UD_VERSION.md for detailed guide

**ADDING_NEW_UD_VERSION.md:**
- Maintained as comprehensive guide for new contributors
- Updated all outdated script references:
  - 02_traverse_ud_repos.py → 02_generate_metadata.py
  - 03_fill_universal_dependencies_tamplate.py → 03_generate_README.py
- Removed all references to universal_dependencies.py (no longer exists)
- Added explanations for each step ("What is...?", "Why...?")
- Kept troubleshooting section with detailed solutions
- Kept timeline estimates and checklist
- References tools/README.md for technical details
- Updated to reflect v2.0 architecture:
  - Parquet-only format (no Python script loader)
  - External ud-hf-parquet-tools library
  - Blocked treebanks handling

**Key improvements:**
- No duplication between files
- Clear separation of concerns (quick ref vs. guide)
- All references to current architecture
- Cross-references between documents
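
The blocked-treebanks handling mentioned above comes down to filtering treebanks on a per-entry flag in the generated metadata. A minimal sketch of that filtering step — the field names and structure here are illustrative only, not the dataset's actual schema:

```python
import json

# Hypothetical metadata excerpt; the real metadata.json schema may differ.
metadata = json.loads("""
{
  "en_ewt":    {"language": "English",    "blocked": false},
  "pt_cintil": {"language": "Portuguese", "blocked": true},
  "fr_gsd":    {"language": "French",     "blocked": false}
}
""")

# Keep only treebanks that may be redistributed in Parquet form.
allowed = [name for name, info in metadata.items()
           if not info.get("blocked", False)]
print(sorted(allowed))  # ['en_ewt', 'fr_gsd']
```

Treating "blocked" as default-false means newly added treebanks are included unless explicitly excluded for license reasons.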

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

Files changed (2)
  1. ADDING_NEW_UD_VERSION.md +316 -175
  2. tools/README.md +185 -129
ADDING_NEW_UD_VERSION.md CHANGED
@@ -2,12 +2,15 @@
 
  This guide explains how to add a new Universal Dependencies release (e.g., UD 2.18, 2.19, etc.) to the `commul/universal_dependencies` HuggingFace dataset.
 
  ## Prerequisites
 
  - Git repository cloned and up to date
- - Python environment with dependencies installed:
  ```bash
- pip install -r requirements.txt # or use uv
  ```
  - Access to push to `commul/universal_dependencies` on HuggingFace Hub
  - `huggingface-cli` installed and authenticated:
@@ -18,7 +21,12 @@ This guide explains how to add a new Universal Dependencies release (e.g., UD 2.
 
  ## Overview
 
- Each UD version (2.7, 2.8, ..., 2.17, 2.18, ...) has its own git branch. The loader version (v2.0) is consistent across branches. When a new UD release is published, you create a new branch and run the pipeline to generate dataset files.
 
  ## Step-by-Step Guide
 
@@ -31,7 +39,7 @@ For this example, we'll add **UD 2.18** (replace with actual version).
  ### 2. Create New Branch
 
  ```bash
- # Ensure you're on the latest main/template branch
  git checkout main
  git pull origin main
 
@@ -44,270 +52,359 @@ git pull origin 2.17
  git checkout -b 2.18
  ```
 
  ### 3. Update Environment Configuration
 
  ```bash
  cd tools
 
- # Create or update .env file
- echo "UD_VER=2.18" > .env
 
  # Verify
  cat .env
  # Output: UD_VER=2.18
  ```
 
- ### 4. Fetch Metadata for New Version
 
  ```bash
- # Fetch citation and description from LINDAT/CLARIN
- python 00_fetch_ud_clarin-dspace_metadata.py
 
  # This creates:
  # - etc/citation-2.18
  # - etc/description-2.18
  ```
 
- **Note:** You may need to update the LINDAT handle ID in the script if the UD project uses a new handle for this release. Check the [UD release page](https://universaldependencies.org/) for the correct handle.
 
  ### 5. Fetch Language Codes and Flags
 
  ```bash
- # Fetch codes_and_flags.yaml for new version
- ./00_fetch_ud_codes_and_flags.sh
 
  # This creates:
  # - etc/codes_and_flags-2.18.yaml
- # - etc/codes_and_flags-latest.yaml (symlink)
  ```
 
- **Note:** You may need to update the git commit hash mapping in the script if a new UD version is released.
 
- ### 6. Fetch UD Repositories
 
  ```bash
- # This discovers all UD repositories on GitHub
  ./01_fetch_ud_repos.sh
 
  # This creates:
- # - .UD_submodules_add.commands
 
- # The script will tell you to run commands in UD_repos/
- # Follow those instructions:
  cd UD_repos
 
- # Initialize git if not already done
  git init
 
- # Add submodules (this may take a while - 289 repositories)
  bash ../.UD_submodules_add.commands
 
- # Checkout the new release tag (e.g., r2.18)
- git submodule foreach 'git fetch --tags && git checkout r2.18 && touch .tag-r2.18'
 
  cd ..
  ```
 
  **Expected time:** 30-60 minutes depending on network speed.
 
- ### 7. Traverse Repositories and Extract Metadata
 
  ```bash
- # Extract metadata from all treebanks
- python 02_traverse_ud_repos.py
 
  # This creates:
- # - metadata-2.18.json (contains info for all 339+ treebanks)
 
  # Verify the output
- ls -lh metadata-2.18.json
  # Should be ~200-300 KB
 
  # Quick check: count treebanks
- python -c "import json; print(len(json.load(open('metadata-2.18.json'))))"
- # Should be 339+ (may increase with new treebanks)
  ```
 
- ### 8. Generate Dataset Loader Script
 
  ```bash
- # Generate universal_dependencies-2.18 from template
- python 03_fill_universal_dependencies_tamplate.py
 
  # This creates:
- # - universal_dependencies-2.18 (Python loader script)
- # - README-2.18 (dataset card)
 
- # Verify files were created
- ls -lh universal_dependencies-2.18 README-2.18
  ```
 
- ### 9. Generate Parquet Files
 
  ```bash
- # Test with a few treebanks first
- python 04_generate_parquet.py --test
 
- # If successful, generate for all treebanks (takes 2-4 hours)
- python 04_generate_parquet.py
 
  # This creates:
- # - parquet/{treebank_name}/{split}.parquet for all 339+ treebanks
 
  # Verify output
  du -sh ../parquet/
  # Should be ~50-80 GB total
  ```
 
- **Optional:** Run on a subset first to verify correctness:
  ```bash
- python 04_generate_parquet.py --treebanks "en_ewt,fr_gsd,de_gsd"
  ```
 
- ### 10. Copy Files to Repository Root
 
  ```bash
- # Copy generated files to root
  cd .. # Back to repository root
 
- cp tools/universal_dependencies-2.18 universal_dependencies.py
- cp tools/README-2.18 README.md
- cp tools/metadata-2.18.json metadata.json
 
  # Verify files are in place
- ls -lh universal_dependencies.py README.md metadata.json
  ```
 
- ### 11. Test the Dataset Loader
 
- ```bash
- # Test loading with Python script (for backwards compatibility testing)
- python -c "
- from datasets import load_dataset
- import sys
- sys.path.insert(0, '.')
 
- # Test a small treebank
- ds = load_dataset('./universal_dependencies.py', 'en_pronouns', split='test')
- print(f'Loaded {len(ds)} examples')
- print(f'Features: {list(ds.features.keys())}')
- print(f'MWT field present: {\"mwt\" in ds.features}')
- "
 
  # Test loading from Parquet
  python -c "
  from datasets import load_dataset
 
- # Test Parquet loading
- ds = load_dataset('parquet', data_files='parquet/en_ewt/train.parquet')
- print(f'Loaded {len(ds[\"train\"])} examples from Parquet')
  "
  ```
 
- ### 12. Commit Changes to Git
 
  ```bash
  # Add generated files
- git add universal_dependencies.py
- git add README.md
- git add metadata.json
- git add tools/metadata-2.18.json
- git add tools/universal_dependencies-2.18
- git add tools/README-2.18
- git add tools/etc/citation-2.18
- git add tools/etc/description-2.18
- git add tools/etc/codes_and_flags-2.18.yaml
  git add tools/.env
 
  # Commit with descriptive message
- git commit -m "Add UD 2.18 data with loader v2.0
 
- - Generated from Universal Dependencies 2.18 release
  - 339+ treebanks across 186+ languages
- - Includes Parquet files for efficient loading
- - Loader version: 2.0.0
- - MWT support and bug fixes included
 
  Generated files:
- - universal_dependencies.py (loader script)
  - README.md (dataset card)
  - metadata.json (treebank metadata)
- - Parquet files in parquet/ directory"
 
  # Tag the commit
- git tag -a ud2.18-loader-v2.0 -m "UD 2.18 with Loader v2.0"
 
  # Push branch and tags
- git push origin 2.18
  git push origin --tags
  ```
 
- ### 13. Validate Parquet Files
 
- ```bash
- # Test validation on 3 treebanks first
- python 05_validate_parquet.py --test
 
- # If successful, validate all treebanks (optional, takes ~30-60 minutes)
- python 05_validate_parquet.py
 
- # This validates:
- # - Data loads correctly from HuggingFace Hub
- # - All fields round-trip correctly
- # - MWT information is preserved
- # - Token sequences exclude MWT forms (v2.0 bug fix)
  ```
 
- ### 14. Upload to HuggingFace Hub
 
  ```bash
- # Upload Parquet files to HuggingFace (this may take several hours)
- # Note: If using git-lfs, commit and push instead:
- git add parquet/
- git commit -m "Add Parquet files for all treebanks"
- git push origin 2.18
-
- # Alternative: Use huggingface-cli (if not using git-lfs)
- huggingface-cli upload commul/universal_dependencies ./parquet --repo-type dataset --revision 2.18
-
- # Upload main files
- huggingface-cli upload commul/universal_dependencies ./universal_dependencies.py --repo-type dataset --revision 2.18
- huggingface-cli upload commul/universal_dependencies ./README.md --repo-type dataset --revision 2.18
- huggingface-cli upload commul/universal_dependencies ./metadata.json --repo-type dataset --revision 2.18
-
- # Alternatively, push everything at once (if you have the repo cloned with git-lfs)
- # git push hf 2.18
  ```
 
  **Expected upload time:** 2-6 hours depending on network speed and HuggingFace server load.
 
- ### 15. Verify on HuggingFace Hub
 
  Visit: https://huggingface.co/datasets/commul/universal_dependencies
 
- 1. Check that branch `2.18` exists in the "Branches" dropdown
- 2. Verify files are present:
- - `universal_dependencies.py`
- - `README.md`
  - `metadata.json`
  - `parquet/` directory with subdirectories
 
- 3. Test loading:
- ```python
- from datasets import load_dataset
 
- # Load from new version
- ds = load_dataset("commul/universal_dependencies", "en_ewt", revision="2.18")
- print(ds)
- ```
 
- ### 16. Update Dataset Card (Optional)
 
- If this is now the latest version, you may want to update the dataset card to mention it:
 
- 1. Edit README.md to add "Latest version: 2.18" at the top
- 2. Update version badges if any
- 3. Commit and push
 
  ## Troubleshooting
 
@@ -321,15 +418,25 @@ cd tools/UD_repos
  git submodule foreach 'git fetch --tags && (git checkout r2.18 || git checkout main) && touch .tag-r2.18'
  ```
 
  ### Issue: Metadata extraction fails for a treebank
 
  **Problem:** A treebank is malformed or missing expected files.
 
  **Solution:**
- - Check the specific treebank in `UD_repos/UD_{Language}-{Treebank}/`
- - Verify it has `.conllu` files and `stats.xml`
- - Skip problematic treebanks by editing `02_traverse_ud_repos.py` if necessary
- - Report issues to the UD project
 
  ### Issue: Parquet generation fails for a treebank
 
@@ -337,69 +444,103 @@ git submodule foreach 'git fetch --tags && (git checkout r2.18 || git checkout m
 
  **Solution:**
  ```bash
- # Generate Parquet in batches to isolate the problem
- python 04_generate_parquet.py --treebanks "en_ewt" # Test one at a time
-
- # Check logs for specific error
- # Fix the problematic CoNLL-U file or skip it temporarily
  ```
 
  ### Issue: HuggingFace upload is very slow
 
  **Problem:** Large Parquet files + network latency.
 
  **Solution:**
  - Use a machine with better network connection
- - Upload during off-peak hours
- - Use `--num-workers` flag if available:
- ```bash
- huggingface-cli upload commul/universal_dependencies ./parquet --repo-type dataset --revision 2.18 --num-workers 4
- ```
 
  ## Checklist
 
  Before marking the release as complete:
 
- - [ ] All metadata files generated (`metadata-2.18.json`, `citation-2.18`, `description-2.18`)
- - [ ] Universal dependencies script generated (`universal_dependencies-2.18`)
- - [ ] README generated (`README-2.18`)
- - [ ] Parquet files generated for all treebanks
- - [ ] Files copied to repository root
- - [ ] Tested loading from Python script
- - [ ] Tested loading from Parquet
- - [ ] Committed to git with proper message
- - [ ] Tagged with `ud2.18-loader-v2.0`
  - [ ] Pushed to origin
  - [ ] Uploaded to HuggingFace Hub
- - [ ] Verified on HuggingFace Hub
- - [ ] Tested loading from HuggingFace Hub
 
  ## Timeline Estimate
 
- | Step | Time | Can be Parallelized? |
- |------|------|---------------------|
- | 1-5: Setup & metadata | 5-10 min | No |
- | 6: Fetch repositories | 30-60 min | No |
- | 7: Extract metadata | 10-20 min | No |
- | 8: Generate script | 1-2 min | No |
- | 9: Generate Parquet | 2-4 hours | Yes (by treebank) |
- | 10-12: Commit to git | 5-10 min | No |
- | 13: Upload to HF Hub | 2-6 hours | Partially |
- | 14-15: Verify & update | 10-20 min | No |
- | **Total** | **~5-11 hours** | |
 
- **Recommendation:** Start the process in the morning so uploads can complete during the day.
 
  ## Notes
 
- - The loader version (v2.0) is already in the templates, so new UD versions automatically get the latest loader features.
- - Each UD version branch is independent - you can maintain multiple versions simultaneously.
- - Old branches (2.7-2.17) can be upgraded to v2.0 loader using the same process (just checkout the old branch and regenerate files).
- - The `main` branch can serve as a template or point to the latest version.
 
  ## Support
 
  For issues:
- - Universal Dependencies data issues → [UD GitHub Issues](https://github.com/UniversalDependencies/docs/issues)
- - Loader/tooling issues → Your repository issues
- - HuggingFace Hub issues → [HuggingFace Community Forums](https://discuss.huggingface.co/)
2
 
3
  This guide explains how to add a new Universal Dependencies release (e.g., UD 2.18, 2.19, etc.) to the `commul/universal_dependencies` HuggingFace dataset.
4
 
5
+ **Quick reference:** See [tools/README.md](tools/README.md) for concise commands and script documentation.
6
+
7
  ## Prerequisites
8
 
9
  - Git repository cloned and up to date
10
+ - Python 3.12+ with `uv` installed
11
+ - Dependencies installed:
12
  ```bash
13
+ pip install ud-hf-parquet-tools pyyaml python-dotenv jinja2
14
  ```
15
  - Access to push to `commul/universal_dependencies` on HuggingFace Hub
16
  - `huggingface-cli` installed and authenticated:
 
21
 
22
  ## Overview
23
 
24
+ Each UD version (2.7, 2.8, ..., 2.17, 2.18, ...) has its own git branch. The dataset uses Parquet format (v2.0 architecture) for all versions. When a new UD release is published, you create a new branch and run the generation pipeline.
25
+
26
+ **Architecture:**
27
+ - **No Python script loader**: Dataset uses Parquet files only (datasets >=4.0.0)
28
+ - **External tools**: Helper functions in separate `ud-hf-parquet-tools` library
29
+ - **Blocked treebanks**: Some treebanks excluded due to license restrictions
30
 
31
  ## Step-by-Step Guide
32
 
 
39
  ### 2. Create New Branch
40
 
41
  ```bash
42
+ # Ensure you're on the latest main branch
43
  git checkout main
44
  git pull origin main
45
 
 
52
  git checkout -b 2.18
53
  ```
54
 
55
+ **Why branching?** Each UD version is maintained independently, allowing users to load specific versions via `revision="2.18"`.
56
+
57
  ### 3. Update Environment Configuration
58
 
59
  ```bash
60
  cd tools
61
 
62
+ # Set the version number
63
+ export NEW_VER=2.18
64
+ echo "UD_VER=${NEW_VER}" > .env
65
 
66
  # Verify
67
  cat .env
68
  # Output: UD_VER=2.18
69
  ```
70
 
71
+ **What is .env?** Environment file that all scripts read to determine which UD version to process.
72
+
73
+ ### 4. Fetch Metadata from LINDAT/CLARIN
74
 
75
  ```bash
76
+ # Fetch citation and description
77
+ ./00_fetch_ud_clarin-dspace_metadata.py -o
78
 
79
  # This creates:
80
  # - etc/citation-2.18
81
  # - etc/description-2.18
82
  ```
83
 
84
+ **Before running**, update the script to add the new version's handle ID:
85
+
86
+ 1. Open `00_fetch_ud_clarin-dspace_metadata.py`
87
+ 2. Find the `url_postfixes` dictionary
88
+ 3. Add entry for new version:
89
+ ```python
90
+ "2.18": "11234/1-XXXX", # Check UD website for correct handle
91
+ ```
92
+
93
+ **Where to find handle?** Visit the [UD release page](https://universaldependencies.org/) and check the LINDAT citation link.
94
 
95
  ### 5. Fetch Language Codes and Flags
96
 
97
  ```bash
98
+ # Fetch language metadata
99
+ ./00_fetch_ud_codes_and_flags.sh -o
100
 
101
  # This creates:
102
  # - etc/codes_and_flags-2.18.yaml
103
+ # - etc/codes_and_flags-latest.yaml (updated symlink)
104
  ```
105
 
106
+ **Before running**, update the script with the docs-automation commit hash:
107
+
108
+ 1. Open `00_fetch_ud_codes_and_flags.sh`
109
+ 2. Find the `VER_MAPPING` associative array
110
+ 3. Add entry:
111
+ ```bash
112
+ VER_MAPPING["2.18"]="<git-commit-hash-for-2.18>"
113
+ ```
114
+
115
+ **How to find hash?** Check the [UD docs-automation releases](https://github.com/UniversalDependencies/docs/releases) for the commit tagged with the version.
116
 
117
+ ### 6. Discover UD Repositories
118
 
119
  ```bash
120
+ # Generate list of all UD repositories
121
  ./01_fetch_ud_repos.sh
122
 
123
  # This creates:
124
+ # - .UD_submodules_add.commands (list of git submodule add commands)
125
+ ```
126
+
127
+ **What does this do?** Queries GitHub API for all repositories in the UniversalDependencies organization and generates commands to add them as submodules.
128
+
129
+ ### 7. Fetch UD Repositories as Submodules
130
 
131
+ ```bash
 
132
  cd UD_repos
133
 
134
+ # Initialize git repository (if first time)
135
  git init
136
 
137
+ # Add all UD repositories as submodules
138
  bash ../.UD_submodules_add.commands
139
 
140
+ # Checkout the new release tag in all submodules
141
+ git submodule foreach "git fetch --tags && git checkout r${NEW_VER} && touch .tag-r${NEW_VER}"
142
+
143
+ # Create branch and commit
144
+ git checkout -b ${NEW_VER}
145
+ git add -A
146
+ git commit -m "Add UD ${NEW_VER} repositories"
147
 
148
  cd ..
149
  ```
150
 
151
  **Expected time:** 30-60 minutes depending on network speed.
152
 
153
+ **What are .tag-r{VER} files?** Marker files that `02_generate_metadata.py` checks to ensure a repository has the correct version tag.
154
 
155
+ **Troubleshooting:** If some repositories don't have the tag:
156
  ```bash
157
+ git submodule foreach "git fetch --tags && (git checkout r${NEW_VER} || git checkout main) && touch .tag-r${NEW_VER}"
158
+ ```
159
+
160
+ ### 8. Extract Metadata from Treebanks
161
+
162
+ ```bash
163
+ # Generate metadata from all treebank directories
164
+ ./02_generate_metadata.py -o
165
 
166
  # This creates:
167
+ # - metadata-2.18.json (contains info for all treebanks)
168
 
169
  # Verify the output
170
+ ls -lh metadata-${NEW_VER}.json
171
  # Should be ~200-300 KB
172
 
173
  # Quick check: count treebanks
174
+ python -c "import json; print(len(json.load(open('metadata-${NEW_VER}.json'))))"
175
+ # Should be 339+ (number increases with new treebanks)
176
  ```
177
 
178
+ **What does this script do?**
179
+ - Reads README files from each treebank
180
+ - Extracts summaries, licenses, genres
181
+ - Collects statistics from stats.xml
182
+ - Identifies available splits (train/dev/test)
183
+ - Checks `blocked_treebanks.yaml` for license restrictions
184
+ - Adds "blocked" property to metadata
185
+
186
+ **Expected time:** 5-10 minutes
187
+
188
+ ### 9. Generate Dataset Card (README)
189
 
190
  ```bash
191
+ # Generate HuggingFace dataset card
192
+ ./03_generate_README.py -o
193
 
194
  # This creates:
195
+ # - README-2.18 (dataset card for HuggingFace)
 
196
 
197
+ # Verify file was created
198
+ ls -lh README-${NEW_VER}
199
  ```
200
 
201
+ **What does this do?** Renders `templates/README.tmpl` with metadata, citation, and description to create the HuggingFace dataset card.
202
+
203
+ ### 10. Review Blocked Treebanks
204
+
205
+ Before generating Parquet files, review the blocked treebanks:
206
 
207
  ```bash
208
+ # Check blocked treebanks list
209
+ cat blocked_treebanks.yaml
210
+
211
+ # Example entry:
212
+ # pt_cintil:
213
+ # reason: "Restrictive license prohibits redistribution in derived formats"
214
+ # license: "CC BY-NC-SA 4.0"
215
+ ```
216
+
217
+ **Why block treebanks?** Some treebanks have licenses (e.g., CC BY-NC-SA) that prohibit redistribution in modified formats like Parquet.
218
+
219
+ **See also:** [tools/BLOCKED_TREEBANKS.md](tools/BLOCKED_TREEBANKS.md)
220
 
221
+ ### 11. Generate Parquet Files
222
+
223
+ ```bash
224
+ # Test with 3 treebanks first
225
+ uv run ./04_generate_parquet.py --test
226
+
227
+ # If successful, generate all treebanks (takes 2-4 hours)
228
+ uv run ./04_generate_parquet.py
229
 
230
  # This creates:
231
+ # - ../parquet/{treebank_name}/{split}.parquet for all treebanks
232
 
233
  # Verify output
234
  du -sh ../parquet/
235
  # Should be ~50-80 GB total
236
  ```
237
 
238
+ **What does this do?** Wrapper script that calls the `ud-hf-parquet-tools` library to convert CoNLL-U files to Parquet format.
239
+
240
+ **Options:**
241
+ - `--test`: Generate only 3 treebanks (quick test)
242
+ - `--overwrite`: Regenerate existing files
243
+ - `--blocked-treebanks`: Path to YAML file with blocked treebanks
244
+
245
+ **Expected time:** 2-4 hours for all treebanks
246
+
247
+ ### 12. Validate Parquet Files
248
+
249
  ```bash
250
+ # Test validation on 3 treebanks
251
+ uv run ./05_validate_parquet.py --local --test
252
+
253
+ # Full validation (optional, takes ~30-60 minutes)
254
+ uv run ./05_validate_parquet.py --local --mode text -vv > /tmp/parquet-check.log
255
+
256
+ # Check for errors (excluding metadata comments)
257
+ grep -E " [+-]" /tmp/parquet-check.log | grep -vE " [+-]#"
258
  ```
259
 
260
+ **What does this do?** Compares Parquet output to original CoNLL-U to verify 100% data fidelity.
261
+
262
+ **Expected output:** No differences except in comment metadata (which may vary slightly).
263
+
264
+ ### 13. Copy Files to Repository Root
265
 
266
  ```bash
 
267
  cd .. # Back to repository root
268
 
269
+ # Copy generated files
270
+ cp tools/README-${NEW_VER} README.md
271
+ cp tools/metadata-${NEW_VER}.json metadata.json
272
 
273
  # Verify files are in place
274
+ ls -lh README.md metadata.json parquet/
275
  ```
276
 
277
+ **Why copy to root?** HuggingFace Hub expects these files at the repository root for the dataset to work.
278
 
279
+ ### 14. Test Dataset Loading
 
 
 
 
 
280
 
281
+ Test that the dataset loads correctly:
 
 
 
 
 
282
 
283
+ ```bash
284
  # Test loading from Parquet
285
  python -c "
286
  from datasets import load_dataset
287
 
288
+ # Test a small treebank
289
+ ds = load_dataset('parquet', data_files='parquet/en_pronouns/test.parquet')
290
+ print(f'Loaded {len(ds[\"train\"])} examples')
291
+ print(f'Features: {list(ds[\"train\"].features.keys())}')
292
+ print(f'MWT field present: {\"mwt\" in ds[\"train\"].features}')
293
  "
294
  ```
295
 
296
+ **Expected output:**
297
+ ```
298
+ Loaded X examples
299
+ Features: ['sent_id', 'text', 'comments', 'tokens', 'lemmas', 'upos', 'xpos', 'feats', 'head', 'deprel', 'deps', 'misc', 'mwt', 'empty_nodes']
300
+ MWT field present: True
301
+ ```
302
+
303
+ ### 15. Commit Changes to Git
304
 
305
  ```bash
306
  # Add generated files
307
+ git add README.md metadata.json parquet/
308
+ git add tools/metadata-${NEW_VER}.json
309
+ git add tools/README-${NEW_VER}
310
+ git add tools/etc/citation-${NEW_VER}
311
+ git add tools/etc/description-${NEW_VER}
312
+ git add tools/etc/codes_and_flags-${NEW_VER}.yaml
 
 
 
313
  git add tools/.env
314
 
315
  # Commit with descriptive message
316
+ git commit -m "Add UD ${NEW_VER} data with Parquet format
317
 
318
+ - Generated from Universal Dependencies ${NEW_VER} release
319
  - 339+ treebanks across 186+ languages
320
+ - Parquet format for efficient loading (datasets >=4.0.0)
321
+ - Blocked treebanks excluded per license restrictions
322
+ - Helper functions available in ud-hf-parquet-tools library
323
 
324
  Generated files:
 
325
  - README.md (dataset card)
326
  - metadata.json (treebank metadata)
327
+ - parquet/ directory with all treebank splits"
328
 
329
  # Tag the commit
330
+ git tag -a ud${NEW_VER} -m "Universal Dependencies ${NEW_VER} release"
331
 
332
  # Push branch and tags
333
+ git push origin ${NEW_VER}
334
  git push origin --tags
335
  ```
336
 
337
+ ### 16. Upload to HuggingFace Hub
338
 
339
+ **Option A: Using git-lfs (Recommended)**
340
+
341
+ If you've cloned the HuggingFace repository with git-lfs:
342
 
343
+ ```bash
344
+ # Add HF Hub as remote (if not already)
345
+ git remote add hf https://huggingface.co/datasets/commul/universal_dependencies
346
 
347
+ # Push to HuggingFace
348
+ git push hf ${NEW_VER}
349
+ git push hf --tags
 
 
350
  ```
351
 
352
+ **Option B: Using huggingface-cli**
353
 
354
  ```bash
355
+ # Upload entire directory
356
+ huggingface-cli upload commul/universal_dependencies . --repo-type dataset --revision ${NEW_VER}
 
 
 
 
 
 
 
 
 
 
 
 
 
 
357
  ```
358
 
359
  **Expected upload time:** 2-6 hours depending on network speed and HuggingFace server load.
360
 
361
+ **Tip:** Run uploads during off-peak hours for better performance.
362
+
363
+ ### 17. Verify on HuggingFace Hub
364
 
365
  Visit: https://huggingface.co/datasets/commul/universal_dependencies
366
 
367
+ **Checklist:**
368
+ 1. ✅ Branch `2.18` exists in the "Branches" dropdown
369
+ 2. ✅ Files are present:
370
+ - `README.md` (dataset card)
371
  - `metadata.json`
372
  - `parquet/` directory with subdirectories
373
+ 3. ✅ Dataset card displays correctly
374
+ 4. ✅ Files section shows parquet files
375
 
376
+ **Test loading:**
377
+ ```python
378
+ from datasets import load_dataset
379
 
380
+ # Load from new version
381
+ ds = load_dataset("commul/universal_dependencies", "en_ewt", revision="2.18")
382
+ print(ds)
383
+ ```
384
+
385
+ **Expected output:**
386
+ ```
387
+ DatasetDict({
388
+ train: Dataset({
389
+ features: ['sent_id', 'text', 'comments', 'tokens', 'lemmas', ...],
390
+ num_rows: 12544
391
+ })
392
+ dev: Dataset({...})
393
+ test: Dataset({...})
394
+ })
395
+ ```
396
+
397
+ ### 18. Update Main Branch (Optional)
398
 
399
+ If this is now the latest version:
400
 
401
+ ```bash
402
+ git checkout main
403
+ git merge ${NEW_VER}
404
+ git push origin main
405
+ ```
406
 
407
+ This makes the new version the default when users don't specify a revision.
 
 
408
 
409
  ## Troubleshooting
410
 
 
418
  git submodule foreach 'git fetch --tags && (git checkout r2.18 || git checkout main) && touch .tag-r2.18'
419
  ```
420
 
421
+ This falls back to `main` branch for repositories without the tag.
422
+
423
  ### Issue: Metadata extraction fails for a treebank
424
 
425
  **Problem:** A treebank is malformed or missing expected files.
426
 
427
+ **Symptoms:**
428
+ ```
429
+ ITEM DELETED - no summary: UD_Language-Treebank
430
+ ITEM DELETED - no files : UD_Language-Treebank
431
+ ITEM DELETED - no license: UD_Language-Treebank
432
+ ```
433
+
434
  **Solution:**
435
+ 1. Check the specific treebank in `tools/UD_repos/UD_{Language}-{Treebank}/`
436
+ 2. Verify it has `.conllu` files and `stats.xml`
437
+ 3. Check if README has required metadata
438
+ 4. If persistently broken, report to UD project
439
+ 5. Treebank will be automatically excluded from dataset
440
 
441
  ### Issue: Parquet generation fails for a treebank
442
 
 
444
 
445
  **Solution:**
446
  ```bash
447
+ # Isolate the problem by generating one treebank at a time
448
+ uv run ./04_generate_parquet.py --treebanks "en_ewt"
449
+
450
+ # Check error message for details
451
+ # Common issues:
452
+ # - Malformed CoNLL-U syntax
453
+ # - Encoding problems
454
+ # - Invalid character in fields
455
  ```
456
 
457
+ **Report issues:** See [CONLLU_PARSING.md](https://github.com/bot-zen/ud-hf-parquet-tools/blob/main/CONLLU_PARSING.md) in ud-hf-parquet-tools for known parsing edge cases.
458
+
459
  ### Issue: HuggingFace upload is very slow
460
 
461
  **Problem:** Large Parquet files + network latency.
462
 
463
  **Solution:**
464
  - Use a machine with better network connection
465
+ - Upload during off-peak hours (e.g., nighttime UTC)
466
+ - Consider parallel uploads if using huggingface-cli
467
+
468
+ ### Issue: Out of disk space
469
+
470
+ **Problem:** Parquet files take ~50-80 GB.
471
+
472
+ **Solution:**
473
+ - Ensure you have at least 100 GB free space
474
+ - Generate Parquet files on a machine with larger disk
475
+ - Clean up old UD versions: `rm -rf tools/UD_repos/` after uploading
476
+
477
+ ### Issue: Script dependencies not found
478
+
479
+ **Problem:** ImportError or ModuleNotFoundError.
480
+
481
+ **Solution:**
482
+ ```bash
483
+ # Install required packages
484
+ pip install ud-hf-parquet-tools pyyaml python-dotenv jinja2
485
+
486
+ # Or use uv to manage dependencies automatically
487
+ uv run --script ./script.py
488
+ ```
489
 
490
  ## Checklist
491
 
492
  Before marking the release as complete:
493
 
494
+ - [ ] `.env` file updated with new version
495
+ - [ ] Metadata files generated (`citation-{VER}`, `description-{VER}`, `codes_and_flags-{VER}.yaml`)
496
+ - [ ] All UD repositories fetched and checked out to correct tag
497
+ - [ ] `metadata-{VER}.json` generated with blocked treebank info
498
+ - [ ] `README-{VER}` generated
499
+ - [ ] Parquet files generated for all non-blocked treebanks
500
+ - [ ] Parquet files validated (spot check)
501
+ - [ ] Files copied to repository root (`README.md`, `metadata.json`, `parquet/`)
502
+ - [ ] Tested loading from Parquet files
503
+ - [ ] Committed to git with descriptive message
504
+ - [ ] Tagged with `ud{VER}`
505
  - [ ] Pushed to origin
506
  - [ ] Uploaded to HuggingFace Hub
507
+ - [ ] Verified dataset loads from HF Hub
508
+ - [ ] (Optional) Updated main branch if latest version
509
 
510
  ## Timeline Estimate
511
 
512
+ | Step | Time | Notes |
+ |------|------|-------|
+ | 1-5: Setup & metadata | 10-15 min | Manual edits required |
+ | 6-7: Fetch repositories | 30-60 min | Network-dependent |
+ | 8-9: Generate metadata/README | 5-10 min | Fast |
+ | 10-12: Generate & validate Parquet | 2-4 hours | CPU-intensive |
+ | 13-15: Commit to git | 10-15 min | Fast |
+ | 16: Upload to HF Hub | 2-6 hours | Network-dependent |
+ | 17-18: Verify & update | 10-20 min | Fast |
+ | **Total** | **~5-11 hours** | Can parallelize some steps |
+
+ **Recommendation:** Start the process in the morning; the long-running steps (repository fetch, Parquet generation, upload) can run unattended.

  ## Notes
+ - **No Python script loader:** v2.0 architecture uses Parquet files only (no `universal_dependencies.py`)
+ - **Helper functions external:** CoNLL-U utilities available in the `ud-hf-parquet-tools` library
+ - **Blocked treebanks:** Some treebanks excluded due to license restrictions (see `blocked_treebanks.yaml`)
+ - **Branch independence:** Each UD version branch is self-contained
+ - **Version pinning:** Users can load specific versions via `revision="2.18"`
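To illustrate version pinning, here is a minimal sketch of loading a pinned release with the `datasets` library. The config name `en_ewt` is a hypothetical example, and the actual `load_dataset` call is left commented out because it requires network access to the Hub:

```python
# Sketch: pin a specific UD release when loading from the HF Hub.
# "en_ewt" is a hypothetical config name; adjust to the treebank you need.
kwargs = {
    "path": "commul/universal_dependencies",
    "name": "en_ewt",
    "revision": "2.18",  # git tag/branch of the UD release
}

# Requires network access to the HuggingFace Hub:
# from datasets import load_dataset
# ds = load_dataset(**kwargs)

print(kwargs["revision"])
```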
+
+ ## Reference Documentation
+
+ - **Quick reference:** [tools/README.md](tools/README.md) - Script documentation and common operations
+ - **Blocked treebanks:** [tools/BLOCKED_TREEBANKS.md](tools/BLOCKED_TREEBANKS.md) - License restrictions
+ - **Migration guide:** [MIGRATION.md](MIGRATION.md) - v1.x to v2.0 migration
+ - **Parquet tools:** https://github.com/bot-zen/ud-hf-parquet-tools - External library for CoNLL-U processing
+ - **CoNLL-U parsing:** https://github.com/bot-zen/ud-hf-parquet-tools/blob/main/CONLLU_PARSING.md - Parsing edge cases
 
 ## Support

 For issues:
+ - **UD data issues** → [UD GitHub Issues](https://github.com/UniversalDependencies/docs/issues)
+ - **Tooling issues** → your repository issues or [ud-hf-parquet-tools issues](https://github.com/bot-zen/ud-hf-parquet-tools/issues)
+ - **HuggingFace Hub issues** → [HuggingFace Community Forums](https://discuss.huggingface.co/)
tools/README.md CHANGED
@@ -1,129 +1,185 @@
- # README
-
- ## Add a new version
-
- Use `NEW_VER` as the placeholder in the steps below (for example `2.17`). Keep
- commands idempotent by running scripts with the `-o/--override` flag whenever a
- file already exists.
-
- 1. **Start new branch ${NEW_VER}**
-    ```
-    git checkout -b ${NEW_VER}
-    ```
-
- 2. **Add PID info**
-    - Edit `.env` and set `UD_VER=NEW_VER`:
-      ```
-      NEW_VER=2.17
-      echo "UD_VER=${NEW_VER}" > .env
-      ```
-    - In `00_fetch_ud_clarin-dspace_metadata.py`, add a new entry to the
-      `url_postfixes` dict, e.g.
-      ```
-      "2.17": "11234/1-6000",  # <treebanks/languages>, released <date>
-      ```
-    - Fetch the metadata artefacts:
-      ```
-      ./00_fetch_ud_clarin-dspace_metadata.py -o
-      ```
-
- 3. **Add/update docs-commit mapping**
-    - In `00_fetch_ud_codes_and_flags.sh`, append the git hash for the new
-      release to `VER_MAPPING`, e.g.
-      ```
-      VER_MAPPING["2.17"]="bea0c0e56c64cad41e60f479a8eca87dc571113f"
-      ```
-    - Refresh the codes/flags map and fetch the repositories list:
-      ```
-      ./00_fetch_ud_codes_and_flags.sh -o
-      ./01_fetch_ud_repos.sh
-      ```
-    - Commit the newly generated and changed files:
-      ```
-      git commit -m "Add PID,docs-commit info for new version" \
-        tools/00_fetch_ud_clarin-dspace_metadata.py tools/00_fetch_ud_codes_and_flags.sh
-
-      git add etc/codes_and_flags-${NEW_VER}.yaml
-      git commit -m"Add etc/codes_and_flags-${NEW_VER}.yaml" \
-        etc/codes_and_flags-${NEW_VER}.yaml
-
-      git commit -m"Bump to ${NEW_VER}: etc/codes_and_flags-latest.yaml" \
-        etc/codes_and_flags-latest.yaml
-      ```
-
- 4. **Create the NEW_VER repositories branch**
-    ```
-    cd UD_repos/  # change into the directory with all UD data
-    git checkout -b ${NEW_VER}
-    . .UD_submodules_add.commands  # update/initialise submodules
-    git add -u  # add new content (!check for new directories/content)
-    # FIXME:
-    git submodule foreach git status | grep -v .tag- | grep -v Entering | grep -v HEAD | grep -v Untracked | grep -v '<file>' | grep -v nothing | grep -v master
-    git commit -m "Bump to ${NEW_VER}"
-    cd ..
-    ```
-
- 5. **Regenerate metadata and README**
-    ```
-    ./02_generate_metadata.py -o
-    ./03_generate_README.py -o
-    ```
-    Review the generated files (`README-${NEW_VER}`, `metadata-${NEW_VER}.json`)
-    before committing.
-
- 6. **Generate and check parquet files**
-    (Update blocked_treebanks.yaml - if necessary. See BLOCKED_TREEBANKS.md)
-    ```
-    uv run tools/04_generate_parquet.py
-
-    uv run tools/05_validate_parquet.py --local --mode text -vv > /tmp/parquet-check.log
-    # Check for differences outside the metadata headers:
-    grep -E " [+-]" /tmp/parquet-check.log | grep -vE " [+-]#"
-    ```
-
- 7. **Commit and sync artefacts**
-    ```
-    git add README-${NEW_VER} metadata-${NEW_VER}.json
-    git commit -m "Add README-${NEW_VER} metadata-${NEW_VER}.json"
-
-    cp README-${NEW_VER} ../README.md
-    cp metadata-${NEW_VER}.json ../metadata.json
-    git add ../parquet
-    git commit -m "Bump to ${NEW_VER}: README.md metadata.json parquet/" ../README.md ../metadata.json
-
-    git checkout main
-    git merge ${NEW_VER}
-    git push origin ${NEW_VER}
-    git push origin main
-    ```
-    Adjust the final push commands if your workflow requires pull requests or
-    protected branches.
-
-
- ## Overview of files and directories
- 1. **00 (./etc/)**
-    - `00_fetch_ud_clarin-dspace_metadata.py`
-      Download `./etc/citation-VER` and `./etc/description-VER` UD metadata files for
-      the different releases from the lindat clarin-dspace repository.
-
-    - `00_fetch_ud_codes_and_flags.sh`
-      Download `./etc/codes_and_flags.yaml`, a yaml file mapping UD (long) language
-      names to metadata from UD `docs-automation` at GitHub.
-
- 2. **01**
-    - Identify all relevant UD repositories from GitHub:
-      * https://github.com/UniversalDependencies/
-      and generate a list of submodules to add to a (newly created) repository in the
-      subdirectory `./UD_Repos/`.
-
- 3. **02**
-    - Collect relevant metadata from local UD directories:
-      - extract the '# Summary' from the beginning and machine readable
-        metadata from the end of the README.{md,txt} file
-      - use the UD directory name for collecting metadata from the
-        codes_and_flags.yaml file
-      - collect {dev,train,test}.conllu files.
-
- 4. **03**
-    Use `./templates/README.tmpl` to generate:
-    - `README-${UD_VER}` (dataset card for HuggingFace)
+ # tools/ Directory
+
+ Quick reference for the Universal Dependencies dataset generation pipeline.
+
+ ## Quick Start: Add New UD Version
+
+ ```bash
+ # Set version
+ export NEW_VER=2.18
+ cd tools
+ echo "UD_VER=${NEW_VER}" > .env
+
+ # Fetch metadata
+ ./00_fetch_ud_clarin-dspace_metadata.py -o
+ ./00_fetch_ud_codes_and_flags.sh -o
+ ./01_fetch_ud_repos.sh
+
+ # Initialize repositories
+ cd UD_repos
+ git init
+ bash ../.UD_submodules_add.commands
+ git submodule foreach "git fetch --tags && git checkout r${NEW_VER} && touch .tag-r${NEW_VER}"
+ cd ..
+
+ # Generate files
+ ./02_generate_metadata.py -o
+ ./03_generate_README.py -o
+ uv run ./04_generate_parquet.py
+ uv run ./05_validate_parquet.py --local --test
+
+ # Sync to root
+ cp README-${NEW_VER} ../README.md
+ cp metadata-${NEW_VER}.json ../metadata.json
+ git add ../parquet ../README.md ../metadata.json
+ ```
+
+ **For detailed instructions, see:** [ADDING_NEW_UD_VERSION.md](../ADDING_NEW_UD_VERSION.md)
+
+ ## Pipeline Scripts
+
+ ### 00 - Fetch Metadata
+
+ **00_fetch_ud_clarin-dspace_metadata.py**
+ - Downloads citation and description from the LINDAT/CLARIN repository
+ - Creates: `etc/citation-{VER}`, `etc/description-{VER}`
+ - Update the `url_postfixes` dict for new versions
+
+ **00_fetch_ud_codes_and_flags.sh**
+ - Downloads language codes and flags from UD docs-automation
+ - Creates: `etc/codes_and_flags-{VER}.yaml`, `etc/codes_and_flags-latest.yaml`
+ - Update `VER_MAPPING` for new versions
+
+ ### 01 - Fetch Repositories
+
+ **01_fetch_ud_repos.sh**
+ - Discovers all UD repositories on GitHub
+ - Creates: `.UD_submodules_add.commands`
+ - Run the generated commands in the `UD_repos/` directory to fetch all treebanks
+
+ ### 02 - Generate Metadata
+
+ **02_generate_metadata.py**
+ - Extracts metadata from local UD directories
+ - Collects: summaries, licenses, splits, statistics, blocked status
+ - Creates: `metadata-{VER}.json`
+ - Reads: `blocked_treebanks.yaml` for license restrictions
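As a rough sketch of the extraction step (not the actual `02_generate_metadata.py` implementation), UD READMEs end with a machine-readable `key: value` block; the exact delimiter line below is an assumption:

```python
# Sketch: pull "key: value" pairs from the machine-readable metadata
# block at the end of a UD README (delimiter line is an assumption).
def parse_readme_metadata(text):
    meta, in_block = {}, False
    for line in text.splitlines():
        if line.startswith("=== Machine-readable metadata"):
            in_block = True
            continue
        if in_block and ": " in line:
            key, _, value = line.partition(": ")
            meta[key.strip()] = value.strip()
    return meta

sample = """# Summary
UD_Demo-Test is a toy treebank.

=== Machine-readable metadata (DO NOT REMOVE!) ===
Data available since: UD v2.0
License: CC BY-SA 4.0
Contributors: Doe, Jane
"""
print(parse_readme_metadata(sample)["License"])  # → CC BY-SA 4.0
```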
+
+ ### 03 - Generate README
+
+ **03_generate_README.py**
+ - Renders a Jinja2 template with the collected metadata
+ - Creates: `README-{VER}` (HuggingFace dataset card)
+ - Uses: `templates/README.tmpl`, `metadata-{VER}.json`, `etc/citation-{VER}`, `etc/description-{VER}`
+
+ ### 04 - Generate Parquet
+
+ **04_generate_parquet.py**
+ - Wrapper script calling the `ud-hf-parquet-tools` library
+ - Converts CoNLL-U files to Parquet format
+ - Creates: `../parquet/{treebank}/{split}.parquet`
+ - Options: `--test`, `--overwrite`, `--blocked-treebanks`
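For intuition about what the conversion handles, a minimal sketch of splitting CoNLL-U text into sentences of token rows; the real conversion lives in `ud-hf-parquet-tools` and covers many more edge cases:

```python
# Sketch: CoNLL-U text -> list of sentences, each a list of token dicts.
# Only a few of the ten CoNLL-U columns are kept here for illustration.
def conllu_sentences(text):
    sentences, tokens = [], []
    for line in text.splitlines():
        if not line.strip():  # blank line ends a sentence
            if tokens:
                sentences.append(tokens)
                tokens = []
        elif not line.startswith("#"):  # skip comment/metadata lines
            cols = line.split("\t")
            tokens.append({"id": cols[0], "form": cols[1], "upos": cols[3]})
    if tokens:
        sentences.append(tokens)
    return sentences

sample = "# sent_id = 1\n1\tHello\thello\tINTJ\t_\t_\t0\troot\t_\t_\n\n"
print(conllu_sentences(sample)[0][0]["form"])  # → Hello
```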
+
+ ### 05 - Validate Parquet
+
+ **05_validate_parquet.py**
+ - Wrapper script calling the `ud-hf-parquet-tools` library
+ - Validates Parquet files against the CoNLL-U source
+ - Options: `--local`, `--test`, `--mode text`
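Conceptually, text-mode validation amounts to diffing reconstructed text against the source; the actual checks live in `ud-hf-parquet-tools`, so this stdlib sketch with stand-in strings is only illustrative:

```python
import difflib

# Hypothetical stand-ins: CoNLL-U source text vs. text rebuilt from Parquet.
source_text = "# sent_id = 1\n1\tHello\n2\tworld\n"
rebuilt_text = "# sent_id = 1\n1\tHello\n2\tworld\n"

# An empty diff means the round-trip preserved the text exactly.
diff = list(
    difflib.unified_diff(
        source_text.splitlines(), rebuilt_text.splitlines(), lineterm=""
    )
)
print("identical" if not diff else "\n".join(diff))  # → identical
```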
+
+ ## Directory Structure
+
+ ```
+ tools/
+ ├── 00_fetch_ud_clarin-dspace_metadata.py  # Fetch citation/description
+ ├── 00_fetch_ud_codes_and_flags.sh         # Fetch language metadata
+ ├── 01_fetch_ud_repos.sh                   # Discover UD repos
+ ├── 02_generate_metadata.py                # Extract treebank metadata
+ ├── 03_generate_README.py                  # Generate dataset card
+ ├── 04_generate_parquet.py                 # Generate Parquet files
+ ├── 05_validate_parquet.py                 # Validate Parquet files
+ ├── blocked_treebanks.yaml                 # License-restricted treebanks
+ ├── .UD_submodules_add.commands            # Generated submodule commands
+ ├── etc/
+ │   ├── citation-{VER}                     # Generated citations
+ │   ├── description-{VER}                  # Generated descriptions
+ │   └── codes_and_flags-{VER}.yaml         # Language metadata
+ ├── templates/
+ │   └── README.tmpl                        # Jinja2 template for dataset card
+ ├── metadata-{VER}.json                    # Generated metadata (output)
+ ├── README-{VER}                           # Generated dataset card (output)
+ └── UD_repos/                              # Git submodules (created by 01)
+     └── UD_{Language}-{Treebank}/          # Individual treebanks
+ ```
+
+ ## Configuration
+
+ **Environment Variables (.env)**
+ ```bash
+ UD_VER=2.17   # Current UD version
+ ```
+
+ **Blocked Treebanks (blocked_treebanks.yaml)**
+ ```yaml
+ pt_cintil:
+   reason: "Restrictive license prohibits redistribution in derived formats"
+   license: "CC BY-NC-SA 4.0"
+   url: "https://github.com/UniversalDependencies/UD_Portuguese-CINTIL"
+ ```
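A sketch of how such a file can drive filtering; the dict below hardcodes the YAML content above instead of loading it with PyYAML, and `should_skip` is a hypothetical helper, not part of the pipeline:

```python
# Hardcoded mirror of blocked_treebanks.yaml (the real pipeline would
# load the file with PyYAML instead).
blocked = {
    "pt_cintil": {
        "reason": "Restrictive license prohibits redistribution in derived formats",
        "license": "CC BY-NC-SA 4.0",
    },
}

def should_skip(treebank_id):
    """Return True if a treebank must be excluded from Parquet generation."""
    return treebank_id in blocked

print(should_skip("pt_cintil"), should_skip("en_ewt"))  # → True False
```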
+
+ ## Dependencies
+
+ - Python 3.12+
+ - `uv` (for running scripts with dependencies)
+ - `ud-hf-parquet-tools` (for Parquet generation/validation)
+ - Git with submodule support
+
+ Install dependencies:
+ ```bash
+ pip install ud-hf-parquet-tools pyyaml python-dotenv jinja2
+ ```
+
+ ## Common Operations
+
+ **Test on a subset:**
+ ```bash
+ uv run ./04_generate_parquet.py --test   # Tests 3 treebanks
+ uv run ./05_validate_parquet.py --test   # Validates 3 treebanks
+ ```
+
+ **Force regeneration:**
+ ```bash
+ ./02_generate_metadata.py -o             # Override existing
+ ./03_generate_README.py -o
+ uv run ./04_generate_parquet.py --overwrite
+ ```
+
+ **Skip blocked treebanks:**
+ ```bash
+ uv run ./04_generate_parquet.py --blocked-treebanks blocked_treebanks.yaml
+ ```
+
+ ## Troubleshooting
+
+ **Script not found:**
+ - Scripts use a `#!/usr/bin/env -S uv run --script` shebang
+ - Make them executable: `chmod +x *.py *.sh`
+ - Or run with: `uv run --script ./script.py`
+
+ **Missing dependencies:**
+ ```bash
+ pip install ud-hf-parquet-tools
+ ```
+
+ **Submodule checkout fails:**
+ ```bash
+ cd UD_repos
+ git submodule foreach 'git fetch --tags && (git checkout r2.17 || git checkout main)'
+ ```
+
+ ## Documentation
+
+ - **Comprehensive guide:** [ADDING_NEW_UD_VERSION.md](../ADDING_NEW_UD_VERSION.md)
+ - **Blocked treebanks:** [BLOCKED_TREEBANKS.md](BLOCKED_TREEBANKS.md)
+ - **Migration guide:** [MIGRATION.md](../MIGRATION.md)
+ - **Parquet tools:** https://github.com/bot-zen/ud-hf-parquet-tools