Disentangle tools/README.md and ADDING_NEW_UD_VERSION.md
**tools/README.md:**
- Rewrote as concise quick reference for experienced developers
- Focus on commands, not explanations
- Added "Quick Start" section with complete workflow
- Documented all pipeline scripts (00-05)
- Added directory structure diagram
- Included configuration examples
- Added common operations and troubleshooting
- References ADDING_NEW_UD_VERSION.md for detailed guide
**ADDING_NEW_UD_VERSION.md:**
- Maintained as comprehensive guide for new contributors
- Updated all outdated script references:
  - 02_traverse_ud_repos.py → 02_generate_metadata.py
  - 03_fill_universal_dependencies_tamplate.py → 03_generate_README.py
- Removed all references to universal_dependencies.py (no longer exists)
- Added explanations for each step ("What is...?", "Why...?")
- Kept troubleshooting section with detailed solutions
- Kept timeline estimates and checklist
- References tools/README.md for technical details
- Updated to reflect v2.0 architecture:
  - Parquet-only format (no Python script loader)
  - External ud-hf-parquet-tools library
  - Blocked treebanks handling
**Key improvements:**
- No duplication between files
- Clear separation of concerns (quick ref vs. guide)
- All references to current architecture
- Cross-references between documents
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
- ADDING_NEW_UD_VERSION.md +316 -175
- tools/README.md +185 -129

This guide explains how to add a new Universal Dependencies release (e.g., UD 2.18, 2.19, etc.) to the `commul/universal_dependencies` HuggingFace dataset.

**Quick reference:** See [tools/README.md](tools/README.md) for concise commands and script documentation.

## Prerequisites

- Git repository cloned and up to date
- Python 3.12+ with `uv` installed
- Dependencies installed:
```bash
pip install ud-hf-parquet-tools pyyaml python-dotenv jinja2
```
- Access to push to `commul/universal_dependencies` on HuggingFace Hub
- `huggingface-cli` installed and authenticated:

## Overview

Each UD version (2.7, 2.8, ..., 2.17, 2.18, ...) has its own git branch. The dataset uses Parquet format (v2.0 architecture) for all versions. When a new UD release is published, you create a new branch and run the generation pipeline.

**Architecture:**
- **No Python script loader**: Dataset uses Parquet files only (datasets >=4.0.0)
- **External tools**: Helper functions in separate `ud-hf-parquet-tools` library
- **Blocked treebanks**: Some treebanks excluded due to license restrictions

## Step-by-Step Guide

### 2. Create New Branch

```bash
# Ensure you're on the latest main branch
git checkout main
git pull origin main

git checkout -b 2.18
```

**Why branching?** Each UD version is maintained independently, allowing users to load specific versions via `revision="2.18"`.

### 3. Update Environment Configuration

```bash
cd tools

# Set the version number
export NEW_VER=2.18
echo "UD_VER=${NEW_VER}" > .env

# Verify
cat .env
# Output: UD_VER=2.18
```

**What is .env?** Environment file that all scripts read to determine which UD version to process.
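The version lookup the scripts perform can be mimicked in a few lines. A minimal stdlib-only sketch (the pipeline presumably loads the file via `python-dotenv`, which is in the install list; `read_ud_version` is an illustrative name, not a real pipeline function):

```python
from pathlib import Path

def read_ud_version(env_path: str = ".env") -> str:
    """Return UD_VER from a simple KEY=VALUE .env file."""
    for line in Path(env_path).read_text().splitlines():
        line = line.strip()
        if line.startswith("UD_VER="):
            return line.split("=", 1)[1]
    raise KeyError(f"UD_VER not set in {env_path}")

# Mirror step 3's `echo "UD_VER=2.18" > .env`
Path(".env").write_text("UD_VER=2.18\n")
print(read_ud_version())  # → 2.18
```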

### 4. Fetch Metadata from LINDAT/CLARIN

```bash
# Fetch citation and description
./00_fetch_ud_clarin-dspace_metadata.py -o

# This creates:
# - etc/citation-2.18
# - etc/description-2.18
```

**Before running**, update the script to add the new version's handle ID:

1. Open `00_fetch_ud_clarin-dspace_metadata.py`
2. Find the `url_postfixes` dictionary
3. Add entry for new version:
```python
"2.18": "11234/1-XXXX",  # Check UD website for correct handle
```

**Where to find handle?** Visit the [UD release page](https://universaldependencies.org/) and check the LINDAT citation link.

### 5. Fetch Language Codes and Flags

```bash
# Fetch language metadata
./00_fetch_ud_codes_and_flags.sh -o

# This creates:
# - etc/codes_and_flags-2.18.yaml
# - etc/codes_and_flags-latest.yaml (updated symlink)
```

**Before running**, update the script with the docs-automation commit hash:

1. Open `00_fetch_ud_codes_and_flags.sh`
2. Find the `VER_MAPPING` associative array
3. Add entry:
```bash
VER_MAPPING["2.18"]="<git-commit-hash-for-2.18>"
```

**How to find hash?** Check the [UD docs-automation releases](https://github.com/UniversalDependencies/docs/releases) for the commit tagged with the version.

### 6. Discover UD Repositories

```bash
# Generate list of all UD repositories
./01_fetch_ud_repos.sh

# This creates:
# - .UD_submodules_add.commands (list of git submodule add commands)
```

**What does this do?** Queries GitHub API for all repositories in the UniversalDependencies organization and generates commands to add them as submodules.

### 7. Fetch UD Repositories as Submodules

```bash
cd UD_repos

# Initialize git repository (if first time)
git init

# Add all UD repositories as submodules
bash ../.UD_submodules_add.commands

# Checkout the new release tag in all submodules
git submodule foreach "git fetch --tags && git checkout r${NEW_VER} && touch .tag-r${NEW_VER}"

# Create branch and commit
git checkout -b ${NEW_VER}
git add -A
git commit -m "Add UD ${NEW_VER} repositories"

cd ..
```

**Expected time:** 30-60 minutes depending on network speed.

**What are .tag-r{VER} files?** Marker files that `02_generate_metadata.py` checks to ensure a repository has the correct version tag.

**Troubleshooting:** If some repositories don't have the tag:
```bash
git submodule foreach "git fetch --tags && (git checkout r${NEW_VER} || git checkout main) && touch .tag-r${NEW_VER}"
```

### 8. Extract Metadata from Treebanks

```bash
# Generate metadata from all treebank directories
./02_generate_metadata.py -o

# This creates:
# - metadata-2.18.json (contains info for all treebanks)

# Verify the output
ls -lh metadata-${NEW_VER}.json
# Should be ~200-300 KB

# Quick check: count treebanks
python -c "import json; print(len(json.load(open('metadata-${NEW_VER}.json'))))"
# Should be 339+ (number increases with new treebanks)
```

**What does this script do?**
- Reads README files from each treebank
- Extracts summaries, licenses, genres
- Collects statistics from stats.xml
- Identifies available splits (train/dev/test)
- Checks `blocked_treebanks.yaml` for license restrictions
- Adds "blocked" property to metadata

**Expected time:** 5-10 minutes
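As a quick sanity check on the generated file, the per-treebank "blocked" property can be tallied. A sketch with an illustrative in-memory stand-in for `metadata-2.18.json` (real entries carry many more fields; the dict-of-treebanks shape is an assumption consistent with the count check above):

```python
# Stand-in for json.load(open("metadata-2.18.json")); illustrative only.
metadata = {
    "en_ewt":      {"blocked": False},
    "en_pronouns": {"blocked": False},
    "pt_cintil":   {"blocked": True},
}

blocked = sorted(tb for tb, info in metadata.items() if info.get("blocked"))
convertible = sorted(tb for tb, info in metadata.items() if not info.get("blocked"))
print(f"{len(convertible)} convertible, {len(blocked)} blocked: {blocked}")
# → 2 convertible, 1 blocked: ['pt_cintil']
```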

### 9. Generate Dataset Card (README)

```bash
# Generate HuggingFace dataset card
./03_generate_README.py -o

# This creates:
# - README-2.18 (dataset card for HuggingFace)

# Verify file was created
ls -lh README-${NEW_VER}
```

**What does this do?** Renders `templates/README.tmpl` with metadata, citation, and description to create the HuggingFace dataset card.
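The rendering idea can be shown in miniature. A hedged stdlib sketch using `string.Template` (the real script renders the Jinja2 template `templates/README.tmpl` with far more fields; the placeholder names here are invented):

```python
from string import Template

# Miniature of the dataset-card rendering idea; placeholder names invented.
card_tmpl = Template(
    "# Universal Dependencies $version\n"
    "Treebanks: $n_treebanks\n"
)
card = card_tmpl.substitute(version="2.18", n_treebanks=339)
print(card)
```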

### 10. Review Blocked Treebanks

Before generating Parquet files, review the blocked treebanks:

```bash
# Check blocked treebanks list
cat blocked_treebanks.yaml

# Example entry:
# pt_cintil:
#   reason: "Restrictive license prohibits redistribution in derived formats"
#   license: "CC BY-NC-SA 4.0"
```

**Why block treebanks?** Some treebanks have licenses (e.g., CC BY-NC-SA) that prohibit redistribution in modified formats like Parquet.

**See also:** [tools/BLOCKED_TREEBANKS.md](tools/BLOCKED_TREEBANKS.md)
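The blocked list can also be read programmatically, e.g., to pass to later steps. A stdlib-only sketch of parsing the two-level structure shown above (in practice you can simply use PyYAML, which is in the install list; the sample text mirrors the example entry):

```python
# Parse the simple two-level blocked_treebanks.yaml structure shown above.
yaml_text = """\
pt_cintil:
  reason: "Restrictive license prohibits redistribution in derived formats"
  license: "CC BY-NC-SA 4.0"
"""

blocked = {}
current = None
for line in yaml_text.splitlines():
    if not line.strip() or line.lstrip().startswith("#"):
        continue
    if not line.startswith(" "):              # top-level key: a treebank id
        current = line.rstrip(":")
        blocked[current] = {}
    else:                                     # indented "field: value" pair
        key, _, value = line.strip().partition(": ")
        blocked[current][key] = value.strip('"')

print(sorted(blocked))                   # → ['pt_cintil']
print(blocked["pt_cintil"]["license"])   # → CC BY-NC-SA 4.0
```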

### 11. Generate Parquet Files

```bash
# Test with 3 treebanks first
uv run ./04_generate_parquet.py --test

# If successful, generate all treebanks (takes 2-4 hours)
uv run ./04_generate_parquet.py

# This creates:
# - ../parquet/{treebank_name}/{split}.parquet for all treebanks

# Verify output
du -sh ../parquet/
# Should be ~50-80 GB total
```

**What does this do?** Wrapper script that calls the `ud-hf-parquet-tools` library to convert CoNLL-U files to Parquet format.

**Options:**
- `--test`: Generate only 3 treebanks (quick test)
- `--overwrite`: Regenerate existing files
- `--blocked-treebanks`: Path to YAML file with blocked treebanks

**Expected time:** 2-4 hours for all treebanks
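The `parquet/{treebank_name}/{split}.parquet` layout makes it easy to compute which files a run should have produced, e.g., to spot missing ones afterwards. A sketch (`expected_parquet_files` is an illustrative helper; split availability really comes from the generated metadata):

```python
from pathlib import Path

def expected_parquet_files(treebank_splits: dict, root: str = "parquet"):
    """List the parquet/{treebank}/{split}.parquet paths a run should produce."""
    return sorted(
        Path(root) / tb / f"{split}.parquet"
        for tb, splits in treebank_splits.items()
        for split in splits
    )

# Illustrative input; real split info comes from metadata-{VER}.json.
splits = {"en_ewt": ["train", "dev", "test"], "en_pronouns": ["test"]}
for p in expected_parquet_files(splits):
    print(p.as_posix())
# → parquet/en_ewt/dev.parquet ... parquet/en_pronouns/test.parquet
```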

### 12. Validate Parquet Files

```bash
# Test validation on 3 treebanks
uv run ./05_validate_parquet.py --local --test

# Full validation (optional, takes ~30-60 minutes)
uv run ./05_validate_parquet.py --local --mode text -vv > /tmp/parquet-check.log

# Check for errors (excluding metadata comments)
grep -E " [+-]" /tmp/parquet-check.log | grep -vE " [+-]#"
```

**What does this do?** Compares Parquet output to original CoNLL-U to verify 100% data fidelity.

**Expected output:** No differences except in comment metadata (which may vary slightly).
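The fidelity check boils down to a lossless round-trip: parse the CoNLL-U, serialize it back, and require identical bytes. A toy stdlib illustration of that idea (the real comparison is done by `ud-hf-parquet-tools` over full treebanks, via Parquet):

```python
# Toy round-trip: split token lines into their 10 tab-separated CoNLL-U
# fields and serialize them back; a faithful converter must be lossless.
sentence = (
    "1\tSue\tSue\tPROPN\tNNP\tNumber=Sing\t2\tnsubj\t_\t_\n"
    "2\tleft\tleave\tVERB\tVBD\tTense=Past\t0\troot\t_\t_\n"
)

def parse(conllu: str):
    return [line.split("\t") for line in conllu.splitlines()
            if line and not line.startswith("#")]

def serialize(rows) -> str:
    return "".join("\t".join(row) + "\n" for row in rows)

assert serialize(parse(sentence)) == sentence  # lossless round-trip
print("round-trip OK")
```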

### 13. Copy Files to Repository Root

```bash
cd ..  # Back to repository root

# Copy generated files
cp tools/README-${NEW_VER} README.md
cp tools/metadata-${NEW_VER}.json metadata.json

# Verify files are in place
ls -lh README.md metadata.json parquet/
```

**Why copy to root?** HuggingFace Hub expects these files at the repository root for the dataset to work.

### 14. Test Dataset Loading

Test that the dataset loads correctly:

```bash
# Test loading from Parquet
python -c "
from datasets import load_dataset

# Test a small treebank
ds = load_dataset('parquet', data_files='parquet/en_pronouns/test.parquet')
print(f'Loaded {len(ds[\"train\"])} examples')
print(f'Features: {list(ds[\"train\"].features.keys())}')
print(f'MWT field present: {\"mwt\" in ds[\"train\"].features}')
"
```

**Expected output:**
```
Loaded X examples
Features: ['sent_id', 'text', 'comments', 'tokens', 'lemmas', 'upos', 'xpos', 'feats', 'head', 'deprel', 'deps', 'misc', 'mwt', 'empty_nodes']
MWT field present: True
```

### 15. Commit Changes to Git

```bash
# Add generated files
git add README.md metadata.json parquet/
git add tools/metadata-${NEW_VER}.json
git add tools/README-${NEW_VER}
git add tools/etc/citation-${NEW_VER}
git add tools/etc/description-${NEW_VER}
git add tools/etc/codes_and_flags-${NEW_VER}.yaml
git add tools/.env

# Commit with descriptive message
git commit -m "Add UD ${NEW_VER} data with Parquet format

- Generated from Universal Dependencies ${NEW_VER} release
- 339+ treebanks across 186+ languages
- Parquet format for efficient loading (datasets >=4.0.0)
- Blocked treebanks excluded per license restrictions
- Helper functions available in ud-hf-parquet-tools library

Generated files:
- README.md (dataset card)
- metadata.json (treebank metadata)
- parquet/ directory with all treebank splits"

# Tag the commit
git tag -a ud${NEW_VER} -m "Universal Dependencies ${NEW_VER} release"

# Push branch and tags
git push origin ${NEW_VER}
git push origin --tags
```

### 16. Upload to HuggingFace Hub

**Option A: Using git-lfs (Recommended)**

If you've cloned the HuggingFace repository with git-lfs:

```bash
# Add HF Hub as remote (if not already)
git remote add hf https://huggingface.co/datasets/commul/universal_dependencies

# Push to HuggingFace
git push hf ${NEW_VER}
git push hf --tags
```

**Option B: Using huggingface-cli**

```bash
# Upload entire directory
huggingface-cli upload commul/universal_dependencies . --repo-type dataset --revision ${NEW_VER}
```

**Expected upload time:** 2-6 hours depending on network speed and HuggingFace server load.

**Tip:** Run uploads during off-peak hours for better performance.

### 17. Verify on HuggingFace Hub

Visit: https://huggingface.co/datasets/commul/universal_dependencies

**Checklist:**
1. ✅ Branch `2.18` exists in the "Branches" dropdown
2. ✅ Files are present:
   - `README.md` (dataset card)
   - `metadata.json`
   - `parquet/` directory with subdirectories
3. ✅ Dataset card displays correctly
4. ✅ Files section shows parquet files

**Test loading:**
```python
from datasets import load_dataset

# Load from new version
ds = load_dataset("commul/universal_dependencies", "en_ewt", revision="2.18")
print(ds)
```

**Expected output:**
```
DatasetDict({
    train: Dataset({
        features: ['sent_id', 'text', 'comments', 'tokens', 'lemmas', ...],
        num_rows: 12544
    })
    dev: Dataset({...})
    test: Dataset({...})
})
```

### 18. Update Main Branch (Optional)

If this is now the latest version:

```bash
git checkout main
git merge ${NEW_VER}
git push origin main
```

This makes the new version the default when users don't specify a revision.

## Troubleshooting

```bash
cd tools/UD_repos
git submodule foreach 'git fetch --tags && (git checkout r2.18 || git checkout main) && touch .tag-r2.18'
```

This falls back to `main` branch for repositories without the tag.

### Issue: Metadata extraction fails for a treebank

**Problem:** A treebank is malformed or missing expected files.

**Symptoms:**
```
ITEM DELETED - no summary: UD_Language-Treebank
ITEM DELETED - no files : UD_Language-Treebank
ITEM DELETED - no license: UD_Language-Treebank
```

**Solution:**
1. Check the specific treebank in `tools/UD_repos/UD_{Language}-{Treebank}/`
2. Verify it has `.conllu` files and `stats.xml`
3. Check if README has required metadata
4. If persistently broken, report to UD project
5. Treebank will be automatically excluded from dataset

### Issue: Parquet generation fails for a treebank

**Solution:**
```bash
# Isolate the problem by generating one treebank at a time
uv run ./04_generate_parquet.py --treebanks "en_ewt"

# Check error message for details
# Common issues:
# - Malformed CoNLL-U syntax
# - Encoding problems
# - Invalid character in fields
```

**Report issues:** See [CONLLU_PARSING.md](https://github.com/bot-zen/ud-hf-parquet-tools/blob/main/CONLLU_PARSING.md) in ud-hf-parquet-tools for known parsing edge cases.

### Issue: HuggingFace upload is very slow

**Problem:** Large Parquet files + network latency.

**Solution:**
- Use a machine with better network connection
- Upload during off-peak hours (e.g., nighttime UTC)
- Consider parallel uploads if using huggingface-cli

### Issue: Out of disk space

**Problem:** Parquet files take ~50-80 GB.

**Solution:**
- Ensure you have at least 100 GB free space
- Generate Parquet files on a machine with larger disk
- Clean up old UD versions: `rm -rf tools/UD_repos/` after uploading
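A pre-flight check for this is easy to script. A stdlib sketch (the 100 GB threshold mirrors the guidance above; `enough_space` is an illustrative name):

```python
import shutil

def enough_space(path: str = ".", required_gb: float = 100.0) -> bool:
    """True if the filesystem holding `path` has at least `required_gb` free."""
    free_gb = shutil.disk_usage(path).free / 1e9
    return free_gb >= required_gb

print(enough_space("."))  # True only with >= 100 GB free on this disk
```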

### Issue: Script dependencies not found

**Problem:** ImportError or ModuleNotFoundError.

**Solution:**
```bash
# Install required packages
pip install ud-hf-parquet-tools pyyaml python-dotenv jinja2

# Or use uv to manage dependencies automatically
uv run --script ./script.py
```

## Checklist

Before marking the release as complete:

- [ ] `.env` file updated with new version
- [ ] Metadata files generated (`citation-{VER}`, `description-{VER}`, `codes_and_flags-{VER}.yaml`)
- [ ] All UD repositories fetched and checked out to correct tag
- [ ] `metadata-{VER}.json` generated with blocked treebank info
- [ ] `README-{VER}` generated
- [ ] Parquet files generated for all non-blocked treebanks
- [ ] Parquet files validated (spot check)
- [ ] Files copied to repository root (`README.md`, `metadata.json`, `parquet/`)
- [ ] Tested loading from Parquet files
- [ ] Committed to git with descriptive message
- [ ] Tagged with `ud{VER}`
- [ ] Pushed to origin
- [ ] Uploaded to HuggingFace Hub
- [ ] Verified dataset loads from HF Hub
- [ ] (Optional) Updated main branch if latest version

## Timeline Estimate

| Step | Time | Notes |
|------|------|-------|
| 1-5: Setup & metadata | 10-15 min | Manual edits required |
| 6-7: Fetch repositories | 30-60 min | Network-dependent |
| 8-9: Generate metadata/README | 5-10 min | Fast |
| 10-12: Generate & validate Parquet | 2-4 hours | CPU-intensive |
| 13-15: Commit to git | 10-15 min | Fast |
| 16: Upload to HF Hub | 2-6 hours | Network-dependent |
| 17-18: Verify & update | 10-20 min | Fast |
| **Total** | **~5-11 hours** | Can parallelize some steps |

**Recommendation:** Start the process in the morning. Long-running steps (repository fetch, Parquet generation, upload) can run unattended.

## Notes

- **No Python script loader:** v2.0 architecture uses Parquet files only (no `universal_dependencies.py`)
- **Helper functions external:** CoNLL-U utilities available in `ud-hf-parquet-tools` library
- **Blocked treebanks:** Some treebanks excluded due to license restrictions (see `blocked_treebanks.yaml`)
- **Branch independence:** Each UD version branch is self-contained
- **Version pinning:** Users can load specific versions via `revision="2.18"`

## Reference Documentation

- **Quick reference:** [tools/README.md](tools/README.md) - Script documentation and common operations
- **Blocked treebanks:** [tools/BLOCKED_TREEBANKS.md](tools/BLOCKED_TREEBANKS.md) - License restrictions
- **Migration guide:** [MIGRATION.md](MIGRATION.md) - v1.x to v2.0 migration
- **Parquet tools:** https://github.com/bot-zen/ud-hf-parquet-tools - External library for CoNLL-U processing
- **CoNLL-U parsing:** https://github.com/bot-zen/ud-hf-parquet-tools/blob/main/CONLLU_PARSING.md - Parsing edge cases

## Support

For issues:
- **UD data issues** → [UD GitHub Issues](https://github.com/UniversalDependencies/docs/issues)
- **Tooling issues** → Your repository issues or [ud-hf-parquet-tools issues](https://github.com/bot-zen/ud-hf-parquet-tools/issues)
- **HuggingFace Hub issues** → [HuggingFace Community Forums](https://discuss.huggingface.co/)
|
|
@@ -1,129 +1,185 @@
|
| 1 |
+
# tools/ Directory
|
| 2 |
+
|
| 3 |
+
Quick reference for the Universal Dependencies dataset generation pipeline.
|
| 4 |
+
|
| 5 |
+
## Quick Start: Add New UD Version
|
| 6 |
+
|
| 7 |
+
```bash
|
| 8 |
+
# Set version
|
| 9 |
+
export NEW_VER=2.18
|
| 10 |
+
cd tools
|
| 11 |
+
echo "UD_VER=${NEW_VER}" > .env
|
| 12 |
+
|
| 13 |
+
# Fetch metadata
|
| 14 |
+
./00_fetch_ud_clarin-dspace_metadata.py -o
|
| 15 |
+
./00_fetch_ud_codes_and_flags.sh -o
|
| 16 |
+
./01_fetch_ud_repos.sh
|
| 17 |
+
|
| 18 |
+
# Initialize repositories
|
| 19 |
+
cd UD_repos
|
| 20 |
+
git init
|
| 21 |
+
bash ../.UD_submodules_add.commands
|
| 22 |
+
git submodule foreach "git fetch --tags && git checkout r${NEW_VER} && touch .tag-r${NEW_VER}"
|
| 23 |
+
cd ..
|
| 24 |
+
|
| 25 |
+
# Generate files
|
| 26 |
+
./02_generate_metadata.py -o
|
| 27 |
+
./03_generate_README.py -o
|
| 28 |
+
uv run ./04_generate_parquet.py
|
| 29 |
+
uv run ./05_validate_parquet.py --local --test
|
| 30 |
+
|
| 31 |
+
# Sync to root
|
| 32 |
+
cp README-${NEW_VER} ../README.md
|
| 33 |
+
cp metadata-${NEW_VER}.json ../metadata.json
|
| 34 |
+
git add ../parquet ../README.md ../metadata.json
|
| 35 |
+
```
|
| 36 |
+
|
| 37 |
+
**For detailed instructions, see:** [ADDING_NEW_UD_VERSION.md](../ADDING_NEW_UD_VERSION.md)
|
| 38 |
+
|
| 39 |
+
## Pipeline Scripts
|
| 40 |
+
|
| 41 |
+
### 00 - Fetch Metadata
|
| 42 |
+
|
| 43 |
+
**00_fetch_ud_clarin-dspace_metadata.py**
|
| 44 |
+
- Downloads citation and description from LINDAT/CLARIN repository
|
| 45 |
+
- Creates: `etc/citation-{VER}`, `etc/description-{VER}`
|
| 46 |
+
- Update `url_postfixes` dict for new versions
|
| 47 |
+
|
| 48 |
+
**00_fetch_ud_codes_and_flags.sh**
|
| 49 |
+
- Downloads language codes and flags from UD docs-automation
|
| 50 |
+
- Creates: `etc/codes_and_flags-{VER}.yaml`, `etc/codes_and_flags-latest.yaml`
|
| 51 |
+
- Update `VER_MAPPING` for new versions
|
| 52 |
+
|
| 53 |
+
### 01 - Fetch Repositories
|
| 54 |
+
|
| 55 |
+
**01_fetch_ud_repos.sh**
|
| 56 |
+
- Discovers all UD repositories on GitHub
|
| 57 |
+
- Creates: `.UD_submodules_add.commands`
|
| 58 |
+
- Run commands in `UD_repos/` directory to fetch all treebanks
|
| 59 |
+
|
| 60 |
+
### 02 - Generate Metadata
|
| 61 |
+
|
| 62 |
+
**02_generate_metadata.py**
|
| 63 |
+
- Extracts metadata from local UD directories
|
| 64 |
+
- Collects: summaries, licenses, splits, statistics, blocked status
|
| 65 |
+
- Creates: `metadata-{VER}.json`
|
| 66 |
+
- Reads: `blocked_treebanks.yaml` for license restrictions
|
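Treebank metadata comes from the machine-readable `Key: value` block in each UD treebank README. A minimal extraction sketch; the exact field set is an assumption based on the UD README convention, not this script's implementation:

```python
import re

# Pull "Key: value" lines (e.g. "License: CC BY-SA 4.0") out of a UD
# treebank README's machine-readable metadata block.
def parse_ud_metadata(readme_text):
    fields = {}
    for line in readme_text.splitlines():
        m = re.match(r"^([A-Z][A-Za-z ]*):\s*(.*)$", line)
        if m:
            fields[m.group(1).strip()] = m.group(2).strip()
    return fields
```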
| 67 |
+
|
| 68 |
+
### 03 - Generate README
|
| 69 |
+
|
| 70 |
+
**03_generate_README.py**
|
| 71 |
+
- Renders Jinja2 template with metadata
|
| 72 |
+
- Creates: `README-{VER}` (HuggingFace dataset card)
|
| 73 |
+
- Uses: `templates/README.tmpl`, `metadata-{VER}.json`, `etc/citation-{VER}`, `etc/description-{VER}`
|
| 74 |
+
|
| 75 |
+
### 04 - Generate Parquet
|
| 76 |
+
|
| 77 |
+
**04_generate_parquet.py**
|
| 78 |
+
- Wrapper script calling `ud-hf-parquet-tools` library
|
| 79 |
+
- Converts CoNLL-U files to Parquet format
|
| 80 |
+
- Creates: `../parquet/{treebank}/{split}.parquet`
|
| 81 |
+
- Options: `--test`, `--overwrite`, `--blocked-treebanks`
|
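The conversion itself lives in `ud-hf-parquet-tools`. Purely to illustrate the input format it consumes, a minimal CoNLL-U sentence splitter (not the library's API):

```python
# Split CoNLL-U text into sentences of 10-column token rows.
# "#"-prefixed comment lines are skipped and blank lines separate
# sentences, per the CoNLL-U format. Illustration only -- the real
# conversion is done by ud-hf-parquet-tools.
def read_conllu_sentences(text):
    sentences, current = [], []
    for line in text.splitlines():
        if not line.strip():
            if current:
                sentences.append(current)
                current = []
        elif not line.startswith("#"):
            current.append(line.split("\t"))
    if current:
        sentences.append(current)
    return sentences
```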
| 82 |
+
|
| 83 |
+
### 05 - Validate Parquet
|
| 84 |
+
|
| 85 |
+
**05_validate_parquet.py**
|
| 86 |
+
- Wrapper script calling `ud-hf-parquet-tools` library
|
| 87 |
+
- Validates Parquet files against CoNLL-U source
|
| 88 |
+
- Options: `--local`, `--test`, `--mode text`
|
| 89 |
+
|
| 90 |
+
## Directory Structure
|
| 91 |
+
|
| 92 |
+
```
|
| 93 |
+
tools/
|
| 94 |
+
├── 00_fetch_ud_clarin-dspace_metadata.py # Fetch citation/description
|
| 95 |
+
├── 00_fetch_ud_codes_and_flags.sh # Fetch language metadata
|
| 96 |
+
├── 01_fetch_ud_repos.sh # Discover UD repos
|
| 97 |
+
├── 02_generate_metadata.py # Extract treebank metadata
|
| 98 |
+
├── 03_generate_README.py # Generate dataset card
|
| 99 |
+
├── 04_generate_parquet.py # Generate Parquet files
|
| 100 |
+
├── 05_validate_parquet.py # Validate Parquet files
|
| 101 |
+
├── blocked_treebanks.yaml # License-restricted treebanks
|
| 102 |
+
├── etc/
|
| 103 |
+
│ ├── citation-{VER} # Generated citations
|
| 104 |
+
│ ├── description-{VER} # Generated descriptions
|
| 105 |
+
│ └── codes_and_flags-{VER}.yaml # Language metadata
|
| 106 |
+
├── templates/
|
| 107 |
+
│ └── README.tmpl # Jinja2 template for dataset card
|
| 108 |
+
├── metadata-{VER}.json # Generated metadata (output)
|
| 109 |
+
├── README-{VER} # Generated dataset card (output)
|
| 110 |
+
└── UD_repos/ # Git submodules (created by 01)
|
| 111 |
+
├── UD_{Language}-{Treebank}/ # Individual treebanks
|
| 112 |
+
└── .UD_submodules_add.commands # Generated submodule commands
|
| 113 |
+
```
|
| 114 |
+
|
| 115 |
+
## Configuration
|
| 116 |
+
|
| 117 |
+
**Environment Variables (.env)**
|
| 118 |
+
```bash
|
| 119 |
+
UD_VER=2.17 # Current UD version
|
| 120 |
+
```
|
| 121 |
+
|
| 122 |
+
**Blocked Treebanks (blocked_treebanks.yaml)**
|
| 123 |
+
```yaml
|
| 124 |
+
pt_cintil:
|
| 125 |
+
reason: "Restrictive license prohibits redistribution in derived formats"
|
| 126 |
+
license: "CC BY-NC-SA 4.0"
|
| 127 |
+
url: "https://github.com/UniversalDependencies/UD_Portuguese-CINTIL"
|
| 128 |
+
```
|
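A sketch of how a pipeline step might consume this file to skip blocked treebanks, assuming (as above) that the top-level YAML keys are treebank ids:

```python
import yaml

# Load blocked treebank ids; top-level keys are the ids, values hold
# reason/license/url details.
def load_blocked(path="blocked_treebanks.yaml"):
    with open(path) as f:
        data = yaml.safe_load(f) or {}
    return set(data)

def filter_treebanks(treebanks, blocked):
    return [tb for tb in treebanks if tb not in blocked]
```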
| 129 |
+
|
| 130 |
+
## Dependencies
|
| 131 |
+
|
| 132 |
+
- Python 3.12+
|
| 133 |
+
- `uv` (for running scripts with dependencies)
|
| 134 |
+
- `ud-hf-parquet-tools` (for parquet generation/validation)
|
| 135 |
+
- Git with submodules support
|
| 136 |
+
|
| 137 |
+
Install dependencies:
|
| 138 |
+
```bash
|
| 139 |
+
pip install ud-hf-parquet-tools pyyaml python-dotenv jinja2
|
| 140 |
+
```
|
| 141 |
+
|
| 142 |
+
## Common Operations
|
| 143 |
+
|
| 144 |
+
**Test on subset:**
|
| 145 |
+
```bash
|
| 146 |
+
uv run ./04_generate_parquet.py --test # Tests 3 treebanks
|
| 147 |
+
uv run ./05_validate_parquet.py --test # Validates 3 treebanks
|
| 148 |
+
```
|
| 149 |
+
|
| 150 |
+
**Force regeneration:**
|
| 151 |
+
```bash
|
| 152 |
+
./02_generate_metadata.py -o # Override existing
|
| 153 |
+
./03_generate_README.py -o
|
| 154 |
+
uv run ./04_generate_parquet.py --overwrite
|
| 155 |
+
```
|
| 156 |
+
|
| 157 |
+
**Skip blocked treebanks:**
|
| 158 |
+
```bash
|
| 159 |
+
uv run ./04_generate_parquet.py --blocked-treebanks blocked_treebanks.yaml
|
| 160 |
+
```
|
| 161 |
+
|
| 162 |
+
## Troubleshooting
|
| 163 |
+
|
| 164 |
+
**Script not found:**
|
| 165 |
+
- Scripts use `#!/usr/bin/env -S uv run --script` shebang
|
| 166 |
+
- Make executable: `chmod +x *.py *.sh`
|
| 167 |
+
- Or run with: `uv run --script ./script.py`
|
| 168 |
+
|
| 169 |
+
**Missing dependencies:**
|
| 170 |
+
```bash
|
| 171 |
+
pip install ud-hf-parquet-tools
|
| 172 |
+
```
|
| 173 |
+
|
| 174 |
+
**Submodule checkout fails:**
|
| 175 |
+
```bash
|
| 176 |
+
cd UD_repos
|
| 177 |
+
git submodule foreach 'git fetch --tags && (git checkout r2.17 || git checkout main)'
|
| 178 |
+
```
|
| 179 |
+
|
| 180 |
+
## Documentation
|
| 181 |
+
|
| 182 |
+
- **Comprehensive guide:** [ADDING_NEW_UD_VERSION.md](../ADDING_NEW_UD_VERSION.md)
|
| 183 |
+
- **Blocked treebanks:** [BLOCKED_TREEBANKS.md](BLOCKED_TREEBANKS.md)
|
| 184 |
+
- **Migration guide:** [MIGRATION.md](../MIGRATION.md)
|
| 185 |
+
- **Parquet tools:** https://github.com/bot-zen/ud-hf-parquet-tools
|