Datasets: Switch UD fetch pipeline to CLARIN snapshots

- replace GitHub/submodule fetch flow with CLARIN allzip snapshot download
- rename 01_fetch_ud_repos.sh to 01_fetch_ud_snapshot.sh
- update metadata generation for snapshot layout and UD_VER-specific codes_and_flags
- update maintainer docs and tools README to reflect snapshot-based workflow

- ADDING_NEW_UD_VERSION.md      +23 -38
- tools/01_fetch_ud_repos.sh     +0 -136
- tools/01_fetch_ud_snapshot.sh +166 -0
- tools/02_generate_metadata.py  +10 -6
- tools/README.md                +18 -28
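
The snapshot URLs the new pipeline downloads follow the LINDAT handle scheme. As a minimal sketch (the handle values are copied from `HANDLE_MAPPING` in the new script; `snapshot_url` itself is a hypothetical helper, not part of the tools):

```python
# Sketch: build the CLARIN "allzip" snapshot URL for a UD release.
# Handle values mirror HANDLE_MAPPING in tools/01_fetch_ud_snapshot.sh;
# snapshot_url() is our illustrative name, not an existing function.
HANDLE_MAPPING = {
    "2.15": "11234/1-5787",
    "2.16": "11234/1-5901",
    "2.17": "11234/1-6036",
}

def snapshot_url(ud_ver: str) -> str:
    """Return the hdl.handle.net allzip URL for a supported UD version."""
    try:
        handle = HANDLE_MAPPING[ud_ver]
    except KeyError:
        raise ValueError(f"Unsupported UD version: {ud_ver}; update HANDLE_MAPPING first")
    return f"https://hdl.handle.net/{handle}/allzip"

print(snapshot_url("2.17"))  # https://hdl.handle.net/11234/1-6036/allzip
```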
ADDING_NEW_UD_VERSION.md (CHANGED)

@@ -114,48 +114,34 @@ cat .env
 
 **How to find hash?** Check the [UD docs-automation releases](https://github.com/UniversalDependencies/docs/releases) for the commit tagged with the version.
 
-### 6.
+### 6. Download UD Snapshot from CLARIN
 
 ```bash
-#
-./
+# Download and extract official UD release snapshot
+./01_fetch_ud_snapshot.sh
 
-# This creates:
-# - .
+# This creates/updates:
+# - ud-treebanks-v2.18/ (extracted snapshot directory)
+# - UD_repos -> ud-treebanks-v2.18 (symlink used by generation scripts)
 ```
 
-**What does this do?**
+**What does this do?** Downloads the official release archive from LINDAT/CLARIN (`https://hdl.handle.net/<handle>/allzip`), extracts it, and points `UD_repos` to the extracted snapshot.
 
-### 7.
+### 7. Verify Snapshot Layout
 
 ```bash
-
-
-# Initialize git repository (if first time)
-git init
-
-# Add all UD repositories as submodules
-bash ../.UD_submodules_add.commands
-
-# Checkout the new release tag in all submodules
-git submodule foreach "git fetch --tags && git checkout r${NEW_VER} && touch .tag-r${NEW_VER}"
-
-# Create branch and commit
-git checkout -b ${NEW_VER}
-git add -A
-git commit -m "Add UD ${NEW_VER} repositories"
+# Confirm symlink and directory
+ls -ld UD_repos ud-treebanks-v${NEW_VER}
 
-
+# Spot-check treebank directories
+ls UD_repos | head
 ```
 
-**Expected time:**
+**Expected time:** 5-15 minutes depending on network speed and archive size.
 
-**
-
-
-```bash
-git submodule foreach "git fetch --tags && (git checkout r${NEW_VER} || git checkout main) && touch .tag-r${NEW_VER}"
-```
+**Notes:**
+- The snapshot already contains a consistent release state for all treebanks.
+- No per-treebank git tags/submodules are required.
 
 ### 8. Extract Metadata from Treebanks
 

@@ -408,18 +394,17 @@ This makes the new version the default when users don't specify a revision.
 
 ## Troubleshooting
 
-### Issue:
+### Issue: Snapshot download/extraction fails
 
-**Problem:**
+**Problem:** Snapshot download fails, or `UD_repos` does not point to the extracted directory.
 
 **Solution:**
 ```bash
-cd tools
-
+cd tools
+./01_fetch_ud_snapshot.sh --online --rev 2.18
+ls -ld UD_repos ud-treebanks-v2.18
 ```
 
-This falls back to `main` branch for repositories without the tag.
-
 ### Issue: Metadata extraction fails for a treebank
 
 **Problem:** A treebank is malformed or missing expected files.

@@ -512,7 +497,7 @@ Before marking the release as complete:
 | Step | Time | Notes |
 |------|------|-------|
 | 1-5: Setup & metadata | 10-15 min | Manual edits required |
-| 6-7:
+| 6-7: Download snapshot | 5-15 min | Network + archive-size dependent |
 | 8-9: Generate metadata/README | 5-10 min | Fast |
 | 10-12: Generate & validate Parquet | 2-4 hours | CPU-intensive |
 | 13-15: Commit to git | 10-15 min | Fast |

@@ -520,7 +505,7 @@ Before marking the release as complete:
 | 17-18: Verify & update | 10-20 min | Fast |
 | **Total** | **~5-11 hours** | Can parallelize some steps |
 
-**Recommendation:** Start the process in the morning. Long-running steps (
+**Recommendation:** Start the process in the morning. Long-running steps (Parquet generation and upload) can run unattended.
 
 ## Notes
 
tools/01_fetch_ud_repos.sh (DELETED)

@@ -1,136 +0,0 @@
-#!/usr/bin/env bash
-#
-# This script is based on:
-#   https://github.com/UniversalDependencies/docs-automation
-#
-# ./fetch_ud_repos.sh -h
-#
-
-# Set default vars and names directories and output files
-[ -e .env ] && . .env
-ONLINE_MODE=false
-UD_VER=${UD_VER:-"2.15"}
-UD_REV="r${UD_VER}"
-UDS_SUBDIR="UD_repos"; mkdir -p ${UDS_SUBDIR}
-GIT_API_OUTPUT=".UniversalDependencies.repos"
-GIT_SUBMODULES_ADD=".UD_submodules_add.commands"
-
-# Additional arguments for the `git submodule add` command(s)
-
-###################
-#############
-#
-# USAGE
-#
-usage() { OUTPUT=${1:-"verbose"}
-  echo "Usage: $0 [-o] [-r REV]" >&2
-  [ $OUTPUT = "short" ] && exit 0
-  >&2 cat << EOF
-
-Identify all relevant UD repositories from GitHub:
- * https://github.com/UniversalDependencies/
-and generate a list of submodules to add to a (newly created) repository in the
-subdirectory ${UDS_SUBDIR}/.
-
-"${GIT_API_OUTPUT}" contains the (saved) list of GitHub repositories; the
-file can be updated by running this script with the '-o|--online' option.
-"${GIT_SUBMODULES_ADD}" contains the list of submodules to add to the main
-repository.
-
-Options:
-  -h            Print this help message
-  -o|--online   Enable online mode: (re-fetch UD repos from GitHub instead
-                of using ${GIT_API_OUTPUT})
-  -r|--rev VER  The UD GitHub tagged version to checkout (default: ${UD_VER})
-EOF
-}
-#
-######
-############
-##################
-
-VALID_ARGS=$(getopt -o hor: --long help,online,rev: -- "$@")
-if [[ $? -ne 0 ]]; then
-  usage "short"
-  exit 1;
-fi
-
-eval set -- "$VALID_ARGS"
-while [ : ]
-do
-  case "$1" in
-    -o | --online) ONLINE_MODE=true; shift ;;
-    -r | --rev) UD_VER="$2"; shift 2 ;;
-    -h | --help) usage; shift ; exit 0;;
-    --) shift ; break ;;
-    *) >&2 echo Unsupported option: $1; usage; exit 1;;
-  esac
-done
-
-set -uo pipefail
-
-# Function to get github API content
-get_github_api_content() {
-
-  # We are interested in these keys:
-  #
-  # "git_url": "git://github.com/UniversalDependencies/docs.git",
-  # "ssh_url": "git@github.com:UniversalDependencies/docs.git",
-  # "clone_url": "https://github.com/UniversalDependencies/docs.git",
-  # "svn_url": "https://github.com/UniversalDependencies/docs",
-  # "homepage": "http://universaldependencies.org/",
-  #
-  # -> "clone_url"
-
-  # This will last us to 500 repositories (UD has 460+ as of 2026)
-  (for pg in 1 2 3 4 5; do wget "https://api.github.com/orgs/UniversalDependencies/repos?page=$pg&per_page=100" -O - ; done ) \
-    | grep clone_url \
-    | grep -Po 'https://.*?(?=")' \
-    > ${UDS_SUBDIR}/${GIT_API_OUTPUT}
-}
-
-# Check if the online argument is present
-if [[ $ONLINE_MODE == true ]] || [[ ! -e ${UDS_SUBDIR}/${GIT_API_OUTPUT} ]]; then
-  # Download github API content
-  get_github_api_content
-fi
-
-
-MSG=""
-echo "" > ${UDS_SUBDIR}/${GIT_SUBMODULES_ADD}
-for r in $(cat ${UDS_SUBDIR}/${GIT_API_OUTPUT} | sort)
-do
-  l=$(echo $r | perl -pe 's/.*\///' | cut -f 1 -d.)
-
-  # Process all potential UD_ repos
-  if [[ $l == UD_* ]] && [[ $l != UD_v2 ]]
-  then
-    # Create a file for the full set
-    echo "git submodule add --depth 1 -- $r $l" \
-      >> ${UDS_SUBDIR}/${GIT_SUBMODULES_ADD}
-
-    # Missing subdir
-    if [[ ! -e ${UDS_SUBDIR}/$l ]]
-    then
-      # git submodule add
-      #   [-b <branch>] [-f|--force] [--name <name>] [--reference <repository>]
-      #   [--ref-format <format>] [--depth <depth>] [--] <repository> [<path>]
-
-      MSG="$MSG\n${UDS_SUBDIR}/$l"
-    fi
-  fi
-done
-
-# git show-ref refs/tags/${UD_REV}
-# echo "git submodule --quiet foreach 'export name; bash -c \"(git fetch --depth 1 origin refs/tags/${UD_REV}:refs/tags/${UD_REV} ) || (echo \$name; pushd ../..; git submodule deinit -f \$name; popd)\" '" >> ${GIT_SUBMODULES_ADD}
-echo "git submodule --quiet foreach 'export name; bash -c \"( REF=\$(git ls-remote --exit-code origin refs/tags/${UD_REV} | cut -f1) && [ ! -z \\\$REF ] && git fetch --depth=1 origin \\\$REF && git checkout \\\$REF && echo \\\$REF > .tag-${UD_REV}) || ( echo No rev:${UD_REV} \\\$name )\" ' " >> ${UDS_SUBDIR}/${GIT_SUBMODULES_ADD}
-echo "git submodule --quiet foreach git status -s | grep -v .tag-" >> ${UDS_SUBDIR}/${GIT_SUBMODULES_ADD}
-
-
-echo "Missing repos:"
-echo -e "$MSG"
-
-if [[ ! -e ${UDS_SUBDIR}/.git ]]
-then
-  echo "${UDS_SUBDIR}: "'missing .git: run `git init` in subdirectory!'
-fi
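
The deleted script scraped `clone_url` values out of the GitHub org listing with `grep -Po`. The same extraction, done on the parsed JSON rather than on raw text, can be sketched as (`clone_urls` is our illustrative name):

```python
import json

def clone_urls(pages: list[str]) -> list[str]:
    """Extract clone_url values from GitHub /orgs/<org>/repos JSON pages."""
    urls = []
    for page in pages:
        for repo in json.loads(page):
            urls.append(repo["clone_url"])
    return urls

# Demo with a synthetic one-repo API page:
page = json.dumps([{"clone_url": "https://github.com/UniversalDependencies/UD_Czech-PDT.git"}])
print(clone_urls([page]))
```

Parsing the JSON avoids the old approach's fragility to formatting changes in the API response.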
tools/01_fetch_ud_snapshot.sh (ADDED)

@@ -0,0 +1,166 @@
+#!/usr/bin/env bash
+#
+# Fetch UD release snapshots from the LINDAT/CLARIN repository.
+#
+# NOTE:
+#   This script intentionally uses snapshot-oriented naming and does not
+#   fetch GitHub repositories/submodules.
+#
+
+set -euo pipefail
+
+[ -e .env ] && . .env
+
+ONLINE_MODE=false
+UD_VER=${UD_VER:-"2.17"}
+UDS_SUBDIR="UD_repos"
+
+# Mapping UD version -> LINDAT handle.
+# Keep this in sync with tools/00_fetch_ud_clarin-dspace_metadata.py.
+declare -A HANDLE_MAPPING
+HANDLE_MAPPING["2.7"]="11234/1-3424"
+HANDLE_MAPPING["2.8"]="11234/1-3687"
+HANDLE_MAPPING["2.9"]="11234/1-4611"
+HANDLE_MAPPING["2.10"]="11234/1-4758"
+HANDLE_MAPPING["2.11"]="11234/1-4923"
+HANDLE_MAPPING["2.12"]="11234/1-5150"
+HANDLE_MAPPING["2.13"]="11234/1-5287"
+HANDLE_MAPPING["2.14"]="11234/1-5502"
+HANDLE_MAPPING["2.15"]="11234/1-5787"
+HANDLE_MAPPING["2.16"]="11234/1-5901"
+HANDLE_MAPPING["2.17"]="11234/1-6036"
+
+###################
+#############
+#
+# USAGE
+#
+usage() { OUTPUT=${1:-"verbose"}
+  echo "Usage: $0 [-o] [-r VER]" >&2
+  [ "$OUTPUT" = "short" ] && exit 0
+  >&2 cat << EOF
+
+Download and extract one official Universal Dependencies snapshot from:
+ * https://lindat.mff.cuni.cz/repository/
+
+The snapshot is downloaded as:
+ * https://hdl.handle.net/<handle>/allzip
+
+and extracted to:
+ * ud-treebanks-v<VER>/
+
+Then a symlink is created/updated:
+ * UD_repos -> ud-treebanks-v<VER>
+
+Options:
+  -h            Print this help message
+  -o|--online   Force re-download and re-extract
+  -r|--rev VER  UD version to fetch (default: \${UD_VER} from .env)
+EOF
+}
+#
+######
+############
+##################
+
+# Check getopt via `if !` so the failure test survives `set -e`.
+if ! VALID_ARGS=$(getopt -o hor: --long help,online,rev: -- "$@"); then
+  usage "short"
+  exit 1
+fi
+
+eval set -- "$VALID_ARGS"
+while [ : ]
+do
+  case "$1" in
+    -o | --online) ONLINE_MODE=true; shift ;;
+    -r | --rev) UD_VER="$2"; shift 2 ;;
+    -h | --help) usage; shift ; exit 0 ;;
+    --) shift ; break ;;
+    *) >&2 echo "Unsupported option: $1"; usage; exit 1 ;;
+  esac
+done
+
+HANDLE="${HANDLE_MAPPING[${UD_VER}]:-}"
+if [[ -z "${HANDLE}" ]]; then
+  >&2 echo "Unsupported UD version: ${UD_VER}"
+  >&2 echo "Update HANDLE_MAPPING in $(basename "$0") first."
+  exit 1
+fi
+
+SNAPSHOT_DIR="ud-treebanks-v${UD_VER}"
+SNAPSHOT_ARCHIVE="${SNAPSHOT_DIR}.zip"
+SNAPSHOT_URL="https://hdl.handle.net/${HANDLE}/allzip"
+
+download_snapshot() {
+  echo "Downloading UD ${UD_VER} snapshot from ${SNAPSHOT_URL}"
+  if command -v curl >/dev/null 2>&1; then
+    curl -fL --retry 3 --retry-delay 1 "${SNAPSHOT_URL}" -o "${SNAPSHOT_ARCHIVE}"
+  elif command -v wget >/dev/null 2>&1; then
+    wget "${SNAPSHOT_URL}" -O "${SNAPSHOT_ARCHIVE}"
+  else
+    python3 - "${SNAPSHOT_URL}" "${SNAPSHOT_ARCHIVE}" <<'PY'
+import sys
+from urllib.request import urlretrieve
+
+urlretrieve(sys.argv[1], sys.argv[2])
+PY
+  fi
+}
+
+extract_snapshot() {
+  local tmp_dir backup_dir
+  tmp_dir=$(mktemp -d)
+  echo "Extracting ${SNAPSHOT_ARCHIVE} into ${SNAPSHOT_DIR}"
+
+  if command -v unzip >/dev/null 2>&1; then
+    unzip -q "${SNAPSHOT_ARCHIVE}" -d "${tmp_dir}"
+  else
+    python3 - "${SNAPSHOT_ARCHIVE}" "${tmp_dir}" <<'PY'
+import sys
+import zipfile
+
+with zipfile.ZipFile(sys.argv[1]) as zf:
+    zf.extractall(sys.argv[2])
+PY
+  fi
+
+  mapfile -t snapshot_candidates < <(find "${tmp_dir}" -mindepth 1 -maxdepth 1 -type d -name 'ud-treebanks-v*' | sort)
+  if [[ ${#snapshot_candidates[@]} -ne 1 ]]; then
+    >&2 echo "Expected one ud-treebanks-v* directory in archive, found ${#snapshot_candidates[@]}."
+    >&2 echo "Archive: ${SNAPSHOT_ARCHIVE}"
+    exit 1
+  fi
+
+  if [[ -d "${SNAPSHOT_DIR}" ]]; then
+    backup_dir="${SNAPSHOT_DIR}.bak.$(date +%s)"
+    echo "Existing ${SNAPSHOT_DIR} found; moving it to ${backup_dir}"
+    mv "${SNAPSHOT_DIR}" "${backup_dir}"
+  fi
+
+  mv "${snapshot_candidates[0]}" "${SNAPSHOT_DIR}"
+  rm -rf "${tmp_dir}"
+}
+
+if [[ "${ONLINE_MODE}" == true ]] || [[ ! -d "${SNAPSHOT_DIR}" ]]; then
+  if [[ "${ONLINE_MODE}" == true ]] || [[ ! -s "${SNAPSHOT_ARCHIVE}" ]]; then
+    download_snapshot
+  else
+    echo "Using existing archive ${SNAPSHOT_ARCHIVE}"
+  fi
+  extract_snapshot
+else
+  echo "Snapshot directory already exists: ${SNAPSHOT_DIR}"
+fi
+
+if [[ -L "${UDS_SUBDIR}" ]]; then
+  ln -sfn "${SNAPSHOT_DIR}" "${UDS_SUBDIR}"
+elif [[ ! -e "${UDS_SUBDIR}" ]]; then
+  ln -s "${SNAPSHOT_DIR}" "${UDS_SUBDIR}"
+else
+  >&2 echo "${UDS_SUBDIR} exists and is not a symlink. Please move/remove it, then rerun."
+  exit 1
+fi
+
+echo "UD snapshot ready:"
+echo "  ${UDS_SUBDIR} -> ${SNAPSHOT_DIR}"
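
The extract-then-symlink flow of the new script can be sketched in Python with a tiny in-memory stand-in archive instead of the real CLARIN download (`extract_and_link` is our illustrative name):

```python
# Sketch of the snapshot extract-and-symlink step, assuming a zip whose
# top level contains one ud-treebanks-v<ver>/ directory, as the real
# allzip snapshot does.
import os
import tempfile
import zipfile

def extract_and_link(archive: str, workdir: str, ud_ver: str) -> str:
    """Extract ud-treebanks-v<ver> from `archive` and point UD_repos at it."""
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(workdir)
    link = os.path.join(workdir, "UD_repos")
    if os.path.islink(link):
        os.remove(link)  # mimic `ln -sfn`: replace an existing symlink
    os.symlink(f"ud-treebanks-v{ud_ver}", link)
    return link

# Demo with a synthetic one-treebank archive:
work = tempfile.mkdtemp()
arc = os.path.join(work, "snapshot.zip")
with zipfile.ZipFile(arc, "w") as zf:
    zf.writestr("ud-treebanks-v2.18/UD_Czech-PDT/cs_pdt-ud-train.conllu", "")
link = extract_and_link(arc, work, "2.18")
print(os.listdir(link))  # ['UD_Czech-PDT']
```

Using a relative symlink target keeps the `UD_repos` link valid even if the tools directory is moved wholesale.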
tools/02_generate_metadata.py (CHANGED)

@@ -111,7 +111,7 @@ def traverse_directory(directory):
     """
     results = defaultdict(lambda: defaultdict(dict))
 
-    with open(os.path.join('etc', f"codes_and_flags-
+    with open(os.path.join('etc', f"codes_and_flags-{UD_VER}.yaml"), 'r') as file:
         codes_and_flags = yaml.safe_load(file)
     logging.debug(codes_and_flags)
 
@@ -124,9 +124,10 @@ def traverse_directory(directory):
         logging.debug(dir_path)
 
         tag_fn = os.path.join(dir_path, f".tag-r{UD_VER}")
-
-
-
+        has_version_tag = Path(tag_fn).exists()
+        if not has_version_tag:
+            # Snapshot layouts from CLARIN do not include per-treebank tag marker files.
+            logging.debug(f"No tag file found (expected for snapshots): {tag_fn}")
 
         results[item]["splits"] = {
             "train": {"files": [], "num_bytes": 0},
@@ -137,7 +138,10 @@ def traverse_directory(directory):
         for file in os.listdir(dir_path):
             if file.endswith(".conllu"):
                 file_path = os.path.join(dir_path, file)
-
+                if has_version_tag:
+                    source_path = os.path.join(item, f"r{UD_VER}", file)
+                else:
+                    source_path = os.path.join(item, file)
                 logging.debug(file_path)
                 match file:
                     case x if "dev" in file:
@@ -148,7 +152,7 @@ def traverse_directory(directory):
                         subset = "train"
                     case _:
                         subset = "unknown"
-                results[item]["splits"][subset]["files"].append(
+                results[item]["splits"][subset]["files"].append(source_path)
 
                 sum_bytes = os.stat(file_path).st_size
                 results[item]["splits"][subset]["num_bytes"] += sum_bytes
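
The split classification in `traverse_directory` assigns each `.conllu` file by substring match on its name. A minimal sketch (the `dev` case is visible in the hunk; the `test`/`train` cases are inferred from the surrounding context lines, and `classify_split` is our illustrative name):

```python
# Sketch: .conllu filenames land in dev/test/train by substring match,
# falling back to "unknown", mirroring the match statement in
# 02_generate_metadata.py.
def classify_split(filename: str) -> str:
    if "dev" in filename:
        return "dev"
    if "test" in filename:
        return "test"
    if "train" in filename:
        return "train"
    return "unknown"

print(classify_split("cs_pdt-ud-train.conllu"))  # train
```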
tools/README.md (CHANGED)

@@ -11,22 +11,15 @@ cd tools
 echo "UD_VER=${NEW_VER}" > .env
 
 # Fetch metadata
-
+uv run 00_fetch_ud_clarin-dspace_metadata.py -o
 ./00_fetch_ud_codes_and_flags.sh -o
-./
-
-# Initialize repositories
-cd UD_repos
-git init
-bash ../.UD_submodules_add.commands
-git submodule foreach "git fetch --tags && git checkout r${NEW_VER} && touch .tag-r${NEW_VER}"
-cd ..
+./01_fetch_ud_snapshot.sh
 
 # Generate files
-
-
-uv run
-uv run
+uv run 02_generate_metadata.py -o
+uv run 03_generate_README.py -o
+uv run 04_generate_parquet.py
+uv run 05_validate_parquet.py --local --test
 
 # Sync to root
 cp README-${NEW_VER} ../README.md

@@ -50,12 +43,12 @@ git add ../parquet ../README.md ../metadata.json
 - Creates: `etc/codes_and_flags-{VER}.yaml`, `etc/codes_and_flags-latest.yaml`
 - Update `VER_MAPPING` for new versions
 
-### 01 - Fetch
+### 01 - Fetch Snapshot
 
-**
-- 
-- Creates: `
-- 
+**01_fetch_ud_snapshot.sh**
+- Downloads official UD release snapshot from LINDAT/CLARIN (`allzip`)
+- Creates: `ud-treebanks-v{VER}/`
+- Creates/updates symlink: `UD_repos -> ud-treebanks-v{VER}`
 
 ### 02 - Generate Metadata
 
@@ -93,7 +86,7 @@
 tools/
 ├── 00_fetch_ud_clarin-dspace_metadata.py  # Fetch citation/description
 ├── 00_fetch_ud_codes_and_flags.sh         # Fetch language metadata
-├──
+├── 01_fetch_ud_snapshot.sh                # Download UD snapshot from CLARIN
 ├── 02_generate_metadata.py                # Extract treebank metadata
 ├── 03_generate_README.py                  # Generate dataset card
 ├── 04_generate_parquet.py                 # Generate Parquet files

@@ -107,9 +100,8 @@ tools/
 │   └── README.tmpl                  # Jinja2 template for dataset card
 ├── metadata-{VER}.json              # Generated metadata (output)
 ├── README-{VER}                     # Generated dataset card (output)
-
-
-└── .UD_submodules_add.commands      # Generated submodule commands
+├── ud-treebanks-v{VER}/             # Extracted CLARIN snapshot
+└── UD_repos -> ud-treebanks-v{VER}  # Symlink consumed by 02/04/05 scripts
 ```
 
 ## Configuration

@@ -132,7 +124,6 @@ pt_cintil:
 - Python 3.12+
 - `uv` (for running scripts with dependencies)
 - `ud-hf-parquet-tools` (for parquet generation/validation)
-- Git with submodules support
 
 Install dependencies:
 ```bash

@@ -162,19 +153,18 @@ uv run ./04_generate_parquet.py --blocked-treebanks blocked_treebanks.yaml
 ## Troubleshooting
 
 **Script not found:**
-- Scripts use `#!/usr/bin/env -S uv run
+- Scripts use `#!/usr/bin/env -S uv run` shebang
 - Make executable: `chmod +x *.py *.sh`
-- Or run with: `uv run
+- Or run with: `uv run ./script.py`
 
 **Missing dependencies:**
 ```bash
 pip install ud-hf-parquet-tools
 ```
 
-**
+**Snapshot download/extraction fails:**
 ```bash
-
-git submodule foreach 'git fetch --tags && (git checkout r2.17 || git checkout main)'
+./01_fetch_ud_snapshot.sh --online --rev 2.17
 ```
 
 ## Documentation