cterdam committed · Commit 40b40a6 · verified · 1 Parent(s): 76bc73f

Document invariants and add test instructions

Files changed (1): README.md +41 -0
README.md CHANGED
@@ -66,3 +66,44 @@ print(ru_subdivs[0])
 
 - `cterdam/iso_3166-1_alpha-2`: Countries/regions with lang and hash
 - `cterdam/iata-metro`: IATA metro codes with lang and hash and coordinates
+
+
+ ## Invariants
+
+ All entries in this dataset maintain the following invariants (verified by `test/test_invariants.py`):
+
+ ### Name Field
+ - **Required**: Every entry has a `name` field
+ - **Constraint**: `name` always equals one of `name_en`, `name_ru`, or `name_zh` (based on `lang`)
+ - **Example**: If `lang: ru`, then `name == name_ru`
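The name invariant described above can be checked with a short sketch. The entry dict below is illustrative (field names taken from this README; the values are made up, not real dataset rows):

```python
def check_name_invariant(entry):
    """Assert that `name` equals the name_<lang> field selected by `lang`."""
    lang = entry["lang"]
    assert lang in ("en", "ru", "zh")
    assert entry["name"] == entry[f"name_{lang}"]

# Illustrative entry: lang is ru, so name must equal name_ru.
entry = {
    "lang": "ru",
    "name": "Москва",
    "name_en": "Moscow",
    "name_ru": "Москва",
    "name_zh": "莫斯科",
}
check_name_invariant(entry)  # passes silently
```

A violating entry (e.g. `lang: ru` but `name == name_en`) would raise an `AssertionError`.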
+
+ ### Lang Field
+ - **Required**: Every entry has a `lang` field
+ - **Values**: One of `en`, `ru`, or `zh`
+ - **Determination**: Hardcoded by region (not inferred from text)
+   - **Slavic regions** (RU, UA, BY, KZ, etc.) → `lang: ru`
+   - **Sinophone regions** (CN, TW, HK, JP, etc.) → `lang: zh`
+   - **All others** → `lang: en`
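The hardcoded region-to-lang rule can be sketched as a simple lookup. The region sets below contain only the examples listed above, not the dataset's full lists:

```python
# Illustrative region sets; the dataset's actual hardcoded lists are longer.
SLAVIC_REGIONS = {"RU", "UA", "BY", "KZ"}
SINOPHONE_REGIONS = {"CN", "TW", "HK", "JP"}

def lang_for_region(code: str) -> str:
    """Return the hardcoded lang for an ISO 3166-1 alpha-2 region code."""
    if code in SLAVIC_REGIONS:
        return "ru"
    if code in SINOPHONE_REGIONS:
        return "zh"
    return "en"  # all other regions default to English
```

For example, `lang_for_region("RU")` gives `ru`, while any region outside the two sets falls through to `en`.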
+
+ ### Hash Field
+ - **Required**: Every entry with a non-null `name` has a `hash` field
+ - **Format**: 64-character hexadecimal string (SHA256)
+ - **Constraint**: `hash == SHA256(name)`
+ - **Usage**: First 6 hex digits used for sharding in source repositories
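The hash invariant and the sharding prefix can be reproduced with the standard library. UTF-8 encoding of `name` before hashing is an assumption here, not stated in the README:

```python
import hashlib

def name_hash(name: str) -> str:
    """SHA256 of the name (UTF-8 assumed) as a 64-char hex string."""
    return hashlib.sha256(name.encode("utf-8")).hexdigest()

def shard_prefix(name: str) -> str:
    """First 6 hex digits of the hash, used for sharding in source repos."""
    return name_hash(name)[:6]
```

Checking an entry then amounts to `entry["hash"] == name_hash(entry["name"])`.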
+
+ ## Verify Invariants
+
+ Clone the dataset and run the tests:
+
+ ```bash
+ git clone https://huggingface.co/datasets/cterdam/iso_3166-1_alpha-2
+ cd iso_3166-1_alpha-2
+
+ # Install dependencies
+ pip install datasets pytest
+
+ # Run invariant tests
+ pytest test/test_invariants.py -v
+ ```
+
+ Expected output: all tests pass, verifying dataset integrity.
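The repository's `test/test_invariants.py` is not shown in this commit; a minimal sketch of what such tests might look like, using a hypothetical in-memory entry in place of the loaded dataset, is:

```python
import hashlib

# Hypothetical entries standing in for the loaded dataset rows.
ENTRIES = [
    {
        "lang": "en",
        "name": "Moscow",
        "name_en": "Moscow",
        "hash": hashlib.sha256("Moscow".encode("utf-8")).hexdigest(),
    },
]

def test_name_matches_lang():
    # name must equal the name_<lang> field for every entry.
    for e in ENTRIES:
        assert e["name"] == e[f"name_{e['lang']}"]

def test_hash_is_sha256_of_name():
    # Every entry with a non-null name carries SHA256(name) as 64 hex chars.
    for e in ENTRIES:
        if e["name"] is not None:
            assert len(e["hash"]) == 64
            assert e["hash"] == hashlib.sha256(e["name"].encode("utf-8")).hexdigest()
```

Run with `pytest -v`; both functions also execute as plain Python.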