Rename tokenizer_robustness_completion_stem_fullwidth_characters/test-00000-of-00001.parquet to toksuite_stem_fullwidth_characters/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_morpheme_separation/test-00000-of-00001.parquet to toksuite_stem_morpheme_separation/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_superscript_subscript/test-00000-of-00001.parquet to toksuite_stem_superscript_subscript/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_typographical_errors/test-00000-of-00001.parquet to toksuite_stem_typographical_errors/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_unicode_formatting/test-00000-of-00001.parquet to toksuite_stem_unicode_formatting/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_unusual_formatting/test-00000-of-00001.parquet to toksuite_stem_unusual_formatting/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_upside_down_rotated/test-00000-of-00001.parquet to toksuite_stem_upside_down_rotated/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_equivalent_expressions/test-00000-of-00001.parquet to toksuite_stem_equivalent_expressions/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_enclosed_characters/test-00000-of-00001.parquet to toksuite_stem_enclosed_characters/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_diacriticized_styling/test-00000-of-00001.parquet to toksuite_stem_diacriticized_styling/test-00000-of-00001.parquet
Rename tokenizer_robustness_completion_stem_character_deletion/test-00000-of-00001.parquet to toksuite_stem_character_deletion/test-00000-of-00001.parquet
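Every rename above follows the same pattern: the directory prefix `tokenizer_robustness_completion_` is replaced with `toksuite_`, while the `stem_...` suffix and the parquet filename stay unchanged. A minimal sketch of how such a bulk rename could be done from the repo root (assumption: bash with the dataset directories in the working directory; this script is illustrative, not part of the change itself):

```shell
#!/usr/bin/env bash
set -euo pipefail
# nullglob makes the loop a no-op if no matching directories exist
shopt -s nullglob

for d in tokenizer_robustness_completion_stem_*; do
  # strip the old prefix, keep the "stem_..." suffix, prepend the new prefix
  mv -- "$d" "toksuite_${d#tokenizer_robustness_completion_}"
done
```

The parquet files inside each directory (`test-00000-of-00001.parquet`) need no separate step, since renaming the directory moves its contents with it.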