Requesting an update on dataset_infos.json
In the dataset's `dataset_infos.json`, some entries appear to be wrong, for example:
```json
{
  "config_name": "eng_Latn-kor_Hang",
  "version": {
    "version_str": "1.0.0",
    "description": null,
    "major": 1,
    "minor": 0,
    "patch": 0
  },
  "splits": {
    "train": {
      "name": "train",
      "num_bytes": 0,
      "num_examples": 0,
      "dataset_name": "nllb"
    },
    ...
  }
}
```
Here `num_bytes` and `num_examples` are 0, which raises the following error when loading the dataset:

```
NonMatchingSplitsSizesError: [{'expected': SplitInfo(name='train', num_bytes=0, num_examples=0, shard_lengths=None, dataset_name='nllb'), 'recorded': SplitInfo(name='train', num_bytes=2972647492, num_examples=19358582, shard_lengths=[2741000, 3039000, 3258000, 3411000, 3519000, 3390582], dataset_name='nllb')}]
```

The loading code:
```python
from datasets import load_dataset

train_dataset = load_dataset(
    "allenai/nllb",
    "eng_Latn-kor_Hang",
    trust_remote_code=True,
)
```

(`datasets` version 3.6.0.)

I can open a PR that updates `dataset_infos.json` with the correct information, or is there another way to solve this problem?