# general information
- the purpose of this repository is to provide a dataset loading script for a milling-related dataset
- further information about the dataset is available at https://data.mendeley.com/datasets/zpxs87bjt8/3
- the full dataset is about 10 GB, which is more than Hugging Face allows uploading. This repo therefore contains only a small subset of the dataset (see the *data* folder) as a proof of concept. The loading script works for both the original dataset and the small subset.

# how to load dataset
- first, download the dataset from https://data.mendeley.com/datasets/zpxs87bjt8/3
- the loading script (milling_LUH_data.py) must be in the same directory as the data directory (the folder containing all the h5 files)
- load the data as shown in notebook.ipynb
- define the train, test, and validation splits in the loading script (milling_LUH_data.py)
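The steps above can be sketched in Python. This is a minimal sketch, assuming the Hugging Face `datasets` library is installed (`pip install datasets`) and that the working directory contains both milling_LUH_data.py and the `data` folder; the helper name `load_milling` and the split names are illustrative, not part of the repo.

```python
def load_milling(split=None):
    """Load the milling dataset through the local loading script (hypothetical helper).

    `split` may be e.g. "train", "test", or "validation", matching whatever
    splits are defined in milling_LUH_data.py; None returns all splits.
    """
    # Lazy import so this sketch has no hard dependency at definition time.
    from datasets import load_dataset

    # load_dataset resolves the script path relative to the working directory,
    # which must also contain the `data` folder with the .h5 files.
    return load_dataset("milling_LUH_data.py", split=split)
```

For example, `ds = load_milling("train")` would return the train split as a `datasets.Dataset`; see notebook.ipynb for the repo's own usage.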