---
license: mit
---

# NS-Instruments

This is an audio dataset for **Music Instrument Classification**.

**Classes = 10, Split = Train-Test**

## Structure

- `audios` folder contains the audio files.
- `train.csv` holds the training split and `test.csv` the testing split (a short loading sketch follows this list).
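
Below is a minimal sketch of how the two CSV files could be inspected once the dataset is in place (see the Download section). It assumes `pandas` is installed and reuses the `DATASET_PATH` placeholder from the download script; the column names are not assumed here, only printed.

```python
import os

import pandas as pd  # assumed to be installed: pip install pandas

# Same placeholder root as in the download script below.
dataset_dir = os.path.join("DATASET_PATH/Audio-Datasets", "NS-Instruments")

# Read both splits and inspect their columns and sizes without assuming a schema.
train_df = pd.read_csv(os.path.join(dataset_dir, "train.csv"))
test_df = pd.read_csv(os.path.join(dataset_dir, "test.csv"))
print("columns:", train_df.columns.tolist())
print(train_df.head())
print(f"train rows: {len(train_df)}, test rows: {len(test_df)}")
```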

## Download

```python
import os
import shutil
import zipfile

import huggingface_hub

# Folder that holds (or will hold) all audio datasets.
audio_datasets_path = "DATASET_PATH/Audio-Datasets"
if not os.path.exists(audio_datasets_path):
    print(f"Given {audio_datasets_path=} does not exist. Specify a valid path ending with 'Audio-Datasets' folder.")

dataset_dir = os.path.join(audio_datasets_path, "NS-Instruments")

# Download the dataset snapshot (contains NS-Instruments.zip) from the Hugging Face Hub.
huggingface_hub.snapshot_download(repo_id="MahiA/NS-Instruments", repo_type="dataset", local_dir=dataset_dir)

# Extract the archive in place.
zipfile_path = os.path.join(dataset_dir, "NS-Instruments.zip")
with zipfile.ZipFile(zipfile_path, "r") as zip_ref:
    zip_ref.extractall(dataset_dir)

# Flatten the nested 'NS-Instruments/NS-Instruments' folder produced by extraction,
# then remove the now-empty folder and the zip archive.
extracted_dir = os.path.join(dataset_dir, "NS-Instruments")
shutil.move(os.path.join(extracted_dir, "audios"), dataset_dir)
shutil.move(os.path.join(extracted_dir, "train.csv"), dataset_dir)
shutil.move(os.path.join(extracted_dir, "test.csv"), dataset_dir)
shutil.rmtree(extracted_dir)
os.remove(zipfile_path)
```
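
After the script above finishes, a quick sanity check such as the following can confirm the layout and that the audio decodes. This is only a sketch: it assumes `librosa` is installed and makes no assumption about how files are organized inside `audios`.

```python
import glob
import os

import librosa  # assumed to be installed: pip install librosa

dataset_dir = os.path.join("DATASET_PATH/Audio-Datasets", "NS-Instruments")

# The folder should now contain 'audios', 'train.csv' and 'test.csv'.
print(os.listdir(dataset_dir))

# Pick the first audio file found (recursively, in case 'audios' has subfolders)
# and decode it at its native sample rate.
audio_files = sorted(
    p for p in glob.glob(os.path.join(dataset_dir, "audios", "**", "*"), recursive=True)
    if os.path.isfile(p)
)
waveform, sample_rate = librosa.load(audio_files[0], sr=None)
print(audio_files[0], waveform.shape, sample_rate)
```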

## Acknowledgment

This dataset is a lightly processed and restructured version of the NSynth data originally released at [Source](https://magenta.tensorflow.org/datasets/nsynth).<br>
Please refer to the original source for its licensing details and any additional information.

## Contact

For questions or feedback, please create an issue.