---
license: mit
---



# NS-Instruments

This is an audio classification dataset for **Musical Instrument Classification**.

**Classes = 10, Split = Train-Test**

## Structure

- The `audios` folder contains the audio files.
- `train.csv` holds the training split and `test.csv` the testing split.
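Once the dataset is in place, each split CSV can be read with the standard library. This is a minimal sketch; the column names `filename` and `label` (and the mock rows) are assumptions for illustration — inspect the header of `train.csv` for the actual schema.

```python
import csv
import os
import tempfile

def load_split(csv_path, audio_dir):
    """Read a split CSV and attach the full audio path to each row.

    Assumes columns named "filename" and "label"; check the real CSV header.
    """
    rows = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            row["path"] = os.path.join(audio_dir, row["filename"])
            rows.append(row)
    return rows

# Demonstrate on a mock CSV mirroring the assumed schema.
tmp = tempfile.mkdtemp()
csv_path = os.path.join(tmp, "train.csv")
with open(csv_path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["filename", "label"])
    writer.writerow(["guitar_001.wav", "guitar"])

rows = load_split(csv_path, os.path.join(tmp, "audios"))
print(rows[0]["label"])  # prints: guitar
```

The returned dicts can then be fed to any audio loader that takes a file path.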

## Download
```python
import os
import zipfile
import shutil
import huggingface_hub

audio_datasets_path = "DATASET_PATH/Audio-Datasets"
if not os.path.exists(audio_datasets_path):
    raise FileNotFoundError(f"{audio_datasets_path=} does not exist. Specify a valid path ending with the 'Audio-Datasets' folder.")

# Download the repository snapshot into Audio-Datasets/NS-Instruments.
dataset_dir = os.path.join(audio_datasets_path, "NS-Instruments")
huggingface_hub.snapshot_download(repo_id="MahiA/NS-Instruments", repo_type="dataset", local_dir=dataset_dir)

# Extract the archive, then flatten the nested NS-Instruments/NS-Instruments folder.
zipfile_path = os.path.join(dataset_dir, "NS-Instruments.zip")
with zipfile.ZipFile(zipfile_path, "r") as zip_ref:
    zip_ref.extractall(dataset_dir)
nested_dir = os.path.join(dataset_dir, "NS-Instruments")
for item in ("audios", "train.csv", "test.csv"):
    shutil.move(os.path.join(nested_dir, item), dataset_dir)
shutil.rmtree(nested_dir)
os.remove(zipfile_path)
```

## Acknowledgment

This dataset is a slightly processed/restructured version of data originally released as [NSynth](https://magenta.tensorflow.org/datasets/nsynth).<br>
Please refer to that source for licensing details and any additional information.

## Contact
For questions or feedback, please create an issue.