---
license: bigscience-openrail-m
datasets:
- iamplus/Instruction_Tuning
---
Instruction-tuned Bloomz-7B1 model on the Stanford Alpaca-2 Instruction Tuning dataset (52k examples, with outputs from ChatGPT), trained using ***Colossal AI***.

**Base Model:** bigscience/bloomz-7b1

**Training Details:**
* Epochs: 5
* Batch Size: 16 per device × 1 gradient accumulation step × 8 GPUs = 128 effective
* Max Length: 1024
* Weight Decay: 0
* Learning Rate: 2e-5
* Learning Rate Scheduler Type: Cosine
* Number of Warmup Steps: 30
* Machine: 8×A100 80GB
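
The effective batch size above follows the standard formula for distributed training. A minimal sketch of the arithmetic (variable names are illustrative, not from the training config):

```python
# Effective batch size in data-parallel training:
# per-device batch * gradient accumulation steps * number of GPUs
per_device_batch = 16
grad_accum_steps = 1
num_gpus = 8

effective_batch = per_device_batch * grad_accum_steps * num_gpus
print(effective_batch)  # 128
```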
**Dataset Details:**

Dataset: iamplus/Instruction_Tuning

Files:
* stanford_alpaca_it_v2.csv