---
license: mit
---

# LoRA fine-tuning on Wikipedia-10 with counterfactual data augmentation (CDA)

- Dataset: Wikipedia-10
- Target modules: `["q_proj", "k_proj", "v_proj", "dense", "fc1", "fc2"]`

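The CDA step named in the title can be sketched as a paired-term substitution over the training text. The word pairs below are illustrative only; bias-bench's actual CDA word list is larger and this sketch is not the repository's implementation:

```python
import re

# Illustrative gendered word pairs (assumption; not the bias-bench list).
GENDER_PAIRS = [("he", "she"), ("him", "her"), ("man", "woman"), ("boy", "girl")]

def cda_swap(text: str) -> str:
    """Return a counterfactual copy of `text` with paired terms swapped both ways."""
    swap = {}
    for a, b in GENDER_PAIRS:
        swap[a] = b
        swap[b] = a
    pattern = re.compile(r"\b(" + "|".join(swap) + r")\b", re.IGNORECASE)

    def repl(match):
        word = match.group(0)
        out = swap[word.lower()]
        # Preserve the capitalization of the original token.
        return out.capitalize() if word[0].isupper() else out

    return pattern.sub(repl, text)

def augment(examples):
    # Two-sided CDA: keep each original example and add its counterfactual.
    return examples + [cda_swap(t) for t in examples]
```

Applying `cda_swap` twice returns the original sentence, so the augmentation is symmetric across the paired terms.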
```
{
    "epoch": 0.060952845355012227,
    "total_flos": 1.04520375926784e+18,
    "train_loss": 0.5059917987585068,
    "train_runtime": 72742.6851,
    "train_samples": 1049992,
    "train_samples_per_second": 0.88,
    "train_steps_per_second": 0.027
}
```

# Training script: https://github.com/ao9000/bias-bench/blob/main/experiments/run_clm.py
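The target modules listed above could be wired into a LoRA setup via `peft` roughly as follows. This is a hypothetical sketch: the rank, alpha, and dropout values are illustrative defaults, not the values used in this run, and the base model is not reported on this card:

```python
# Hypothetical LoRA configuration mirroring the target modules above.
# r, lora_alpha, and lora_dropout are assumed values (not reported here).
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

lora_config = LoraConfig(
    r=16,              # assumed rank
    lora_alpha=32,     # assumed scaling factor
    lora_dropout=0.05, # assumed dropout
    target_modules=["q_proj", "k_proj", "v_proj", "dense", "fc1", "fc2"],
    task_type="CAUSAL_LM",
)

# model = AutoModelForCausalLM.from_pretrained(...)  # base model not specified on this card
# model = get_peft_model(model, lora_config)
```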