---
license: mit
---
LoRA fine-tuning on Wikipedia-10 with counterfactual data augmentation (CDA).

Dataset: Wikipedia-10

Target modules: `["q_proj", "k_proj", "v_proj", "dense", "fc1", "fc2"]`
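
A minimal sketch of attaching a LoRA adapter over these target modules with PEFT; the base model checkpoint, rank, alpha, and dropout below are illustrative assumptions, not values stated in this card:

```python
# Sketch: wrap a causal LM with a LoRA adapter targeting the modules listed above.
# The base model, r, lora_alpha, and lora_dropout are assumed for illustration.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("microsoft/phi-2")  # assumed base model

lora_config = LoraConfig(
    r=16,                # assumed rank
    lora_alpha=32,       # assumed scaling factor
    lora_dropout=0.05,   # assumed dropout
    target_modules=["q_proj", "k_proj", "v_proj", "dense", "fc1", "fc2"],
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```

Training results: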
```json
{
    "epoch": 0.22326645805206993,
    "total_flos": 1.04520375926784e+18,
    "train_loss": 2.3526821104288103,
    "train_runtime": 66080.1443,
    "train_samples": 286653,
    "train_samples_per_second": 0.969,
    "train_steps_per_second": 0.03
}
```
Training script: https://github.com/ao9000/bias-bench/blob/main/experiments/run_clm.py
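
For context, CDA duplicates training sentences with demographic terms swapped so the model sees both variants. A simplified sketch follows; the word-pair list and whitespace tokenization are illustrative, not the exact preprocessing used here:

```python
# Simplified counterfactual data augmentation (CDA) sketch:
# each sentence is duplicated with gendered terms swapped (case ignored).
# The swap list is illustrative only.
SWAP = {
    "he": "she", "she": "he",
    "him": "her", "her": "his",
    "his": "her",
    "man": "woman", "woman": "man",
}

def counterfactual(sentence: str) -> str:
    """Return the sentence with gendered words swapped."""
    return " ".join(SWAP.get(tok.lower(), tok) for tok in sentence.split())

corpus = ["He wrote his thesis while she reviewed it."]
augmented = corpus + [counterfactual(s) for s in corpus]
# -> ['He wrote his thesis while she reviewed it.',
#     'she wrote her thesis while he reviewed it.']
```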