{
    "lora_r": 4,
    "lora_alpha": 16,
    "lora_dropout": 0.1,
    "target_modules": ["attn2.to_q", "attn2.to_v"]
}