---
license: cc-by-nc-4.0
---
This is my first attempt at a QLoRA fine-tune of Mixtral 8x7B Instruct v0.1, using the same raw-text dataset as the Dendrite model, with the following notable settings:
```
lora_r 256
lora_alpha 256
..._max_len 256
learning_rate 0.000001
num_train_epochs 2
```
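
For readers who want to reproduce something similar, here is a minimal sketch of how these settings might map onto the Hugging Face PEFT / bitsandbytes QLoRA stack. This is not the actual training script: only `r`, `alpha`, the learning rate, and the epoch count come from the settings above; the target modules, dropout, batch size, and quantization details are assumptions, and the truncated `..._max_len` flag is left out.

```python
# A minimal QLoRA sketch, not the exact script used for this run.
# Assumptions are marked in comments.
import torch
from transformers import (
    AutoModelForCausalLM,
    BitsAndBytesConfig,
    TrainingArguments,
)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization is what makes this QLoRA rather than plain LoRA.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=256,              # lora_r from the settings above
    lora_alpha=256,     # lora_alpha from the settings above
    lora_dropout=0.05,  # assumed; not stated in the card
    task_type="CAUSAL_LM",
    # Assumed target modules; the card does not say which were adapted.
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
model = get_peft_model(model, lora_config)

training_args = TrainingArguments(
    output_dir="mixtral-qlora",
    learning_rate=1e-6,             # learning_rate from the settings above
    num_train_epochs=2,             # num_train_epochs from the settings above
    per_device_train_batch_size=1,  # assumed
    bf16=True,
)
```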

The output is a little less dry, but it still maintains the full level of functionality you would expect from Mixtral Instruct, and it still responds to the

```
[INST]
Do a thing
[/INST]
```

format. 
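As a quick illustration, a generation call in this format might look like the sketch below. The pipeline route and sampling settings are assumptions, not this card's setup, and loading the adapter on top of the base model is elided.

```python
# Illustrative use of the [INST] ... [/INST] format via a transformers
# pipeline. Applying this adapter on top of the base model is elided;
# see the QLoRA sketch above for the quantized loading path.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",  # base model only
)

prompt = "[INST]\nDo a thing\n[/INST]"
result = generator(prompt, max_new_tokens=256, do_sample=True)
print(result[0]["generated_text"])
```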
It's not a massive change in the output, but I do plan a similar run with a larger dataset, more epochs, and a higher learning rate.