---
base_model:
- Delta-Vector/MS3.2-Austral-24B-SFT
---
# What is this
This is the KTO checkpoint of my MS3.2 Austral Winton train. Use the MS3.2 Winton train for the best experience.
wandb: https://wandb.ai/new-eden/austral/runs/2iaj6moy?nw=nwuserdeltavector
Datasets:
```
datasets:
- path: Delta-Vector/Tauri-IFeval-Dans-Tulu-KTO
split: train
type: chatml.argilla
- path: Delta-Vector/Tauri-Opus-accepted-hermes-rejected-shuffled
split: train
type: chatml.argilla
- path: Delta-Vector/Tauri-Opus-Accepted-GPT-Rejected-Opus-Writing-Prompts
split: train
type: chatml.argilla
- path: Delta-Vector/Tauri-Helpsteer3-Edit
split: train
type: chatml.argilla
- path: Delta-Vector/Tauri-Helpsteer-3-Preference-KTO
split: train
type: chatml.argilla
- path: NewEden/Purpura-Arkhaios-CC-KTO
split: train
type: chatml.argilla
- path: Delta-Vector/Tauri-KTO-Instruct-Mix
split: train
type: chatml.argilla
- path: Delta-Vector/Tauri-LIT-RL-KTO
split: train
type: chatml.argilla
- path: Delta-Vector/Tauri-Synth-1-KTO-R1-No-Think
split: train
type: chatml.argilla
```
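All of the datasets above use the `chatml.argilla` type, which maps preference pairs into ChatML-formatted turns. As a rough illustration of what that turn format looks like (the helper function and example messages here are mine, not part of the training config), a minimal sketch:

```python
# Illustrative sketch of the ChatML turn format that the `chatml.argilla`
# dataset type renders conversations into. The helper name and example
# messages are hypothetical; only the <|im_start|>/<|im_end|> tokens and
# role/content layout follow the ChatML convention.

def to_chatml(messages):
    """Render a list of {role, content} dicts as ChatML text."""
    return "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
        for m in messages
    )

example = to_chatml([
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi there!"},
])
print(example)
```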
Trained on 8xA100s using Axolotl. Thanks to my work & Auri <3