---
library_name: transformers
license: apache-2.0
datasets:
- schneewolflabs/Athanorlite-DPO
base_model:
- nbeerbower/Schreiber-mistral-nemo-12B
---
# Merlina-ORPO-12B

This is the same training run as [schneewolflabs/A0l-12B](https://huggingface.co/schneewolflabs/A0l-12B) but with a custom ORPO implementation and `beta=0.1`.
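For reference, the per-example ORPO objective combines the usual NLL loss on the chosen response with a `beta`-weighted log odds-ratio penalty. The sketch below is illustrative only, assuming sequence-average log-probabilities as inputs; it is not the custom implementation used for this model, and the function names are hypothetical.

```python
import math

def log_odds(logp):
    # log odds of a response given its log-probability logp:
    # log(p / (1 - p)), computed stably in log space via log1p
    return logp - math.log1p(-math.exp(logp))

def orpo_loss(logp_chosen, logp_rejected, beta=0.1):
    # ORPO: NLL on the chosen response plus a beta-weighted
    # odds-ratio term, -log sigmoid(log_odds(chosen) - log_odds(rejected))
    ratio = log_odds(logp_chosen) - log_odds(logp_rejected)
    penalty = -math.log(1.0 / (1.0 + math.exp(-ratio)))
    return -logp_chosen + beta * penalty
```

With `beta=0.1` the odds-ratio penalty contributes only lightly relative to the NLL term, nudging the model away from rejected responses without dominating the language-modeling loss.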