---
license: apache-2.0
---
I'm back! :D
A mergekit-made MoE built entirely from Apache-licensed models, so this lil guy is commercially usable, unlike my prior models.
Models used:
- cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
- lvkaokao/mistral-7b-finetuned-orca-dpo-v2
- Herman555/Hexoteric-AshhLimaRP-Mistral-7B-GGUF
- jondurbin/bagel-dpo-7b-v0.1
| - jondurbin/bagel-dpo-7b-v0.1 |
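
For reference, a mergekit-moe merge of models like these is driven by a YAML config. The sketch below is illustrative only: the base model choice, `gate_mode`, dtype, and `positive_prompts` are my assumptions, not the actual recipe used for this model.

```yaml
# Hypothetical mergekit-moe config -- base model, gate mode, and
# prompts are illustrative assumptions, not the actual recipe.
base_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
gate_mode: hidden        # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
    positive_prompts:
      - "You are a helpful assistant."
  - source_model: lvkaokao/mistral-7b-finetuned-orca-dpo-v2
    positive_prompts:
      - "Reason through this step by step."
  - source_model: Herman555/Hexoteric-AshhLimaRP-Mistral-7B-GGUF
    positive_prompts:
      - "Write a story."
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "Follow the instructions below."
```

With mergekit installed, a config like this is run with `mergekit-moe config.yaml ./output-model`.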