---
license: agpl-3.0
datasets:
- lemonilia/LimaRP
- grimulkan/theory-of-mind
- Epiculous/Gnosis
- ChaoticNeutrals/Synthetic-RP
- ChaoticNeutrals/Synthetic-Dark-RP
---

# QuantFactory/Mika-7B-GGUF
This is a quantized version of [Epiculous/Mika-7B](https://huggingface.co/Epiculous/Mika-7B) created using llama.cpp.
# Original Model Card
Mika (named after what my Claude-3 Opus chat called itself) is a model trained in a similar manner to Fett-uccine, with synthetic RP data created by Claude also included.
## Format
I've had the best results with the ChatML context template and the Mistral Instruct template; however, YMMV.
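As a rough illustration of the ChatML side of that recommendation, here is a minimal sketch of how a ChatML-style prompt is assembled. The helper name and the system message are illustrative, not part of this model card; the `<|im_start|>`/`<|im_end|>` delimiters follow the standard ChatML convention.

```python
def build_chatml_prompt(system: str, user: str) -> str:
    """Wrap a system message and a single user turn in ChatML delimiters,
    leaving the prompt open at the assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example (system message is purely illustrative):
prompt = build_chatml_prompt(
    "You are Mika, a roleplay partner.",
    "Hello! Ready to start?",
)
print(prompt)
```

The resulting string is what you would pass to a llama.cpp-based loader as the raw prompt when not using a built-in chat template.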