---
license: cc-by-nc-sa-4.0
language:
- en
- ja
- nl
- de
- zh
base_model:
- mistralai/Mistral-Nemo-Instruct-2407
library_name: transformers
---
|
|
## Hypernova-2x12B-exp |
|
|
This is my first time working with Mixture of Experts (MoE) models.
|
|
|
|
|
I have not had the time to try this model yet, but it took too much time and too many resources not to upload it.
|
|
|
|
|
I plan to improve this model in the future, so feedback is welcome.
|
|
|
|
|
This model is mainly focused on RP and may produce NSFW content.
|
|
|
|
|
It is a 2x12B model that loads with roughly the footprint of a single 12B model, yet it still appears to use both experts.
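Since the card declares `library_name: transformers`, the full-precision weights should load like any other causal LM. Below is a minimal sketch, assuming the repo id `theNovaAI/Hypernova-2x12B-exp` (inferred from the GGUF repo name) and a bfloat16/multi-GPU setup; adjust for your hardware.

```python
# Minimal loading sketch for the merged 2x12B model.
# The repo id and dtype here are assumptions; adjust for your hardware.
MODEL_ID = "theNovaAI/Hypernova-2x12B-exp"

def load_model(model_id: str = MODEL_ID):
    # Imported lazily so the sketch can be read without the heavy deps installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # halves memory vs. fp32
        device_map="auto",           # spread layers across available GPUs
    )
    return tokenizer, model
```

`device_map="auto"` lets `accelerate` place layers across whatever GPUs (or CPU) are available, which matters for a 12B-footprint model.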
|
|
|
|
|
Get the GGUFs here: [theNovaAI/Hypernova-2x12B-exp-GGUF](https://huggingface.co/theNovaAI/Hypernova-2x12B-exp-GGUF) |