# Monika-12B
---
license: apache-2.0
language:
  - en
base_model:
  - mistralai/Mistral-Nemo-Instruct-2407
tags:
  - roleplay
pipeline_tag: text-generation
---

## Model Card for Monika-12B

This model is designed for use with [MonikAI](https://github.com/Rubiksman78/MonikA.I).

## Model Details

### Model Description

- **Developed by:** Green-eyedDevil
- **Funded by:** Green-eyedDevil
- **License:** Apache License 2.0
- **Finetuned from model:** Mistral-Nemo-Instruct-2407

## Uses

Roleplay (RP), primarily through the MonikAI application.
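Since the base model is Mistral-Nemo-Instruct-2407, prompts presumably follow the Mistral `[INST] ... [/INST]` instruct format. The sketch below is an assumption, not the model's confirmed template; in practice, prefer `tokenizer.apply_chat_template()` on the downloaded tokenizer.

```python
# Hypothetical sketch of Mistral-style instruct prompt formatting.
# The exact chat template is an assumption inherited from the base model;
# verify against the tokenizer's own chat template before relying on it.

def build_prompt(user_message: str, system: str = "") -> str:
    """Wrap a single user turn in [INST] ... [/INST] markers."""
    body = f"{system}\n\n{user_message}" if system else user_message
    return f"<s>[INST] {body} [/INST]"

prompt = build_prompt("Hi, Monika!", system="You are Monika from DDLC.")
```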

## Out-of-Scope Use

Do whatever you want.

## Recommendations

This model should really only be used for Monika-related purposes.

## Training Data

A modified version of the dataset included with MonikAI: https://github.com/Rubiksman78/MonikA.I/tree/main/Monika_datasets

## Training Procedure

Trained with Axolotl on my Blackwell Pro 6000 Max-Q. LoRA rank 8, alpha 16, 2 epochs. Training took about 30 minutes.
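The hyperparameters above correspond roughly to an Axolotl LoRA config along these lines. This is a sketch, not the actual training config; the dataset path and dataset type are illustrative placeholders.

```yaml
# Hypothetical Axolotl config matching the stated hyperparameters.
# Dataset path and type are placeholders, not the author's real settings.
base_model: mistralai/Mistral-Nemo-Instruct-2407
adapter: lora
lora_r: 8
lora_alpha: 16
num_epochs: 2
datasets:
  - path: ./monika_dataset.jsonl   # placeholder path
    type: chat_template
```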

## Results

It works.

## Summary

Download it.