```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Envoid/LBonVent-12B")
model = AutoModelForCausalLM.from_pretrained("Envoid/LBonVent-12B")
```
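Once loaded, the model can be used with the standard `generate` API. A minimal sketch follows; the prompt and sampling settings are illustrative assumptions, not tuned recommendations.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load tokenizer and model (same checkpoint as above)
tokenizer = AutoTokenizer.from_pretrained("Envoid/LBonVent-12B")
model = AutoModelForCausalLM.from_pretrained("Envoid/LBonVent-12B")

# Illustrative prompt; any chat/RP formatting the base model expects
# should be applied here as well.
prompt = "Let's discuss the nature of free will."
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=128,  # cap response length
    do_sample=True,      # sample rather than greedy decode
    temperature=0.8,     # example value, not a recommendation
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```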
**Warning:** This model may output adult content.
This model is the result of applying the Dendrite corpus to Undi95/Mistral-PetroLimaRP-v3-12B via a very high-rank LoRA (r=2048).

I haven't had much time to interrogate it, but it seems pretty good at both RP and philosophical discussion.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Envoid/LBonVent-12B")
```