How to use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Envoid/LBonVent-12B")
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Envoid/LBonVent-12B")
model = AutoModelForCausalLM.from_pretrained("Envoid/LBonVent-12B")
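Once loaded, the model can be used for generation in the usual Transformers way. A minimal sketch (the prompt and sampling settings below are illustrative assumptions, not recommendations from the author; this is a 12B model, so half precision and a GPU are advisable):

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Envoid/LBonVent-12B")
model = AutoModelForCausalLM.from_pretrained(
    "Envoid/LBonVent-12B",
    torch_dtype=torch.float16,  # half precision to reduce memory for the 12B weights
    device_map="auto",          # requires accelerate; spreads layers across available devices
)

# Example prompt -- purely illustrative
prompt = "What is the nature of consciousness?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```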
Warning: This model may output adult content.

This model is the result of applying the Dendrite corpus to Undi95/Mistral-PetroLimaRP-v3-12B via a very high-rank LoRA (r=2048).

I haven't had much time to interrogate it, but it seems pretty good at both RP and philosophical discussion.
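For context, r=2048 is far above the typical LoRA rank (r=8–64 is common), so the adapter approaches a full-weight finetune in capacity. A sketch of what such a configuration might look like with the peft library (the target modules, alpha, and dropout are assumptions; the author has not published the training config):

```python
from peft import LoraConfig

# Hypothetical configuration -- only r=2048 comes from the model card.
lora_config = LoraConfig(
    r=2048,                   # unusually high rank, per the card's "very deep" description
    lora_alpha=4096,          # assumption: alpha = 2 * r
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed attention projections
    lora_dropout=0.05,        # assumed
    task_type="CAUSAL_LM",
)
```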
