Tags: text-generation, transformers, safetensors, mistral, axolotl, finetune, roleplaying, RP, conversational, text-generation-inference
## Load model directly

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Delta-Vector/Francois-PE-V2")
model = AutoModelForCausalLM.from_pretrained("Delta-Vector/Francois-PE-V2")

messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```

## Quick Links
Francois-PE
This is the base model for Francois-PE-Huali; I'd recommend using that one instead. This model is extremely underfit, which the follow-up KTO pass fixed, making it far more coherent.
List of datasets:
- PocketDoc/Dans-Personamaxx-VN
- NewEden/LIMARP-Complexity
- NewEden/PIPPA-Mega-Filtered
- NewEden/OpenCAI-ShareGPT
- NewEden/Creative_Writing-Complexity
- NewEden/Light-Novels-Roleplay-Logs-Books-Oh-My-duplicate-turns-removed
- PocketDoc/Dans-Failuremaxx-Adventure-3
- NewEden/Books-V2-ShareGPT
- NewEden/Deepseek-V3-RP-Filtered
- NewEden/BlueSky-10K-Complexity
- NewEden/Final-Alpindale-LNs-ShareGPT
- NewEden/DeepseekRP-Filtered
- NewEden/RP-logs-V2-Experimental
- anthracite-org/kalo_opus_misc_240827
- anthracite-org/kalo_misc_part2
- NewEden/vanilla-backrooms-claude-sharegpt
- NewEden/Storium-Prefixed-Clean
- NewEden/KTO-IF-Dans
- NewEden/KTO-Instruct-Mix
- NewEden/Opus-accepted-hermes-rejected-shuffled
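The last few datasets are KTO preference data. Unlike DPO, KTO does not require paired chosen/rejected completions: each example carries a single binary desirability label. A minimal sketch of that record shape (field names here are illustrative, not the actual schema of these datasets):

```python
# Illustrative KTO-style records: each completion is labeled desirable
# (True) or undesirable (False) on its own, with no paired counterpart
# required. Field names are hypothetical, not the datasets' real schema.
kto_records = [
    {
        "prompt": "Describe the abandoned lighthouse.",
        "completion": "Salt-bleached boards groaned underfoot as the door swung open...",
        "label": True,   # desirable: in-character, descriptive prose
    },
    {
        "prompt": "Describe the abandoned lighthouse.",
        "completion": "As an AI, I cannot visit lighthouses.",
        "label": False,  # undesirable: breaks the roleplay
    },
]

# The trainer treats the two label classes separately rather than
# comparing them pairwise.
desirable = [r for r in kto_records if r["label"]]
undesirable = [r for r in kto_records if not r["label"]]
print(len(desirable), len(undesirable))  # -> 1 1
```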
KTO wandb: https://wandb.ai/new-eden/Francois-V2/runs/f2qejmu0?nw=nwuserdeltavector

SFT wandb: https://wandb.ai/new-eden/Francois/runs/sn3utrs1?nw=nwuserdeltavector
Model tree for Delta-Vector/Francois-PE-V2:
- Base model: mistralai/Mistral-Nemo-Base-2407
## Use a pipeline as a high-level helper

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="Delta-Vector/Francois-PE-V2")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```
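Both snippets delegate prompt formatting to the tokenizer's chat template. Since the base model is Mistral-Nemo, that template resolves to Mistral's `[INST]` instruct format; below is a hand-rolled sketch for illustration only (exact spacing and special tokens depend on the template bundled with the tokenizer, so prefer `apply_chat_template` in real use):

```python
# Illustrative only: approximates Mistral-style instruct formatting.
# In practice, call tokenizer.apply_chat_template, which applies the
# exact template shipped with the model.
def build_prompt(messages):
    parts = ["<s>"]
    for msg in messages:
        if msg["role"] == "user":
            parts.append(f"[INST]{msg['content']}[/INST]")
        elif msg["role"] == "assistant":
            # Assistant turns are closed with the EOS token
            parts.append(msg["content"] + "</s>")
    return "".join(parts)

print(build_prompt([{"role": "user", "content": "Who are you?"}]))
# -> <s>[INST]Who are you?[/INST]
```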