A test LoRA for Mistral 7B Instruct v0.3, fine-tuned to wish you a happy New Year 2026. It is undertrained (based on my quick review, there was not enough data), so rather than asking "Is it 2026 today?", try inputs like "What year is it" or "New Year's Eve".

Example:

Input: Happy New Year!

With the LoRA: Happy new year 2026! Wishing you joy, success, and peace in the year ahead.

No LoRA: Thank you! I'm glad to assist you in the new year.

It is an expansion of 0xemx9ed4y77/Happy_2026_1b; nobody else would train such models, would they?
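A minimal sketch of trying the adapter yourself with `transformers` and `peft`, assuming the adapter weights load directly via `PeftModel.from_pretrained` from this repo (untested); the `[INST] ... [/INST]` wrapper is the Mistral Instruct prompt format:

```python
def build_prompt(user_message: str) -> str:
    """Wrap a user message in the Mistral Instruct [INST] template."""
    return f"[INST] {user_message} [/INST]"

def load_model():
    # Assumption: the adapter applies on top of the base model via peft.
    # Kept inside a function because loading triggers a large download.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base_id = "mistralai/Mistral-7B-Instruct-v0.3"
    adapter_id = "0xemx9ed4y77/Happy_2026_7b_LoRA"
    tokenizer = AutoTokenizer.from_pretrained(base_id)
    model = AutoModelForCausalLM.from_pretrained(base_id)
    model = PeftModel.from_pretrained(model, adapter_id)
    return tokenizer, model

if __name__ == "__main__":
    # Inputs like these are the ones the adapter was trained to react to.
    print(build_prompt("What year is it"))  # [INST] What year is it [/INST]
```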

Format: GGUF · Model size: 21M params · Architecture: llama