How to use astom-M/lora-sft-v5 with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("unsloth/qwen3-4b-instruct-2507-unsloth-bnb-4bit")
model = PeftModel.from_pretrained(base_model, "astom-M/lora-sft-v5")
```

LoRA adapter fine-tuned from Qwen/Qwen3-4B-Instruct-2507 using QLoRA (4-bit quantization via Unsloth). This repository contains the LoRA adapter weights only; the base model must be loaded separately.
The adapter is intended to improve structured-output accuracy (JSON / YAML / XML / TOML / CSV). During training, the loss was applied only to the final assistant output; chain-of-thought (CoT) tokens were masked.
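The CoT masking described above is commonly implemented by setting the label of every token outside the final assistant output to `-100`, the index that PyTorch's cross-entropy loss ignores. The following is a minimal sketch of that idea with toy token ids, assuming a simple prompt-then-output layout; it is an illustration, not the author's actual training code:

```python
IGNORE_INDEX = -100  # torch.nn.CrossEntropyLoss ignores labels equal to -100

def mask_labels(input_ids, output_start):
    """Copy input_ids into labels, masking every token before output_start.

    Tokens belonging to the prompt and the chain-of-thought get IGNORE_INDEX,
    so the loss (and gradients) come only from the final assistant output.
    """
    labels = list(input_ids)
    for i in range(min(output_start, len(labels))):
        labels[i] = IGNORE_INDEX
    return labels

# Toy sequence: [prompt tokens..., CoT tokens..., final output tokens...]
input_ids = [101, 7, 8, 9, 42, 43, 44]
labels = mask_labels(input_ids, output_start=4)
print(labels)  # [-100, -100, -100, -100, 42, 43, 44]
```

The same pattern is what collators such as those in TRL's SFT tooling apply per example before batching.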
Training data: u-10bei/structured_data_with_cot_dataset_512_v2 (MIT License).
Base model: Qwen/Qwen3-4B-Instruct-2507