# Model Card for SkyAsl/Qwen-3-4B-Math_Solver

Load the model directly:

```python
# Load model directly (AutoModelForCausalLM, since this is a text-generation model)
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("SkyAsl/Qwen-3-4B-Math_Solver", dtype="auto")
```

## Training Data
https://huggingface.co/datasets/meta-math/MetaMathQA
## Training Hyperparameters

- batch_size = 8, epochs = 1, learning_rate = 1e-4
- LoRA: r = 16, lora_alpha = 32, lora_dropout = 0.05
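The LoRA settings above correspond to a PEFT adapter configuration along these lines (a sketch: `task_type` and the target modules are assumptions, since the card only states `r`, `lora_alpha`, and `lora_dropout`):

```python
from peft import LoraConfig

# Adapter settings as listed on the card; task_type="CAUSAL_LM" is an
# assumption appropriate for a Qwen text-generation model.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
```

This config would then be applied to the base model with `peft.get_peft_model` before training.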
## Metrics

```text
train_runtime:            729.5559 s
train_samples_per_second: 9.746
train_steps_per_second:   0.306
total_flos:               7.949170591137792e+16
train_loss:               2.817356810976037
epoch:                    1.0
```
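As a sanity check, the throughput numbers are internally consistent: runtime times samples-per-second gives roughly 7,110 examples seen in the single epoch, and samples-per-second divided by steps-per-second gives an effective batch of about 32, which with batch_size = 8 would imply gradient accumulation (an assumption; the card does not state it):

```python
# Derived quantities from the reported training metrics.
train_runtime = 729.5559    # seconds
samples_per_sec = 9.746
steps_per_sec = 0.306

total_samples = train_runtime * samples_per_sec    # examples seen in 1 epoch
total_steps = train_runtime * steps_per_sec        # optimizer steps
effective_batch = samples_per_sec / steps_per_sec  # samples per optimizer step

print(round(total_samples), round(total_steps), round(effective_batch))
# → 7110 223 32
```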
## Use with a pipeline

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="SkyAsl/Qwen-3-4B-Math_Solver")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```