Qwen 2.5-7B Physics-5k

Warning:

  • This model is experimental.
  • Based on the tests I have done, the model tends to hallucinate a lot; fact-check everything. An improved version of the model is in progress.

Training

The model was trained on 5k rows of the camelai/physics dataset.

All dataset credit goes to its authors.

Training parameters

  • LoRA rank (r): 16
  • LoRA alpha: 32
  • Epochs: 1
  • Max sequence length: 2048
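As a sketch, the parameters above can be expressed in PEFT-style naming. This is illustrative only, not the actual training script: the target modules are an assumption and are not stated in this card.

```python
# Illustrative LoRA hyperparameters from this card, in PEFT-style naming.
# NOTE: target_modules is an assumption; the card only states
# r, alpha, epochs, and max sequence length.
lora_config = {
    "r": 16,            # LoRA rank
    "lora_alpha": 32,   # scaling numerator; effective scale = alpha / r
    "target_modules": ["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed
}
training_config = {
    "num_train_epochs": 1,
    "max_seq_length": 2048,
}

# LoRA scales each adapter update by alpha / r, here 32 / 16 = 2.0.
scale = lora_config["lora_alpha"] / lora_config["r"]
print(scale)  # → 2.0
```

An alpha of twice the rank (scale 2.0) is a common default in LoRA fine-tuning recipes.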

Official quantizations

| Name | Quant Method | Bits | Size | Description |
| --- | --- | --- | --- | --- |
| qwen-2.5-physics.Q4_K_M.gguf | Q4_K_M | 4 | 4.68 GB | |
| qwen-2.5-physics.Q6_K.gguf | Q6_K | 6 | 6.25 GB | |
| qwen-2.5-physics.Q8_0.gguf | Q8_0 | 8 | 8.1 GB | |

Further quantizations may be added soon.
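As a rough sanity check, the effective bits per weight of each quant can be estimated from the file sizes in the table. The ~7.6B parameter count below is an assumption based on the Qwen2.5-7B base model, not a figure from this card, and GGUF metadata overhead is ignored.

```python
# Rough bits-per-weight estimate for each quant, from file size alone.
# Assumes ~7.6e9 parameters (Qwen2.5-7B base); GGUF overhead is ignored,
# so these figures slightly overestimate the true per-weight cost.
PARAMS = 7.6e9  # assumed parameter count

quant_sizes_bytes = {
    "Q4_K_M": 4.68e9,  # file sizes from the table above
    "Q6_K": 6.25e9,
    "Q8_0": 8.10e9,
}

bits_per_weight = {
    name: size * 8 / PARAMS for name, size in quant_sizes_bytes.items()
}
for name, bpw in bits_per_weight.items():
    print(f"{name}: ~{bpw:.1f} bits/weight")
```

The estimates land close to each quant's nominal bit width (e.g. Q4_K_M comes out near 5 bits/weight, since K-quants mix precisions across tensors), which suggests the listed sizes are plausible.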

Format: Safetensors · Model size: 8B params · Tensor type: BF16
Model tree for vaclavak/qwen2.5-physics

Base model: Qwen/Qwen2.5-7B (fine-tuned into this model)
Quantizations: 3 models
