Mismatch between config.json (bf16) and actual weight dtype (fp16)
Hi, thanks for sharing this model!
I noticed a small discrepancy regarding the precision type.
The config.json specifies `"torch_dtype": "bfloat16"`, but the actual tensors in the .safetensors files appear to be in float16 (meanwhile, the Tensor Type shown on the Hugging Face page is the correct "float16").
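For anyone who wants to verify this locally: the safetensors format stores each tensor's dtype in a JSON header at the start of the file, so a mismatch like this can be detected without loading any weights. Below is a minimal sketch in plain Python (no torch needed); the file it inspects is a tiny synthetic example written by the script itself, not the actual model shard.

```python
import json
import os
import struct
import tempfile

# Build a tiny .safetensors file by hand to illustrate the layout:
# an 8-byte little-endian header length, then a JSON header, then raw data.
# The single tensor here is stored as F16 (float16): 4 half-precision zeros.
data = bytes(8)  # 4 fp16 values * 2 bytes each
header = {"weight": {"dtype": "F16", "shape": [4], "data_offsets": [0, len(data)]}}
header_bytes = json.dumps(header).encode("utf-8")

path = os.path.join(tempfile.mkdtemp(), "model.safetensors")
with open(path, "wb") as f:
    f.write(struct.pack("<Q", len(header_bytes)))
    f.write(header_bytes)
    f.write(data)

def tensor_dtypes(st_path):
    """Read only the safetensors header and return {tensor_name: dtype_code}."""
    with open(st_path, "rb") as f:
        (n,) = struct.unpack("<Q", f.read(8))
        meta = json.loads(f.read(n))
    return {k: v["dtype"] for k, v in meta.items() if k != "__metadata__"}

dtypes = tensor_dtypes(path)
config_dtype = "bfloat16"  # what config.json claimed before the fix
# Map config.json dtype names to safetensors dtype codes (subset)
expected = {"bfloat16": "BF16", "float16": "F16", "float32": "F32"}[config_dtype]
mismatched = {k: d for k, d in dtypes.items() if d != expected}
print(mismatched)  # {'weight': 'F16'} -> config says BF16, weights are F16
```

Against the real checkpoint you would point `tensor_dtypes` at each downloaded shard; every tensor reporting `F16` while config.json says `bfloat16` confirms the discrepancy.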
Thank you for pointing this out! We have corrected the torch_dtype field in config.json from bfloat16 to float16 and re-uploaded the updated file. We confirmed that the previous configuration did not affect normal usage, but we made this correction to avoid any confusion. Thank you again for your careful feedback; please feel free to reach out if you notice anything else or have suggestions.
Thanks for the suggestion! The current tensor type shown on the Hugging Face page is already aligned with the model, so things should be in good shape now. Really appreciate you pointing this out.
