Error while loading quantized models

#15
by kwojciechowski - opened

Error: Can't create a session. ERROR_CODE: 1, ERROR_MESSAGE: Deserialize tensor model.layers.23.attn.q_norm.layernorm.weight failed. Failed to load external data file "model.onnx_data", error: Out of bounds.
at me (ort.all.min.js:1802:10701)
at bi (ort.all.min.js:4615:21061)

Is this an issue with the ONNX Runtime library, or with the ONNX files themselves?

The unquantized models work, though.

The root cause was renaming model_*.extension to model.extension, which broke model loading: the .onnx file references its external data file by name, so after the rename the referenced file could no longer be resolved. My fault.

kwojciechowski changed discussion status to closed
