Where are the weights?

#6
by m1kk0n - opened

Some weights of Qwen3ForSequenceClassification were not initialized from the model checkpoint at zeroentropy/zerank-2 and are newly initialized: ['score.weight']

same here!

>>> model = CrossEncoder("zeroentropy/zerank-2", trust_remote_code=True)
Loading weights:   1%|                | 2/398 [00:00<00:00, 3876.44it/s, Materializing param=model.layers.0.input_layernorm.weight]
Loading weights: 100%|████████████████| 398/398 [00:00<00:00, 5486.03it/s, Materializing param=model.norm.weight]
Qwen3ForSequenceClassification LOAD REPORT from: zeroentropy/zerank-2
Key          | Status  | 
-------------+---------+-
score.weight | MISSING | 

Notes:
- MISSING       :those params were newly initialized because missing from the checkpoint. Consider training on your downstream task.

So the classification head (`score.weight`) gets a random init...
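For anyone wondering what that load report means mechanically, here is a minimal stdlib sketch (not the actual transformers code, just an illustration): the loader compares the parameter names the model architecture expects against the keys present in the checkpoint, and any expected name absent from the checkpoint gets freshly randomized values, which is exactly what happened to `score.weight` here.

```python
import random

# Hypothetical loader sketch (NOT the transformers API): decide which
# expected parameters are covered by the checkpoint and which must be
# newly initialized.
def load_state(expected_params, checkpoint):
    """Copy checkpoint values into the model state; random-init anything missing."""
    state, missing = {}, []
    for name, size in expected_params.items():
        if name in checkpoint:
            state[name] = checkpoint[name]
        else:
            # Absent from the checkpoint -> fresh random values, so the
            # head produces meaningless scores until it is trained.
            state[name] = [random.gauss(0.0, 0.02) for _ in range(size)]
            missing.append(name)
    return state, missing

# Toy example: the sequence-classification model expects a scoring head,
# but the checkpoint only ships the backbone weights.
expected = {"model.norm.weight": 4, "score.weight": 4}
checkpoint = {"model.norm.weight": [1.0, 1.0, 1.0, 1.0]}  # no score.weight

state, missing = load_state(expected, checkpoint)
print(missing)  # -> ['score.weight']
```

In the real model the situation is the same: everything under `model.layers.*` and `model.norm.weight` materializes from the checkpoint, while `score.weight` ends up in the MISSING column and is randomly initialized.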
