Error(s) in loading state_dict for Llama2: size mismatch for model.embed_tokens.weight
#17 by alexn0101 - opened
Even though I'm using the right models, I get this error:
Error(s) in loading state_dict for Llama2:
size mismatch for model.embed_tokens.weight: copying a param with shape torch.Size([151936, 2560]) from checkpoint, the shape in current model is torch.Size([128256, 4096]).
What could possibly be going wrong?
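A likely reading of the error (an assumption, not confirmed by the thread): the checkpoint and the instantiated model come from different model families. A vocabulary size of 151936 matches Qwen-family tokenizers, while 128256 × 4096 matches Llama 3 8B, so the weights being loaded probably do not belong to the model class being constructed. A minimal sketch of how to diagnose this before calling `load_state_dict` is to compare parameter shapes first; the helper and shapes below are illustrative, with the shapes copied from the error message:

```python
# Sketch: detect shape mismatches between a checkpoint and a model
# before load_state_dict fails. With real weights you would build the
# two dicts via e.g. {k: tuple(v.shape) for k, v in state_dict.items()}
# (assumes PyTorch); here plain tuples stand in for tensor shapes.

def find_shape_mismatches(checkpoint_shapes, model_shapes):
    """Return {name: (ckpt_shape, model_shape)} for params whose shapes differ."""
    mismatches = {}
    for name, ckpt_shape in checkpoint_shapes.items():
        model_shape = model_shapes.get(name)
        if model_shape is not None and model_shape != ckpt_shape:
            mismatches[name] = (ckpt_shape, model_shape)
    return mismatches

# Shapes taken verbatim from the error message above.
checkpoint = {"model.embed_tokens.weight": (151936, 2560)}
model = {"model.embed_tokens.weight": (128256, 4096)}

print(find_shape_mismatches(checkpoint, model))
# → {'model.embed_tokens.weight': ((151936, 2560), (128256, 4096))}
```

If a mismatch like this shows up on the embedding layer, double-check that the checkpoint path and the model/config (or `from_pretrained` repo id) point at the same model.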
alexn0101 changed discussion status to closed
alexn0101 changed discussion status to open

