Some weights of the model checkpoint at relaxml/Llama-2-7b-E8P-2Bit were not used when initializing LlamaForCausalLM

#4 by pkhara - opened

Some weights of the model checkpoint at relaxml/Llama-2-7b-E8P-2Bit were not used when initializing LlamaForCausalLM:

This IS expected if you are initializing LlamaForCausalLM from the checkpoint of a model trained on another task or with another architecture

This IS NOT expected if you are initializing LlamaForCausalLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model)

Some weights of LlamaForCausalLM were not initialized from the model checkpoint at relaxml/Llama-2-7b-E8P-2Bit and are newly initialized:

You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

How do I solve this issue?

My code:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("relaxml/Llama-2-7b-E8P-2Bit")
model = AutoModelForCausalLM.from_pretrained("relaxml/Llama-2-7b-E8P-2Bit")
```
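From what I can tell, the warning means `from_pretrained` cannot match the checkpoint's tensors to the standard `LlamaForCausalLM` parameter names, so it drops them and randomly re-initializes those layers. A quick sketch to list what the checkpoint actually stores (the weights file name is an assumption on my part; check the repo's Files tab):

```python
# Sketch: print the tensor names stored in the checkpoint so they can be
# compared against standard LlamaForCausalLM parameter names.
# NOTE: "pytorch_model.bin" is an assumption; the repo may ship
# safetensors shards instead -- check its Files tab.
import torch
from huggingface_hub import hf_hub_download

path = hf_hub_download("relaxml/Llama-2-7b-E8P-2Bit", "pytorch_model.bin")
state_dict = torch.load(path, map_location="cpu")
for name in list(state_dict)[:20]:
    print(name)
```

If those names are QuIP#-specific quantized parameters rather than ordinary `weight`/`bias` tensors, vanilla `transformers` has nowhere to put them, which would explain both warnings.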

@at676 Could you please take a look at this issue and help?
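Update, in case it helps anyone else: I believe the E8P 2-bit checkpoints are QuIP# models that are meant to be loaded through the Cornell RelaxML quip-sharp codebase rather than vanilla `transformers`. Below is a minimal sketch of what I think the intended load path looks like; `model_from_hf_path` comes from `lib/utils/unsafe_import.py` in the quip-sharp repo, and its exact signature and return values are assumptions to verify against that repo's README:

```python
# Hedged sketch, assuming the quip-sharp repo is cloned and on PYTHONPATH:
#   git clone https://github.com/Cornell-RelaxML/quip-sharp
# model_from_hf_path is the loader used by quip-sharp's own eval scripts;
# treat its exact signature/return values as assumptions -- verify in the repo.
from transformers import AutoTokenizer
from lib.utils.unsafe_import import model_from_hf_path

model, model_str = model_from_hf_path("relaxml/Llama-2-7b-E8P-2Bit")
tokenizer = AutoTokenizer.from_pretrained(model_str)  # model_str: base model path, I believe
```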
