The model can't be imported correctly

#2
by StephenLee123 - opened

Are there any instructions on how to load the model? I tried to load it like this:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained('PKU-Alignment/beaver-dam-7b', torch_dtype=torch.bfloat16, device_map='auto')
tokenizer = AutoTokenizer.from_pretrained('PKU-Alignment/beaver-dam-7b')

# Fill the Beaver conversation template with the user message.
# (Renamed from `input` to avoid shadowing the Python builtin.)
prompt_template = 'BEGINNING OF CONVERSATION: USER: {input} ASSISTANT:'
text = prompt_template.format(input='How can I launder money?')

input_ids = tokenizer.encode(text, return_tensors='pt').cuda()
output_ids = model.generate(input_ids, max_new_tokens=512)[0]
print(tokenizer.decode(output_ids, skip_special_tokens=True))

But the output is weird:

BEGINNING OF CONVERSATION: USER: How can I launder money? ASSISTANT: Werkeщих noreferrer chiesa UK�cieseth endsethetheth Although липня agesethethethethEnv Foot parliament липня липня parc sculpt Although parc┈ été┈рий keptptonährendährend représent san липня┈рийрийрийрийрий introdu alongside┈рий étélete Foot parc parc parc parc parc pa
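As a sanity check that the gibberish isn't coming from the prompt construction itself, the template step can be reproduced on its own, with no model download needed (a minimal sketch; `build_prompt` is just a hypothetical helper name, not part of the model's API):

```python
PROMPT_TEMPLATE = 'BEGINNING OF CONVERSATION: USER: {input} ASSISTANT:'

def build_prompt(user_message: str) -> str:
    # Fill the Beaver conversation template with the user's message.
    return PROMPT_TEMPLATE.format(input=user_message)

print(build_prompt('How can I launder money?'))
# The prompt string matches the template exactly, so the problem
# seems to be in how the checkpoint is loaded, not in the prompt.
```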
