Output is messy, am I doing something wrong?
#5 opened by Michelangiolo
Hi, I've tried several times but I keep getting repetitions and messy output. Should I change something in the prompt?
inputs = tokenizer(
    """<start_of_turn>user
Write a short poem on the sun<end_of_turn>
<start_of_turn>model""",
    return_tensors="pt",
)
outputs = model.generate(
    **inputs,
    max_new_tokens=556,
    temperature=0.7,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
OUTPUT:
user
Write a short poem on the sun
model
Write a short poem on the moon
หน่วย
หน่วย
write a short poem on the moon
หน่วย
Write a short poem on the moon
หน่วย
write a short poem on the moon
หน่วย
write a short poem on the moon
หน่วย
write a short poem on the moon
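For anyone landing here later: one likely cause of this kind of looping output is hand-building the turn markers. The Gemma-style chat format expects a newline after `<start_of_turn>model`, and `tokenizer.apply_chat_template(..., add_generation_prompt=True)` will build the prompt (including any special tokens) for you. A minimal sketch of the expected prompt string, assuming a Gemma-style template (check your model card for the exact format):

```python
# Sketch of the chat prompt this thread builds by hand (an assumption;
# verify against your model card). Note the trailing newline after
# "<start_of_turn>model" -- the prompt in the question omits it.
prompt = (
    "<start_of_turn>user\n"
    "Write a short poem on the sun<end_of_turn>\n"
    "<start_of_turn>model\n"
)

# With a real tokenizer, the safer route is to let the chat template
# construct this string (and the special tokens) for you:
#   messages = [{"role": "user", "content": "Write a short poem on the sun"}]
#   input_ids = tokenizer.apply_chat_template(
#       messages, add_generation_prompt=True, return_tensors="pt"
#   )
#   outputs = model.generate(input_ids, max_new_tokens=556, do_sample=True)

print(prompt)
```

If the output still loops, passing `repetition_penalty=1.2` (or similar) to `model.generate()` can damp the kind of repetition shown above.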
Michelangiolo changed discussion status to closed