Palmyra (Writer license) collection: Palmyra LLMs released under the Writer open model license (https://writer.com/legal/open-model-license/).
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="Writer/Palmyra-56B-Instruct")
messages = [
{"role": "user", "content": "Who are you?"},
]
pipe(messages)

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("Writer/Palmyra-56B-Instruct")
model = AutoModelForCausalLM.from_pretrained("Writer/Palmyra-56B-Instruct")
messages = [
{"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
messages,
add_generation_prompt=True,
tokenize=True,
return_dict=True,
return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))

Palmyra-56B-Instruct is a model built by Writer specifically to meet the need for a stronger reasoning model.
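The slice `outputs[0][inputs["input_ids"].shape[-1]:]` works because `generate` returns the prompt tokens followed by the newly generated continuation. A minimal sketch of that idiom with made-up token IDs (no model or download required; all IDs here are hypothetical):

```python
# Hypothetical token IDs standing in for a tokenized prompt.
prompt_ids = [101, 2040, 2003, 102]

# generate() returns prompt + continuation; fake a "full output" the same way.
generated = [2023, 2003, 1996, 2047, 3793]
full_output = prompt_ids + generated

# Slicing at the prompt length keeps only the new tokens,
# mirroring outputs[0][inputs["input_ids"].shape[-1]:].
new_tokens = full_output[len(prompt_ids):]
print(new_tokens)  # → [2023, 2003, 1996, 2047, 3793]
```

Without the slice, `tokenizer.decode` would echo the user prompt back along with the model's reply.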
[More Information Needed]
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
Use the code below to get started with the model.
[More Information Needed]
# Gated model: log in with an HF token that has gated-access permission
hf auth login
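For non-interactive environments, a common alternative to `hf auth login` is exporting a token and passing it through to `from_pretrained`, which accepts a `token` argument. A sketch under that assumption (the `HF_TOKEN` value is a placeholder, and the actual model load is commented out because it would download the gated weights):

```python
import os

# Read a token exported beforehand, e.g. `export HF_TOKEN=hf_...`;
# if unset, rely on credentials cached by `hf auth login` instead.
token = os.environ.get("HF_TOKEN")
load_kwargs = {"token": token} if token else {}

# Commented out: downloads the gated model, so it needs network access
# and an account with permission for the repo.
# model = AutoModelForCausalLM.from_pretrained(
#     "Writer/Palmyra-56B-Instruct", **load_kwargs
# )
```

Either path results in the same authenticated request when the weights are fetched.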