iRONNIE Collection (8 items): "Understanding Irony through Explanations and Background Knowledge" and "Exploring the Transfer of Irony Explanation Generation from English to Dutch"
This model is a fine-tuned version of meta-llama/Meta-Llama-3-70B-Instruct.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the fine-tuned adapter weights on top of it
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")
model = PeftModel.from_pretrained(base_model, "Amala3/IronyExplainer_mixed")
```
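Once the adapter is loaded, the model can be prompted like any Llama 3 Instruct model. A minimal generation sketch (note: the chat-style prompt, the example tweet, and the generation settings below are illustrative assumptions, not documented details of this model's training setup; the model is gated, so loading it requires accepted access and a valid token):

```python
from transformers import AutoTokenizer

# The adapter ships no tokenizer of its own; the base model's tokenizer is assumed
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")

# Hypothetical ironic tweet; the exact prompt format used during fine-tuning
# is not documented in this card
messages = [
    {"role": "user", "content": "Explain the irony in: 'Great, another Monday!'"}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate an explanation and decode only the newly produced tokens
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```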
The following hyperparameters were used during training:
Base model: meta-llama/Meta-Llama-3-70B
```shell
# Gated model: log in with a HF token that has gated-access permission
hf auth login
```