This model is a fine-tuned version of meta-llama/Meta-Llama-3.1-8B on the web_nlg dataset.

How to use ZhanQU/conversion_overfit with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the fine-tuned adapter weights on top of it.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B")
model = PeftModel.from_pretrained(base_model, "ZhanQU/conversion_overfit")
```
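Once the adapter is attached, the model can be used for generation like any causal LM. A minimal inference sketch follows; the triple-linearization format in `triples_to_prompt` is an assumption for illustration (WebNLG provides subject–predicate–object triples, but the exact prompt format used during fine-tuning is not documented here), so match it to the actual training format before use.

```python
def triples_to_prompt(triples):
    """Linearize (subject, predicate, object) triples into a prompt string.

    NOTE: this prompt format is a hypothetical example; use whatever
    linearization the adapter was actually trained with.
    """
    lines = [f"{s} | {p} | {o}" for s, p, o in triples]
    return "Convert the following triples to text:\n" + "\n".join(lines) + "\nText:"


if __name__ == "__main__":
    # Imports are kept inside the entry point so the helper above can be
    # used without transformers/peft installed.
    from peft import PeftModel
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B")
    base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B")
    model = PeftModel.from_pretrained(base_model, "ZhanQU/conversion_overfit")

    prompt = triples_to_prompt([("Alan_Bean", "occupation", "Test_pilot")])
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Loading the 8B base model requires substantial memory; for constrained hardware, consider passing a quantization config to `from_pretrained`.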
## Model description

More information needed

## Intended uses and limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure
The following hyperparameters were used during training: