
## Load the tokenizer, model, and data collator

```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration, DataCollatorWithPadding

MODEL_NAME = "google/flan-t5-base"
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

tokenizer = T5Tokenizer.from_pretrained(MODEL_NAME)
if device.type == 'cuda':
    model = T5ForConditionalGeneration.from_pretrained(MODEL_NAME, device_map="auto")
else:
    model = T5ForConditionalGeneration.from_pretrained(MODEL_NAME)

# data_collator = DataCollatorForSeq2Seq(tokenizer=tokenizer, model=model)
data_collator = DataCollatorWithPadding(tokenizer=tokenizer)  # pads each batch to the length of its longest input
```
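To illustrate what the collator does, here is a minimal sketch of dynamic padding in plain Python. The `pad_batch` helper is an illustrative stand-in written for this card, not the Hugging Face class itself: each batch is padded to the length of its longest sequence, and an attention mask marks the real tokens.

```python
# Illustrative stand-in for DataCollatorWithPadding (hypothetical helper,
# not the HF implementation): pad every sequence in a batch to the length
# of the longest one and build the matching attention mask.
def pad_batch(batch, pad_id=0):
    max_len = max(len(seq) for seq in batch)
    input_ids = [seq + [pad_id] * (max_len - len(seq)) for seq in batch]
    attention_mask = [[1] * len(seq) + [0] * (max_len - len(seq)) for seq in batch]
    return {"input_ids": input_ids, "attention_mask": attention_mask}

padded = pad_batch([[5, 9, 2], [7, 1], [3, 4, 8, 6]])
# every row now has length 4, the longest sequence in this batch
```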

## Define the LoRA configuration

```python
from peft import LoraConfig, TaskType, get_peft_model

lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=16,
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["q", "v"],  # attention query and value projections
)

model = get_peft_model(model, lora_config)
```
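The configuration above can be read through the LoRA update rule itself: the frozen weight `W` is augmented by a low-rank product `B @ A` scaled by `alpha / r`. A minimal NumPy sketch, using the same `r=16` and `lora_alpha=32` as the config (the dimension `d=64` is an arbitrary illustration, not the model's actual hidden size):

```python
import numpy as np

# Sketch of the LoRA forward pass: output = x W^T + (alpha / r) * x A^T B^T.
# d is an arbitrary toy dimension; r and alpha mirror the LoraConfig above.
d, r, alpha = 64, 16, 32
rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection, zero-initialized

def lora_forward(x):
    return x @ W.T + (x @ A.T @ B.T) * (alpha / r)

x = rng.standard_normal((1, d))
# with B zero-initialized, the adapted output equals the frozen output
assert np.allclose(lora_forward(x), x @ W.T)
```

Because `B` starts at zero, training begins exactly at the pretrained model's behavior; only `A` and `B` (2 * d * r parameters per adapted matrix) are updated.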

| Domain | F1-score (%) |
|---|---|
| Laptop | 78.285714 |
| Restaurant | 77.762846 |
| book | 54.166667 |
| beauty | 51.600000 |
| toy | 51.600000 |
| pet | 46.000000 |
| grocery | 44.000000 |
| fashion | 43.700000 |
| home | 41.583333 |
| electronics | 41.176471 |
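For reference, a minimal sketch of how a per-domain F1 score can be computed from predicted versus gold aspect terms. The card does not state its exact averaging scheme, so this set-based version is an assumption for illustration only:

```python
# Hedged illustration (assumed metric definition): F1 over sets of
# predicted vs. gold aspect terms for one example.
def f1(pred, gold):
    tp = len(set(pred) & set(gold))           # true positives
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

score = f1(["battery", "screen"], ["battery", "keyboard"])  # → 0.5
```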

![ABSA restaurant-domain results](absa_restaurant_domain.png)

*To be evaluated for confirmation.*
