How to use IIC/mdeberta-v3-base-livingner1 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="IIC/mdeberta-v3-base-livingner1")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("IIC/mdeberta-v3-base-livingner1")
model = AutoModelForTokenClassification.from_pretrained("IIC/mdeberta-v3-base-livingner1")
```

This model is a fine-tuned version of mdeberta-v3-base for the livingner1 dataset, used in a benchmark in the paper TODO. The model achieves an F1 score of 0.953.
For more information, please refer to the original publication: TODO LINK
The model was fine-tuned with the following hyperparameters:

| Parameter | Value |
|---|---|
| batch size | 16 |
| learning rate | 4e-05 |
| classifier dropout | 0.1 |
| warmup ratio | 0 |
| warmup steps | 0 |
| weight decay | 0 |
| optimizer | AdamW |
| epochs | 10 |
| early stopping patience | 3 |
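As a rough illustration, the hyperparameters above could be mapped onto a `transformers` `TrainingArguments` configuration as sketched below. This is an assumption-laden sketch, not the authors' actual training script: it presumes standard `Trainer`-based fine-tuning, and the `output_dir` name is hypothetical. Note that the classifier dropout is a model-config setting rather than a training argument, and AdamW is already the `Trainer` default optimizer.

```python
from transformers import AutoConfig, TrainingArguments

# Hypothetical sketch mapping the reported hyperparameters onto
# Hugging Face Trainer configuration; not the authors' actual setup.

# Classifier dropout (0.1) belongs in the model config, not TrainingArguments.
config = AutoConfig.from_pretrained(
    "microsoft/mdeberta-v3-base",
    cls_dropout=0.1,
)

training_args = TrainingArguments(
    output_dir="mdeberta-v3-base-livingner1",  # hypothetical name
    per_device_train_batch_size=16,
    learning_rate=4e-5,
    warmup_ratio=0.0,
    warmup_steps=0,
    weight_decay=0.0,
    num_train_epochs=10,
    # Early stopping (patience 3) would be realized via EarlyStoppingCallback,
    # which requires per-epoch evaluation and load_best_model_at_end=True:
    eval_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="f1",
)
```

With this configuration, an `EarlyStoppingCallback(early_stopping_patience=3)` would be passed to the `Trainer` alongside the model, tokenizer, and datasets.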
TODO