Dataset: rajpurkar/squad_v2
How to use primasr/malaybert-for-eqa-finetuned with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("question-answering", model="primasr/malaybert-for-eqa-finetuned")

# Load model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering
tokenizer = AutoTokenizer.from_pretrained("primasr/malaybert-for-eqa-finetuned")
model = AutoModelForQuestionAnswering.from_pretrained("primasr/malaybert-for-eqa-finetuned")

This model is an experiment my friend and I carried out during a research internship at the National University of Singapore (NUS). We fine-tuned the model on our own datasets in the Finance and Healthcare domains, in the Malay language.

Fine-tuning used the following training arguments:
from transformers import TrainingArguments
training_args = TrainingArguments(
output_dir='test_trainer',
evaluation_strategy='epoch',
num_train_epochs=20,
optim='adamw_torch',
report_to='all',
logging_steps=1,
)
# Run inference with the fine-tuned model
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "primasr/malaybert-for-eqa-finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
nlp = pipeline("question-answering", model=model, tokenizer=tokenizer)
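You can then call `nlp(question=..., context=...)` to get an answer span. Under the hood, an extractive QA model like this one produces start and end logits over the context tokens, and the pipeline picks the highest-scoring valid span. A minimal sketch of that post-processing, using made-up tokens and logits rather than real model output, looks like this:

```python
import numpy as np

# Toy tokenized Malay context and invented logits (stand-ins for model output)
tokens = ["Kuala", "Lumpur", "ialah", "ibu", "negara", "Malaysia"]
start_logits = np.array([0.1, 0.2, 0.1, 3.5, 0.3, 0.2])
end_logits = np.array([0.1, 0.2, 0.1, 0.2, 3.0, 4.1])

# Pick the best start, then the best end at or after it (start <= end)
start = int(np.argmax(start_logits))
end = int(np.argmax(end_logits[start:])) + start

answer = " ".join(tokens[start:end + 1])
print(answer)  # -> ibu negara Malaysia
```

In practice the pipeline handles all of this (plus tokenization, offset mapping back to the original text, and score normalization) for you, so you only need the `nlp(...)` call above.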