How to use SRDdev/QABERT-small with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="SRDdev/QABERT-small")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("SRDdev/QABERT-small")
model = AutoModelForQuestionAnswering.from_pretrained("SRDdev/QABERT-small")
```
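For inference, the pipeline takes a question and a context passage and returns the answer span. A minimal sketch (the question and context strings here are illustrative, not from the model card):

```python
from transformers import pipeline

pipe = pipeline("question-answering", model="SRDdev/QABERT-small")

# The pipeline extracts the answer span from the given context.
result = pipe(
    question="What task is QABERT-small fine-tuned for?",
    context="QABERT-small is a compact BERT model fine-tuned for extractive question answering.",
)
print(result["answer"], result["score"])
```

The result is a dict with `answer`, `score`, `start`, and `end` keys, where `start`/`end` are character offsets into the context.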
When Training, how many iterations did you train it for?
Hey @kev703in, at the time I had a very old machine and wasn't aware of free GPUs, so I trained it for 3 epochs. But if you are using a GPU from Kaggle or Colab, you can train it for 5-10 epochs, depending on your choice.
Here is a helpful notebook: notebook
Thank you! Very helpful.