How to use jasoneden/bloom560m-squad-helloworld with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="jasoneden/bloom560m-squad-helloworld")

# Load the model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("jasoneden/bloom560m-squad-helloworld")
model = AutoModelForQuestionAnswering.from_pretrained("jasoneden/bloom560m-squad-helloworld")

This model is a fine-tuned version of bigscience/bloom-560m on the squad_v2 dataset. It is intended as a proof of concept, and perhaps as a starting point for others trying to do the same thing.
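As background (this is not code from the repository): extractive question-answering models of this kind produce a start logit and an end logit per token, and the answer is the span maximizing their sum. A minimal, self-contained sketch of that span selection, with a hypothetical `pick_span` helper and toy logits:

```python
def pick_span(start_logits, end_logits, max_len=15):
    """Return (start, end) token indices maximizing start + end score,
    subject to start <= end and a maximum span length."""
    best, best_score = (0, 0), float("-inf")
    for i, s in enumerate(start_logits):
        # only consider ends within max_len tokens of the start
        for j in range(i, min(i + max_len, len(end_logits))):
            score = s + end_logits[j]
            if score > best_score:
                best_score, best = score, (i, j)
    return best

# Toy logits over five tokens: token 1 is the likeliest start,
# token 2 the likeliest end, so the chosen span is (1, 2).
start = [0.1, 2.0, 0.3, 0.2, 0.1]
end = [0.0, 0.5, 1.8, 0.4, 0.2]
print(pick_span(start, end))  # (1, 2)
```

In practice you would not do this by hand: calling `pipe(question=..., context=...)` on the pipeline above performs the tokenization, span selection, and answer decoding for you.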
Ongoing discussion surrounding this effort:
https://huggingface.co/bigscience/bloom/discussions/46#633c57b2ccce04161f82e6c2
Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed
The following hyperparameters were used during training:
Base model: bigscience/bloom-560m