```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("Antonio49/ModeloCanal")
model = AutoModelForQuestionAnswering.from_pretrained("Antonio49/ModeloCanal")
```

This model card aims to be a base template for new models. It has been generated using this raw template.
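A question-answering head like the one loaded above emits start and end logits over the input tokens, and the answer span is recovered by picking the best start/end positions. A toy sketch of that post-processing step, using made-up logits rather than real model output (no download required):

```python
import numpy as np

# Hypothetical start/end logits for a 6-token input, as a QA head would produce.
start_logits = np.array([0.1, 0.2, 5.0, 0.3, 0.1, 0.0])
end_logits = np.array([0.0, 0.1, 0.2, 4.0, 6.0, 0.1])

# Pick the most likely start token, then the most likely end token at or after it.
start = int(np.argmax(start_logits))
end = int(np.argmax(end_logits[start:])) + start
answer_span = (start, end)  # token indices of the predicted answer
```

A real post-processing pass would also score all valid (start, end) pairs and map token indices back to character offsets; this sketch keeps only the core argmax idea.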
Users (both direct and downstream) should be made aware of the model's risks, biases, and limitations. More information is needed for further recommendations.
Use the code below to get started with the model.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="Antonio49/ModeloCanal")
```
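The question-answering pipeline is called with a `question` and a `context`, and returns a dict with `answer`, `score`, `start`, and `end` keys. A minimal sketch of that interface, using a stand-in function with a toy heuristic instead of the downloaded model so it runs offline:

```python
def fake_qa_pipeline(question: str, context: str) -> dict:
    """Stand-in mimicking the output schema of a question-answering pipeline."""
    answer = context.split(".")[0]  # toy heuristic: take the first sentence
    start = context.find(answer)
    return {
        "answer": answer,             # extracted answer text
        "score": 0.99,                # confidence (placeholder value)
        "start": start,               # character offset where the span begins
        "end": start + len(answer),   # character offset where the span ends
    }

result = fake_qa_pipeline(
    question="What is this text about?",
    context="This is a placeholder context. It stands in for real input text.",
)
```

With the real `pipe` above, the call shape is the same: `pipe(question=..., context=...)`.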