## How to use with the Transformers library

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="Den4ikAI/rubert-large-squad")

# Or load the model (with its QA head) directly
from transformers import AutoModelForQuestionAnswering

model = AutoModelForQuestionAnswering.from_pretrained("Den4ikAI/rubert-large-squad")
```
Training details:

- Base model: fine-tuned ruBERT from sberbank-ai/ruBert-base.
- Batch size: 4.
- Epochs: 2.
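The hyperparameters above can be expressed as a `Trainer` configuration. This is a hedged sketch, not the author's actual training script: the output directory, dataset, and tokenization step are placeholders, and only the batch size and epoch count come from the card.

```python
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Base checkpoint as stated in the card
tokenizer = AutoTokenizer.from_pretrained("sberbank-ai/ruBert-base")
model = AutoModelForQuestionAnswering.from_pretrained("sberbank-ai/ruBert-base")

# Hyperparameters from the card: batch size 4, 2 epochs.
# "output/" is a placeholder path, not from the original card.
args = TrainingArguments(
    output_dir="output",
    per_device_train_batch_size=4,
    num_train_epochs=2,
)

# train_dataset is assumed to be a SQuAD-style dataset already tokenized
# into input_ids / start_positions / end_positions (preprocessing omitted).
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```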

```python
from transformers import pipeline

qa_pipeline = pipeline(
    "question-answering",
    model="Den4ikAI/rubert-large-squad",
    tokenizer="Den4ikAI/rubert-large-squad",
)
predictions = qa_pipeline({
    'context': "Пушкин родился 6 июля 1799 года",
    'question': "Когда родился Пушкин?"
})
print(predictions)
# output:
# {'score': 0.9613797664642334, 'start': 15, 'end': 31, 'answer': '6 июля 1799 года'}
```
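In the pipeline output, `start` and `end` are character offsets into the original `context` string, so the answer text can always be recovered by slicing. A small illustration using the example output above (no model download needed):

```python
# Example context and the offsets from the pipeline output above
context = "Пушкин родился 6 июля 1799 года"
start, end = 15, 31

# 'answer' is simply the character span context[start:end]
answer = context[start:end]
print(answer)  # 6 июля 1799 года
```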