Dataset: miscovery/arabic_egypt_english_world_facts
How to use miscovery/model with Transformers:
# Use a pipeline as a high-level helper
# Warning: Pipeline type "summarization" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
# pip install "transformers<5.0.0"
from transformers import pipeline
pipe = pipeline("summarization", model="miscovery/model")
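The version caveat in the comments above can be expressed as a small guard before choosing the pipeline path. This is a minimal sketch: it assumes only that the cutoff is major version 5, and the helper name is ours, not part of transformers.

```python
def summarization_pipeline_supported(version: str) -> bool:
    """Return True when the given transformers version string is below major version 5,
    i.e. when the "summarization" pipeline is still available per the warning above."""
    major = int(version.split(".")[0])
    return major < 5
```

In practice you would pass `transformers.__version__` to this check and fall back to loading the model directly when it returns False.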
# Load model directly
from transformers import AutoModel
model = AutoModel.from_pretrained("miscovery/model", dtype="auto")

This model is a transformer-based encoder-decoder model for multiple NLP tasks:
The model was trained in two stages:
How to use miscovery/model with the miscovery-model package:
pip install miscovery-model
from miscovery_model import standard_pipeline
# Create a pipeline
model = standard_pipeline("miscovery/model")
# Use it
result = model("Translate this to Arabic: What year did World War I begin?")
print(result)
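The call above selects the task through a natural-language prefix in the input string. A small helper can make that pattern explicit. This is a sketch only: the helper and the task-to-prefix table are our own illustration, not part of the miscovery_model package, and only the translation prefix comes from the example above.

```python
# Hypothetical helper (not part of miscovery_model): builds a prefixed
# prompt in the style of "Translate this to Arabic: ..." shown above.
TASK_PREFIXES = {
    # This prefix is taken verbatim from the usage example above.
    "translate_to_arabic": "Translate this to Arabic: ",
}

def build_prompt(task: str, text: str) -> str:
    """Prepend the instruction prefix for `task` to the input text."""
    if task not in TASK_PREFIXES:
        raise ValueError(f"Unknown task: {task!r}")
    return TASK_PREFIXES[task] + text
```

The prefixed string would then be passed to the pipeline, e.g. `model(build_prompt("translate_to_arabic", "What year did World War I begin?"))`.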
This model was trained on specific datasets (Arabic, Egyptian Arabic, and English world-facts data) and may not generalize well to all domains.