```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("transformer3/check-model")
model = AutoModelForSeq2SeqLM.from_pretrained("transformer3/check-model")
```
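With the tokenizer and model loaded as above, producing a summary is a tokenize → generate → decode round trip. A minimal sketch (the `summarize` helper and generation settings are illustrative, not part of this card; running it downloads the model weights):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def summarize(text, model_name="transformer3/check-model", max_new_tokens=60):
    """Illustrative helper: tokenize the input, generate, decode the summary."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```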
# Model Trained Using AutoTrain
- Problem type: Summarization
- Model ID: 42255108399
- CO2 Emissions (in grams): 4.5448
## Validation Metrics
- Loss: 2.320
- Rouge1: 23.126
- Rouge2: 13.707
- RougeL: 20.848
- RougeLsum: 20.928
- Gen Len: 19.987
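The ROUGE scores above measure n-gram overlap between generated and reference summaries (ROUGE-1 for unigrams, ROUGE-2 for bigrams, ROUGE-L for longest common subsequence). A minimal sketch of the ROUGE-1 F1 idea, with clipped unigram counts (this is not the exact scorer AutoTrain uses):

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """Unigram-overlap F1: clipped count intersection over candidate/reference lengths."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```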
## Usage
You can use cURL to access this model:

```bash
curl -X POST \
  -H "Authorization: Bearer YOUR_HUGGINGFACE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "I love AutoTrain"}' \
  https://api-inference.huggingface.co/models/transformer3/autotrain-finetune-42255108399
```
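The same call can be made from Python. A minimal sketch that only assembles the request (the Inference API endpoint is `https://api-inference.huggingface.co/models/<repo>`; `build_request` is an illustrative helper, and you need a real API key to actually send it):

```python
import json

# Inference API endpoint for this repo (note the /models/ path segment)
API_URL = "https://api-inference.huggingface.co/models/transformer3/autotrain-finetune-42255108399"

def build_request(text, api_key):
    """Assemble the headers and JSON body matching the cURL call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = json.dumps({"inputs": text})
    return headers, payload

headers, payload = build_request("I love AutoTrain", "YOUR_HUGGINGFACE_API_KEY")
```

To send it, one option is the `requests` library: `requests.post(API_URL, headers=headers, data=payload)`.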
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("summarization", model="transformer3/check-model")
# pipe("Some long article text...")[0]["summary_text"]
```