Instructions to use PoseyATX/Humiliated_Dolphin with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use PoseyATX/Humiliated_Dolphin with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "summarization" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="PoseyATX/Humiliated_Dolphin")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("PoseyATX/Humiliated_Dolphin")
model = AutoModelForSeq2SeqLM.from_pretrained("PoseyATX/Humiliated_Dolphin")
```

- Notebooks
- Google Colab
- Kaggle
Update config.json
config.json (+1 −1):

```diff
@@ -38,7 +38,7 @@
     "LABEL_2": 2
   },
   "length_penalty": 0.6,
-  "max_length":
+  "max_length": 1024,
   "max_position_embeddings": 1024,
   "min_length": 32,
   "model_type": "pegasus",
```
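The removed line left `max_length` with no value, so the parser consumed the next line's key as its value and failed on the following colon; since config.json must be valid JSON, the file could not be loaded at all before this commit. A minimal sketch using Python's standard-library `json` module illustrates the failure (the key/value fragments are taken from the diff above; the surrounding braces are added here just to make the fragments parseable):

```python
import json

# Fragment as it read before the fix: "max_length" has no value.
broken = '''{
  "length_penalty": 0.6,
  "max_length":
  "max_position_embeddings": 1024
}'''

# Fragment as it reads after the fix.
fixed = '''{
  "length_penalty": 0.6,
  "max_length": 1024,
  "max_position_embeddings": 1024
}'''

def parses(text):
    """Return True if `text` is valid JSON, False otherwise."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

print(parses(broken))  # False
print(parses(fixed))   # True
```

With the fix applied, `from_pretrained` can read the config again, and generation is capped at 1024 tokens, matching `max_position_embeddings`.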