Summarization

Tags: Transformers · PyTorch · TensorBoard · English · bart · text2text-generation · Generated from Trainer
Instructions to use Quake24/easyTermsSummerizer with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Quake24/easyTermsSummerizer with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "summarization" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
# pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="Quake24/easyTermsSummerizer")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Quake24/easyTermsSummerizer")
model = AutoModelForSeq2SeqLM.from_pretrained("Quake24/easyTermsSummerizer")
```

- Notebooks
- Google Colab
- Kaggle
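A bart-based checkpoint like this one inherits BART's 1024-token input limit, so long terms-of-service texts generally need to be split before summarization. A minimal sketch of one way to do that; the `chunk_text` helper and the roughly-4-characters-per-token budget are assumptions for illustration, not part of the model card:

```python
# Sketch: split a long document into pieces small enough for a BART-style
# summarizer. Assumption: roughly 4 characters per token, so a 1024-token
# window is approximated by a 4096-character budget. Splitting on ". " is
# a crude sentence-boundary heuristic.

def chunk_text(text: str, max_chars: int = 4096) -> list[str]:
    """Greedily pack sentence-like pieces into chunks under max_chars."""
    chunks, current = [], ""
    for sentence in text.split(". "):
        if current and len(current) + len(sentence) + 2 > max_chars:
            chunks.append(current)
            current = sentence
        else:
            current = f"{current}. {sentence}" if current else sentence
    if current:
        chunks.append(current)
    return chunks
```

Each chunk could then be summarized individually, e.g. `summaries = [pipe(c, truncation=True)[0]["summary_text"] for c in chunks]`, and the partial summaries joined or summarized again.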
Adding Evaluation Results #3
by leaderboard-pr-bot · opened
README.md CHANGED

```diff
@@ -68,4 +68,17 @@ The following hyperparameters were used during training:
 - Transformers 4.27.3
 - Pytorch 1.13.0
 - Datasets 2.1.0
 - Tokenizers 0.13.2
+# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Quake24__easyTermsSummerizer)
+
+| Metric              | Value |
+|---------------------|-------|
+| Avg.                | 24.73 |
+| ARC (25-shot)       | 25.77 |
+| HellaSwag (10-shot) | 25.81 |
+| MMLU (5-shot)       | 23.12 |
+| TruthfulQA (0-shot) | 47.69 |
+| Winogrande (5-shot) | 50.75 |
+| GSM8K (5-shot)      | 0.0   |
+| DROP (3-shot)       | 0.01  |
```