---
base_model: google/pegasus-large
tags:
- generated_from_trainer
metrics:
- rouge
- bleu
model-index:
- name: Physical_Principal_PegasusLargeModel
  results: []
---

# Physical_Principal_PegasusLargeModel

This model is a fine-tuned version of [google/pegasus-large](https://huggingface.co/google/pegasus-large) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch for recomputing these metrics follows the list):
- Loss: 5.5512
- ROUGE-1: 42.7556
- ROUGE-2: 11.8875
- ROUGE-L: 28.895
- ROUGE-Lsum: 39.3949
- BERTScore Precision: 78.8849
- BERTScore Recall: 81.4458
- BERTScore F1: 80.1383
- BLEU: 0.0723
- Gen Len (average generated length, in tokens): 195.6218
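
These scores can be recomputed from decoded predictions with the Hugging Face `evaluate` library. The sketch below is an assumption about the metric setup (the exact evaluation script is not part of this card); `preds` and `refs` are placeholders for lists of generated and reference summaries:

```python
import evaluate

preds = ["generated summary ..."]  # placeholder: decoded model outputs
refs = ["reference summary ..."]   # placeholder: gold reference summaries

# ROUGE and BLEU. `evaluate` reports both on a 0-1 scale; the ROUGE
# values in this card appear to be scaled to 0-100.
rouge = evaluate.load("rouge").compute(predictions=preds, references=refs)
bleu = evaluate.load("bleu").compute(predictions=preds, references=refs)

# BERTScore returns per-example lists; the card reports their means (x100).
bertscore = evaluate.load("bertscore").compute(
    predictions=preds, references=refs, lang="en"  # assumption: English data
)
mean_f1 = sum(bertscore["f1"]) / len(bertscore["f1"])

print(rouge["rouge1"], bleu["bleu"], mean_f1)
```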

## Model description

More information needed

## Intended uses & limitations

More information needed
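
Pending fuller documentation, the checkpoint can be exercised like any Pegasus summarization model. A minimal inference sketch follows, assuming the model is published on the Hub (`your-username/Physical_Principal_PegasusLargeModel` is a placeholder id, and the beam size is an assumption, since decoding settings are not recorded in this card):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "your-username/Physical_Principal_PegasusLargeModel"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Text of the document to summarize ..."
inputs = tokenizer(text, truncation=True, return_tensors="pt")
summary_ids = model.generate(
    **inputs,
    max_new_tokens=256,  # eval outputs averaged ~196 tokens (see Gen Len)
    num_beams=4,         # assumption; not recorded in this card
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```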

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
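
As a reproducibility aid, the settings above map onto `Seq2SeqTrainingArguments` roughly as follows. This is a sketch, not the original training script; the output directory, `predict_with_generate`, and anything not listed above are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="Physical_Principal_PegasusLargeModel",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,  # effective train batch size: 1 * 16 = 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the optimizer
    # defaults, so no explicit adam_* arguments are needed.
    predict_with_generate=True,  # assumption: required for ROUGE/BLEU eval
)
```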

### Training results

| Training Loss | Epoch  | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | BERTScore Precision | BERTScore Recall | BERTScore F1 | BLEU   | Gen Len  |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|:-------------------:|:----------------:|:------------:|:------:|:--------:|
| 6.8945        | 0.0620 | 100  | 6.5742          | 31.5625 | 6.7644  | 21.3466 | 29.564    | 75.9351             | 79.1872          | 77.5195      | 0.0377 | 195.6218 |
| 6.3686        | 0.1239 | 200  | 6.2216          | 36.3235 | 9.5662  | 25.3803 | 33.4451   | 76.783              | 80.08            | 78.3873      | 0.0551 | 195.6218 |
| 6.1776        | 0.1859 | 300  | 6.0881          | 38.1424 | 10.5393 | 26.1916 | 35.0327   | 77.3676             | 80.4951          | 78.892       | 0.0625 | 195.6218 |
| 6.1663        | 0.2478 | 400  | 5.9817          | 39.7408 | 10.7982 | 26.9356 | 36.7781   | 77.9473             | 80.6841          | 79.2852      | 0.0636 | 195.6218 |
| 6.0978        | 0.3098 | 500  | 5.8917          | 39.3921 | 10.6543 | 26.9539 | 36.4703   | 77.9172             | 80.7242          | 79.289       | 0.0631 | 195.6218 |
| 5.9824        | 0.3717 | 600  | 5.8200          | 42.4464 | 11.3324 | 27.8739 | 39.2401   | 78.5855             | 81.0563          | 79.7957      | 0.0669 | 195.6218 |
| 5.9387        | 0.4337 | 700  | 5.7582          | 41.98   | 11.3435 | 28.0672 | 38.7014   | 78.4184             | 81.1429          | 79.7509      | 0.0688 | 195.6218 |
| 5.8692        | 0.4957 | 800  | 5.7002          | 41.8091 | 11.4629 | 28.0863 | 38.4041   | 78.2676             | 81.1789          | 79.6896      | 0.0691 | 195.6218 |
| 5.8287        | 0.5576 | 900  | 5.6638          | 42.0986 | 11.4605 | 28.3743 | 38.792    | 78.3915             | 81.233           | 79.7799      | 0.0691 | 195.6218 |
| 5.8113        | 0.6196 | 1000 | 5.6285          | 41.7907 | 11.4896 | 28.4144 | 38.688    | 78.6019             | 81.2325          | 79.889       | 0.0691 | 195.6218 |
| 5.788         | 0.6815 | 1100 | 5.6124          | 42.7557 | 11.8057 | 28.7632 | 39.4786   | 78.7952             | 81.3582          | 80.0499      | 0.0709 | 195.6218 |
| 5.7594        | 0.7435 | 1200 | 5.5892          | 42.8952 | 11.8442 | 28.8255 | 39.5519   | 78.7779             | 81.3894          | 80.0556      | 0.0716 | 195.6218 |
| 5.7829        | 0.8055 | 1300 | 5.5713          | 42.9309 | 11.8742 | 28.8596 | 39.5816   | 78.8398             | 81.4053          | 80.0955      | 0.0719 | 195.6218 |
| 5.7359        | 0.8674 | 1400 | 5.5603          | 42.5415 | 11.7179 | 28.8073 | 39.2871   | 78.8019             | 81.37            | 80.0586      | 0.0710 | 195.6218 |
| 5.7216        | 0.9294 | 1500 | 5.5546          | 43.1987 | 12.0116 | 29.0028 | 39.8203   | 78.9462             | 81.474           | 80.1838      | 0.0728 | 195.6218 |
| 5.6968        | 0.9913 | 1600 | 5.5512          | 42.7556 | 11.8875 | 28.895  | 39.3949   | 78.8849             | 81.4458          | 80.1383      | 0.0723 | 195.6218 |


### Framework versions

- Transformers 4.41.2
- PyTorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1