---
library_name: transformers
license: mit
base_model: microsoft/DialoGPT-small
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dialochess-v4
  results: []
---

# dialochess-v4

This model is a fine-tuned version of [microsoft/DialoGPT-small](https://huggingface.co/microsoft/DialoGPT-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8124
- Accuracy: 0.0004
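
For reference, if the reported loss is mean token cross-entropy in nats (the usual Trainer convention for causal language models, an assumption here), the final evaluation loss corresponds to a perplexity of roughly 2.25:

```python
import math

# Perplexity implied by the final eval loss, assuming the loss is
# mean token cross-entropy in nats (Trainer convention; an assumption).
print(math.exp(0.8124))  # ≈ 2.253
```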

## Model description

More information needed

## Intended uses & limitations

More information needed
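
Pending documentation, a minimal inference sketch follows. It assumes the checkpoint is a standard causal LM published on the Hub (the repo id `your-username/dialochess-v4` is hypothetical), and the chess-notation prompt is only a guess based on the model name, since the training data is undocumented.

```python
# Minimal inference sketch. Assumptions: the checkpoint behaves like its
# DialoGPT-small (GPT-2) base, and the repo id below is hypothetical.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/dialochess-v4"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)
model.eval()

prompt = "1. e4 e5 2. Nf3"  # assumed prompt format; match the actual training data
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=16,
        do_sample=True,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 tokenizers have no pad token
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```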

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (fused PyTorch implementation, `adamw_torch_fused`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 6
- mixed_precision_training: Native AMP
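
As a rough guide, these settings map onto `transformers.TrainingArguments` as sketched below. The `output_dir` is hypothetical, the evaluation cadence (every 200 steps) is inferred from the results table, and the model/dataset wiring is omitted because the training data is undocumented.

```python
# Sketch of TrainingArguments matching the hyperparameters above
# (assumption: a standard transformers.Trainer run).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dialochess-v4",      # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",       # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=6,
    fp16=True,                       # "Native AMP" mixed precision
    eval_strategy="steps",           # cadence inferred from the results table
    eval_steps=200,
)
```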

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.3141        | 0.1616 | 200  | 1.1921          | 0.0002   |
| 1.1940        | 0.3231 | 400  | 1.0974          | 0.0003   |
| 1.1180        | 0.4847 | 600  | 1.0425          | 0.0000   |
| 1.0767        | 0.6462 | 800  | 1.0071          | 0.0000   |
| 1.0414        | 0.8078 | 1000 | 0.9825          | 0.0005   |
| 1.0158        | 0.9693 | 1200 | 0.9601          | 0.0002   |
| 1.0035        | 1.1309 | 1400 | 0.9427          | 0.0001   |
| 0.9715        | 1.2924 | 1600 | 0.9300          | 0.0002   |
| 0.9745        | 1.4540 | 1800 | 0.9193          | 0.0002   |
| 0.9447        | 1.6155 | 2000 | 0.9063          | 0.0002   |
| 0.9573        | 1.7771 | 2200 | 0.8980          | 0.0005   |
| 0.9386        | 1.9386 | 2400 | 0.8893          | 0.0003   |
| 0.9204        | 2.1002 | 2600 | 0.8786          | 0.0003   |
| 0.9128        | 2.2617 | 2800 | 0.8732          | 0.0003   |
| 0.9079        | 2.4233 | 3000 | 0.8670          | 0.0002   |
| 0.9073        | 2.5848 | 3200 | 0.8603          | 0.0002   |
| 0.8938        | 2.7464 | 3400 | 0.8532          | 0.0004   |
| 0.8899        | 2.9079 | 3600 | 0.8501          | 0.0002   |
| 0.8834        | 3.0695 | 3800 | 0.8426          | 0.0002   |
| 0.8693        | 3.2310 | 4000 | 0.8416          | 0.0003   |
| 0.8808        | 3.3926 | 4200 | 0.8335          | 0.0002   |
| 0.8720        | 3.5541 | 4400 | 0.8297          | 0.0003   |
| 0.8689        | 3.7157 | 4600 | 0.8296          | 0.0003   |
| 0.8607        | 3.8772 | 4800 | 0.8237          | 0.0002   |
| 0.8516        | 4.0388 | 5000 | 0.8246          | 0.0004   |
| 0.8652        | 4.2003 | 5200 | 0.8210          | 0.0004   |
| 0.8522        | 4.3619 | 5400 | 0.8192          | 0.0003   |
| 0.8466        | 4.5234 | 5600 | 0.8181          | 0.0002   |
| 0.8525        | 4.6850 | 5800 | 0.8163          | 0.0004   |
| 0.8485        | 4.8465 | 6000 | 0.8163          | 0.0004   |
| 0.8444        | 5.0081 | 6200 | 0.8144          | 0.0003   |
| 0.8512        | 5.1696 | 6400 | 0.8141          | 0.0004   |
| 0.8405        | 5.3312 | 6600 | 0.8135          | 0.0004   |
| 0.8337        | 5.4927 | 6800 | 0.8124          | 0.0004   |
| 0.8601        | 5.6543 | 7000 | 0.8125          | 0.0004   |
| 0.8506        | 5.8158 | 7200 | 0.8124          | 0.0004   |
| 0.8562        | 5.9774 | 7400 | 0.8124          | 0.0004   |


### Framework versions

- Transformers 4.57.2
- Pytorch 2.9.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.1