Dc-4nderson committed on
Commit dc4329c · verified · 1 Parent(s): 3638c3f

updated number of classes and dataset name, as well as the eval logs

Files changed (1)
  1. README.md +19 -19
README.md CHANGED
@@ -3,7 +3,7 @@ library_name: transformers
 tags:
 - tone
 datasets:
-- Dc-4nderson/tone_dataset
+- Dc-4nderson/tone_dataset2
 language:
 - en
 metrics:
@@ -16,7 +16,7 @@ pipeline_tag: text-classification
 
 DistilBERT Tone Classification Model
 
-This model fine-tunes distilbert-base-uncased to classify tone into 8 categories relevant to community and mentorship transcripts.
+This model fine-tunes distilbert-base-uncased to classify tone into 7 categories relevant to community and mentorship transcripts.
 
 📌 Labels
 
@@ -49,7 +49,7 @@ Optimizer: AdamW (lr=2e-5)
 
 Batch size: 16
 
-Epochs: 5
+Epochs: 8
 
 Loss: CrossEntropy
 
@@ -57,25 +57,25 @@ Metrics: Accuracy + Weighted F1
 
 📈 Validation Metrics
 Epoch  Training Loss  Validation Loss  Accuracy  F1
-1      No log         1.260710         0.801242  0.784157
-2      No log         0.777540         0.869565  0.869093
-3      No log         0.577972         0.869565  0.868584
-4      No log         0.481008         0.900621  0.900356
-5      No log         0.452635         0.900621  0.900356
-
+1      No log         1.281651         0.782288  0.778880
+2      No log         0.779447         0.845018  0.843397
+3      No log         0.566092         0.859779  0.856186
+4      No log         0.415437         0.892989  0.892445
+5      No log         0.340598         0.915129  0.914765
+6      0.729500       0.307513         0.922509  0.922262
+7      0.729500       0.296827         0.915129  0.915210
+8      0.729500       0.285301         0.922509  0.922262
 
 Final Training Summary:
 
-TrainOutput(
-global_step=205,
-training_loss=0.8436699843988186,
-metrics={'train_runtime': 17.74,
-'train_samples_per_second': 181.229,
-'train_steps_per_second': 11.556,
-'total_flos': 106480165436160.0,
-'train_loss': 0.8436699843988186,
-'epoch': 5.0
-})
+TrainOutput(global_step=704,
+training_loss=0.5666945034807379,
+metrics={'train_runtime': 42.6317,
+'train_samples_per_second': 261.402,
+'train_steps_per_second': 16.514,
+'total_flos': 369087080441856.0,
+'train_loss': 0.5666945034807379,
+'epoch': 8.0})
 
 💻 Usage
 from transformers import pipeline
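The card's Usage section is cut off here after its first line (`from transformers import pipeline`). A minimal sketch of how such a text-classification pipeline is typically consumed: the `top_tone` helper and the example scores are illustrative, and the model's actual Hub path is not shown on this commit page, so it is left as a placeholder.

```python
def top_tone(results):
    """Pick the highest-scoring label from text-classification output.

    `results` is a list of {'label': ..., 'score': ...} dicts, the shape
    a transformers text-classification pipeline returns per input when
    all class scores are requested (top_k=None).
    """
    return max(results, key=lambda r: r["score"])["label"]


# With the real model this would look like:
#   from transformers import pipeline
#   clf = pipeline("text-classification",
#                  model="Dc-4nderson/<model-repo>",  # placeholder path
#                  top_k=None)
#   scores = clf("Thanks so much for walking me through that!")[0]
# Here we exercise the helper on mocked scores (generic LABEL_* names):
fake_scores = [
    {"label": "LABEL_0", "score": 0.08},
    {"label": "LABEL_4", "score": 0.81},
    {"label": "LABEL_2", "score": 0.11},
]
print(top_tone(fake_scores))  # -> LABEL_4
```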