updated number of classes, dataset name, as well as the eval logs
README.md CHANGED

@@ -3,7 +3,7 @@ library_name: transformers
 tags:
 - tone
 datasets:
-- Dc-4nderson/
+- Dc-4nderson/tone_dataset2
 language:
 - en
 metrics:
@@ -16,7 +16,7 @@ pipeline_tag: text-classification
 
 DistilBERT Tone Classification Model
 
-This model fine-tunes distilbert-base-uncased to classify tone into
+This model fine-tunes distilbert-base-uncased to classify tone into 7 categories relevant to community and mentorship transcripts.
 
 Labels
 
@@ -49,7 +49,7 @@ Optimizer: AdamW (lr=2e-5)
 
 Batch size: 16
 
-Epochs:
+Epochs: 8
 
 Loss: CrossEntropy
 
@@ -57,25 +57,25 @@ Metrics: Accuracy + Weighted F1
 
 Validation Metrics
 Epoch Training Loss Validation Loss Accuracy F1
-1
-2
-3
-4
-5
-
+1 No log 1.281651 0.782288 0.778880
+2 No log 0.779447 0.845018 0.843397
+3 No log 0.566092 0.859779 0.856186
+4 No log 0.415437 0.892989 0.892445
+5 No log 0.340598 0.915129 0.914765
+6 0.729500 0.307513 0.922509 0.922262
+7 0.729500 0.296827 0.915129 0.915210
+8 0.729500 0.285301 0.922509 0.922262
 
 Final Training Summary:
 
-TrainOutput(
-
-
-
-'
-'
-'
-'
-'epoch': 5.0
-})
+TrainOutput(global_step=704,
+training_loss=0.5666945034807379,
+metrics={'train_runtime': 42.6317,
+'train_samples_per_second': 261.402,
+'train_steps_per_second': 16.514,
+'total_flos': 369087080441856.0,
+'train_loss': 0.5666945034807379,
+'epoch': 8.0})
 
 Usage
 from transformers import pipeline
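The card's metrics line ("Accuracy + Weighted F1") refers to the F1 score averaged over classes, weighted by each class's support (its count of true examples). As a minimal pure-Python sketch of that definition — the tone labels and predictions below are illustrative toy data, not this model's actual outputs:

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Per-class F1, averaged with weights proportional to class support."""
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for cls, n in support.items():
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != cls and p == cls)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p != cls)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        score += (n / total) * f1  # weight by class support
    return score

# Toy example: "joy" has support 2, "anger" and "neutral" have support 1.
y_true = ["joy", "joy", "anger", "neutral"]
y_pred = ["joy", "anger", "anger", "neutral"]
print(weighted_f1(y_true, y_pred))  # → 0.75
```

This matches what `sklearn.metrics.f1_score(..., average="weighted")` computes, which is the usual implementation behind the F1 column in training tables like the one above.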