---
library_name: transformers
language:
- jpn
license: mit
base_model: pyannote/segmentation-3.0
tags:
- speaker-diarization
- speaker-segmentation
- generated_from_trainer
datasets:
- diarizers-community/synthetic-speaker-diarization-dataset
model-index:
- name: synthetic-speaker-jpn
  results: []
---


# synthetic-speaker-jpn

This model is a fine-tuned version of [pyannote/segmentation-3.0](https://huggingface.co/pyannote/segmentation-3.0) on the [diarizers-community/synthetic-speaker-diarization-dataset](https://huggingface.co/datasets/diarizers-community/synthetic-speaker-diarization-dataset).
It achieves the following results on the evaluation set:
- Loss: 0.3546
- Model Preparation Time: 0.0018
- DER: 0.1098
- False Alarm: 0.0178
- Missed Detection: 0.0198
- Confusion: 0.0722
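
As a consistency check, the reported DER is the sum of its three components:

$$
\text{DER} = \text{False Alarm} + \text{Missed Detection} + \text{Confusion} = 0.0178 + 0.0198 + 0.0722 = 0.1098
$$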

## Model description

This is a speaker-segmentation model for Japanese speech, obtained by fine-tuning [pyannote/segmentation-3.0](https://huggingface.co/pyannote/segmentation-3.0) on a synthetic speaker-diarization dataset. It predicts frame-level speaker activity on short audio chunks and is meant to serve as the segmentation component of a pyannote speaker-diarization pipeline.

## Intended uses & limitations

The model is intended to replace the segmentation model inside a pyannote speaker-diarization pipeline when diarizing Japanese audio. Because it was fine-tuned on synthetic data, performance on real-world recordings may differ from the evaluation figures reported above.
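
This card's `generated_from_trainer` tag and the diarizers-community dataset suggest the checkpoint was produced with the [diarizers](https://github.com/huggingface/diarizers) library. The snippet below is a minimal usage sketch under that assumption: the repository id `<your-namespace>/synthetic-speaker-jpn` is a placeholder, and swapping the fine-tuned segmentation model into a pyannote pipeline follows the pattern documented by diarizers.

```python
import torch
from diarizers import SegmentationModel
from pyannote.audio import Pipeline

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load the stock pyannote diarization pipeline (gated model: requires accepting
# its terms on the Hugging Face Hub and passing a valid access token).
pipeline = Pipeline.from_pretrained(
    "pyannote/speaker-diarization-3.1", use_auth_token=True
)

# Swap in the fine-tuned segmentation checkpoint.
# "<your-namespace>/synthetic-speaker-jpn" is a placeholder repository id.
segmentation = SegmentationModel().from_pretrained("<your-namespace>/synthetic-speaker-jpn")
pipeline._segmentation.model = segmentation.to_pyannote_model().to(device)

# Diarize a local recording and print speaker turns.
diarization = pipeline("audio.wav")
for turn, _, speaker in diarization.itertracks(yield_label=True):
    print(f"{turn.start:.1f}s - {turn.end:.1f}s: {speaker}")
```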

## Training and evaluation data

The model was fine-tuned and evaluated on [diarizers-community/synthetic-speaker-diarization-dataset](https://huggingface.co/datasets/diarizers-community/synthetic-speaker-diarization-dataset); per this card's language tag, the Japanese portion of the dataset was used.
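
A hedged sketch for inspecting the dataset with Hugging Face `datasets`; the `"jpn"` configuration name is an assumption based on this card's language tag, not something the card itself confirms:

```python
from datasets import load_dataset

# Config name "jpn" is assumed from the model's language tag; adjust if the
# dataset exposes its Japanese split under a different configuration name.
ds = load_dataset("diarizers-community/synthetic-speaker-diarization-dataset", "jpn")

print(ds)                          # available splits
print(ds["train"].column_names)    # inspect the schema (audio, speaker labels, timestamps)
```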

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch implementation, `adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 20
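
The actual training script is not included in this card; the sketch below only shows how the listed values would map onto Hugging Face `TrainingArguments` (the `output_dir` and any unlisted arguments are placeholders, not values known to have been used):

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="synthetic-speaker-jpn",  # placeholder
    learning_rate=1e-3,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=20,
)
```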

### Training results

| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | DER    | False Alarm | Missed Detection | Confusion |
|:-------------:|:-----:|:----:|:---------------:|:----------------------:|:------:|:-----------:|:----------------:|:---------:|
| 0.4222        | 1.0   | 198  | 0.4012          | 0.0018                 | 0.1316 | 0.0195      | 0.0240           | 0.0880    |
| 0.3767        | 2.0   | 396  | 0.3893          | 0.0018                 | 0.1267 | 0.0176      | 0.0237           | 0.0854    |
| 0.3784        | 3.0   | 594  | 0.3935          | 0.0018                 | 0.1233 | 0.0172      | 0.0232           | 0.0829    |
| 0.3596        | 4.0   | 792  | 0.3747          | 0.0018                 | 0.1216 | 0.0192      | 0.0204           | 0.0820    |
| 0.352         | 5.0   | 990  | 0.3807          | 0.0018                 | 0.1231 | 0.0184      | 0.0207           | 0.0840    |
| 0.3111        | 6.0   | 1188 | 0.3585          | 0.0018                 | 0.1134 | 0.0183      | 0.0203           | 0.0748    |
| 0.3139        | 7.0   | 1386 | 0.3460          | 0.0018                 | 0.1123 | 0.0181      | 0.0202           | 0.0740    |
| 0.3176        | 8.0   | 1584 | 0.3610          | 0.0018                 | 0.1134 | 0.0184      | 0.0198           | 0.0752    |
| 0.3142        | 9.0   | 1782 | 0.3542          | 0.0018                 | 0.1127 | 0.0172      | 0.0211           | 0.0745    |
| 0.2834        | 10.0  | 1980 | 0.3485          | 0.0018                 | 0.1116 | 0.0178      | 0.0201           | 0.0737    |
| 0.2875        | 11.0  | 2178 | 0.3537          | 0.0018                 | 0.1095 | 0.0174      | 0.0204           | 0.0717    |
| 0.2704        | 12.0  | 2376 | 0.3582          | 0.0018                 | 0.1111 | 0.0177      | 0.0201           | 0.0733    |
| 0.2802        | 13.0  | 2574 | 0.3589          | 0.0018                 | 0.1106 | 0.0177      | 0.0200           | 0.0728    |
| 0.2577        | 14.0  | 2772 | 0.3547          | 0.0018                 | 0.1102 | 0.0180      | 0.0198           | 0.0725    |
| 0.261         | 15.0  | 2970 | 0.3511          | 0.0018                 | 0.1086 | 0.0181      | 0.0196           | 0.0709    |
| 0.2647        | 16.0  | 3168 | 0.3544          | 0.0018                 | 0.1096 | 0.0182      | 0.0194           | 0.0719    |
| 0.2554        | 17.0  | 3366 | 0.3537          | 0.0018                 | 0.1093 | 0.0174      | 0.0202           | 0.0717    |
| 0.2624        | 18.0  | 3564 | 0.3547          | 0.0018                 | 0.1095 | 0.0178      | 0.0199           | 0.0718    |
| 0.2667        | 19.0  | 3762 | 0.3542          | 0.0018                 | 0.1098 | 0.0178      | 0.0198           | 0.0722    |
| 0.2613        | 20.0  | 3960 | 0.3546          | 0.0018                 | 0.1098 | 0.0178      | 0.0198           | 0.0722    |


### Framework versions

- Transformers 4.50.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1