Commit 8d8a4e7
Parent(s): 2aa4505
Add detailed results and logs.
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/lid_inference_test.log +300 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/results +1039 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/lid_inference_test.log +286 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/results +946 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_ml_superb2_lang_cross_train_all_no_filter_lang/lid_inference_test.log +302 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_ml_superb2_lang_cross_train_all_no_filter_lang/results +0 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_voxlingua107_lang_cross_train_all_no_filter_lang/lid_inference_test.log +280 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_voxlingua107_lang_cross_train_all_no_filter_lang/results +126 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/test_fleurs_lang_cross_train_all_no_filter_lang/lid_inference_test.log +356 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/test_fleurs_lang_cross_train_all_no_filter_lang/results +0 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/test_voxpopuli_lang_cross_train_all_no_filter_lang/lid_inference_test.log +295 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/test_voxpopuli_lang_cross_train_all_no_filter_lang/results +197 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/lid_inference_test.log +300 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.1.log +390 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.2.log +441 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.3.log +460 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.4.log +0 -0
- exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.log +388 -0
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/lid_inference_test.log
ADDED
@@ -0,0 +1,300 @@
# python3 -m espnet2.bin.lid_inference_dist --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
# Started at Mon Jun 2 02:37:15 CDT 2025
#
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_inference_dist.py --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
[gpue04] 2025-06-02 02:37:35,038 (abs_task:2406) INFO: config file: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
  torchaudio.set_audio_backend("sox_io")
/work/nvme/bbjs/qwang20/espnet/espnet2/tasks/abs_task.py:2429: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  torch.load(model_file, map_location=device),
[gpue04] 2025-06-02 02:37:46,607 (lid_inference_dist:86) INFO: Model structure:
ESPnetLIDUpstreamConditionModel(
  (frontend): S3prlFrontendCondition(
    (upstream): S3PRLUpstreamCondition(
      (upstream): UpstreamExpertCondition(
        (model): Wav2Vec2ModelCondition(
          (feature_extractor): Wav2Vec2FeatureEncoder(
            (conv_layers): ModuleList(
              (0): Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (1-4): 4 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (5-6): 2 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
            )
          )
          (feature_projection): Wav2Vec2FeatureProjection(
            (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
            (projection): Linear(in_features=512, out_features=1280, bias=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (encoder): Wav2Vec2EncoderCondition(
            (pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
              (conv): ParametrizedConv1d(
                1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
                (parametrizations): ModuleDict(
                  (weight): ParametrizationList(
                    (0): _WeightNorm()
                  )
                )
              )
              (padding): Wav2Vec2SamePadLayer()
              (activation): GELUActivation()
            )
            (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
            (layers): ModuleList(
              (0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
                (attention): Wav2Vec2SdpaAttention(
                  (k_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (v_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (q_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (out_proj): Linear(in_features=1280, out_features=1280, bias=True)
                )
                (dropout): Dropout(p=0.1, inplace=False)
                (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
                (feed_forward): Wav2Vec2FeedForward(
                  (intermediate_dropout): Dropout(p=0.0, inplace=False)
                  (intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
                  (intermediate_act_fn): GELUActivation()
                  (output_dense): Linear(in_features=5120, out_features=1280, bias=True)
                  (output_dropout): Dropout(p=0.1, inplace=False)
                )
                (final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
              )
            )
            (ecapa_encoder): ModuleDict(
              (32): IdentityEncoder()
              (36): IdentityEncoder()
              (40): IdentityEncoder()
              (44): IdentityEncoder()
            )
            (pooling): ModuleDict(
              (32): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (36): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (40): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (44): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
            )
            (projector): ModuleDict(
              (32): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (36): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (40): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (44): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
            )
            (lang2vec_head): ModuleDict(
              (32): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (36): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (40): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (44): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
            )
            (aamsoftmax_weight): ParameterDict()
            (lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
            (aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
              (ce): CrossEntropyLoss()
              (lang2vec_head): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (lang2vec_loss): MSELoss()
            )
          )
        )
      )
    )
    (featurizer): Featurizer()
  )
  (normalize): UtteranceMVN(norm_means=True, norm_vars=False)
  (encoder): EcapaTdnnEncoder(
    (conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
    (relu): ReLU()
    (bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (layer1): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer2): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer3): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
    (mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
  )
  (pooling): ChnAttnStatPooling(
    (attention): Sequential(
      (0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
      (1): ReLU()
      (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
    )
    (softmax): Softmax(dim=2)
  )
  (projector): RawNet3Projector(
    (bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (fc): Linear(in_features=3072, out_features=192, bias=True)
  )
  (loss): AAMSoftmaxSCTopKLang2Vec(
    (ce): CrossEntropyLoss()
    (lang2vec_head): Sequential(
      (0): Linear(in_features=192, out_features=299, bias=True)
    )
    (lang2vec_loss): MSELoss()
  )
)

Model summary:
    Class Name: ESPnetLIDUpstreamConditionModel
    Total Number of model parameters: 977.14 M
    Number of trainable parameters: 977.14 M (100.0%)
    Size: 3.91 GB
    Type: torch.float32
/u/qwang20/miniconda3/envs/espnet2/lib/python3.11/site-packages/torch/utils/data/dataloader.py:557: UserWarning: This DataLoader will create 32 worker processes in total. Our suggested max number of worker in current system is 16, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
  warnings.warn(_create_warning_msg(
/work/nvme/bbjs/qwang20/espnet/espnet2/train/reporter.py:321: UserWarning: The stats of the previous epoch=-1doesn't exist.
  warnings.warn(
[gpue04] 2025-06-02 02:37:47,156 (lid_trainer:102) INFO: [Rank 0] Resume: 0 utterances found in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/lids0
[gpue04] 2025-06-02 02:38:41,828 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 0
[gpue04] 2025-06-02 02:39:27,483 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 1
[gpue04] 2025-06-02 02:40:15,909 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 2
[gpue04] 2025-06-02 02:41:08,571 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 3
[gpue04] 2025-06-02 02:41:56,182 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 4
[gpue04] 2025-06-02 02:42:40,736 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 5
[gpue04] 2025-06-02 02:43:27,814 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 6
[gpue04] 2025-06-02 02:44:10,740 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 7
[gpue04] 2025-06-02 02:44:52,065 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 8
[gpue04] 2025-06-02 02:45:40,635 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 9
[gpue04] 2025-06-02 02:46:28,394 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 10
[gpue04] 2025-06-02 02:47:09,502 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 11
[gpue04] 2025-06-02 02:47:59,978 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 12
[gpue04] 2025-06-02 02:48:52,866 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 13
[gpue04] 2025-06-02 02:49:41,279 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 14
[gpue04] 2025-06-02 02:50:32,817 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 15
[gpue04] 2025-06-02 02:51:20,444 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 16
[gpue04] 2025-06-02 02:52:09,714 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 17
[gpue04] 2025-06-02 02:52:55,108 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 18
[gpue04] 2025-06-02 02:53:50,212 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 19
[gpue04] 2025-06-02 02:54:31,533 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 20
[gpue04] 2025-06-02 02:55:19,223 (lid_inference_dist:200) INFO: args.save_embd_per_utt: True
[gpue04] 2025-06-02 02:55:19,224 (lid_inference_dist:215) INFO: args.save_tsne_plot: False
# Accounting: time=1085 threads=1
# Ended (code 0) at Mon Jun 2 02:55:20 CDT 2025, elapsed time 1085 seconds
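As a quick sanity check on the summary in the log above (this check is not part of the log itself), the reported 3.91 GB matches 977.14 M float32 parameters at 4 bytes each, with GB read as decimal gigabytes:

```python
# 977.14 M parameters, 4 bytes per float32 parameter,
# reported in decimal gigabytes (1 GB = 1e9 bytes).
n_params = 977.14e6
size_gb = n_params * 4 / 1e9
print(round(size_gb, 2))  # 3.91
```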
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/results
ADDED
@@ -0,0 +1,1039 @@
Accuracy: 95.38%
Macro Accuracy: 95.44%
Accuracy per Language:
amh: 95.81%
tam: 96.86%
luo: 95.22%
tpi: 88.69%
ibo: 96.11%
ben: 92.18%
lao: 97.22%
swa: 90.31%
asm: 97.21%
gug: 95.29%
kaz: 94.59%
pus: 98.80%
tgl: 95.10%
hat: 97.81%
jav: 91.76%
zul: 97.44%
vie: 96.39%
kmr: 94.60%
kat: 96.10%
tur: 98.39%
ceb: 91.89%
yue: 100.00%
khk: 94.71%
lit: 98.77%
tel: 94.88%
Key: amh_18766_A_20140725_193025_031291, Target: amh, Predicted: tpi
Key: amh_16601_A_20140616_191918_057010, Target: amh, Predicted: tel
Key: amh_19621_B_20140517_232031_046443, Target: amh, Predicted: pus
Key: amh_19782_A_20140702_230513_056385, Target: amh, Predicted: tel
Key: amh_41741_A_20140422_000845_000000, Target: amh, Predicted: kat
Key: amh_46625_B_20140414_224528_000000, Target: amh, Predicted: tgl
Key: amh_46625_B_20140414_224528_011634, Target: amh, Predicted: tel
Key: amh_47799_A_20140902_200301_004751, Target: amh, Predicted: khk
Key: amh_41741_A_20140422_000845_021144, Target: amh, Predicted: ben
Key: amh_44961_A_20140421_215913_034469, Target: amh, Predicted: tel
Key: amh_42883_A_20140823_230118_001930, Target: amh, Predicted: asm
Key: amh_44961_A_20140421_215913_040626, Target: amh, Predicted: tur
Key: amh_44961_A_20140421_215913_048405, Target: amh, Predicted: tel
Key: amh_47799_A_20140902_200301_028605, Target: amh, Predicted: ibo
Key: amh_61011_B_20140415_180846_022820, Target: amh, Predicted: tpi
Key: amh_60498_A_20140823_192847_039762, Target: amh, Predicted: kaz
Key: amh_69633_A_20140607_233440_058823, Target: amh, Predicted: tgl
Key: amh_69633_A_20140607_233440_001199, Target: amh, Predicted: tgl
Key: amh_64870_A_20140518_011602_000000, Target: amh, Predicted: pus
Key: amh_69633_A_20140607_233440_027832, Target: amh, Predicted: kaz
Key: amh_69633_A_20140607_233440_052352, Target: amh, Predicted: ceb
Key: amh_69633_A_20140607_233440_054267, Target: amh, Predicted: tel
Key: amh_85439_A_20140814_215435_004561, Target: amh, Predicted: kmr
Key: amh_81553_A_20140707_003952_001198, Target: amh, Predicted: kmr
Key: amh_73757_A_20140512_231155_058241, Target: amh, Predicted: ibo
Key: amh_89888_B_20140520_191659_037281, Target: amh, Predicted: ben
Key: amh_93320_A_20140823_214255_040946, Target: amh, Predicted: swa
Key: amh_93320_A_20140823_214255_050913, Target: amh, Predicted: tel
Key: amh_89888_B_20140520_191659_020529, Target: amh, Predicted: kaz
Key: amh_89888_B_20140520_191659_058819, Target: amh, Predicted: zul
Key: amh_95124_A_20140828_224047_058345, Target: amh, Predicted: khk
Key: amh_95124_A_20140828_224047_059534, Target: amh, Predicted: gug
Key: amh_95124_A_20140828_224047_022900, Target: amh, Predicted: kat
Key: amh_95124_A_20140828_224047_034153, Target: amh, Predicted: kat
Key: amh_94002_A_20140511_172143_000793, Target: amh, Predicted: tgl
Key: amh_95124_A_20140828_224047_038927, Target: amh, Predicted: lit
Key: amh_96940_B_20140901_181148_007703, Target: amh, Predicted: kaz
|
| 66 |
+
Key: amh_94237_A_20140814_181922_050462, Target: amh, Predicted: hat
|
| 67 |
+
Key: amh_94237_A_20140814_181922_051539, Target: amh, Predicted: tam
|
| 68 |
+
Key: amh_94237_A_20140814_181922_058366, Target: amh, Predicted: asm
|
| 69 |
+
Key: amh_94002_A_20140511_172143_023305, Target: amh, Predicted: tel
|
| 70 |
+
Key: amh_95124_A_20140828_224047_004869, Target: amh, Predicted: kat
|
| 71 |
+
Key: amh_96940_B_20140901_181148_039855, Target: amh, Predicted: kaz
|
| 72 |
+
Key: amh_98506_A_20140807_170934_060854, Target: amh, Predicted: kaz
|
| 73 |
+
Key: asm_34446_B_20120426_195519_020649, Target: asm, Predicted: ben
|
| 74 |
+
Key: asm_33969_B_20130123_165132_045069, Target: asm, Predicted: tam
|
| 75 |
+
Key: asm_33704_A_20130204_172729_034778, Target: asm, Predicted: tam
|
| 76 |
+
Key: asm_33704_A_20130204_172729_049460, Target: asm, Predicted: ben
|
| 77 |
+
Key: asm_43587_A_20120607_204145_034715, Target: asm, Predicted: tel
|
| 78 |
+
Key: asm_40385_A_20121224_164959_041220, Target: asm, Predicted: tel
|
| 79 |
+
Key: asm_40385_B_20121224_164959_020689, Target: asm, Predicted: tgl
|
| 80 |
+
Key: asm_46593_B_20121010_023019_043252, Target: asm, Predicted: ben
|
| 81 |
+
Key: asm_46593_B_20121010_023019_046408, Target: asm, Predicted: ben
|
| 82 |
+
Key: asm_47429_A_20130121_172000_012339, Target: asm, Predicted: vie
|
| 83 |
+
Key: asm_59544_B_20120401_222134_013082, Target: asm, Predicted: ben
|
| 84 |
+
Key: asm_80856_A_20120423_184225_031581, Target: asm, Predicted: ben
|
| 85 |
+
Key: asm_79519_B_20121008_214502_049044, Target: asm, Predicted: tgl
|
| 86 |
+
Key: asm_66668_B_20120409_185702_056020, Target: asm, Predicted: tel
|
| 87 |
+
Key: asm_80856_A_20120423_184225_058863, Target: asm, Predicted: tgl
|
| 88 |
+
Key: asm_87885_A_20121113_193407_023378, Target: asm, Predicted: ben
|
| 89 |
+
Key: asm_87885_A_20121113_193407_024567, Target: asm, Predicted: tel
|
| 90 |
+
Key: asm_87671_B_20120401_172420_054685, Target: asm, Predicted: ben
|
| 91 |
+
Key: asm_87885_A_20121113_193407_044881, Target: asm, Predicted: ben
|
| 92 |
+
Key: asm_87885_A_20121113_193407_007808, Target: asm, Predicted: kmr
|
| 93 |
+
Key: asm_87885_A_20121113_193407_014210, Target: asm, Predicted: kmr
|
| 94 |
+
Key: ben_10576_A_20111221_214850_004672, Target: ben, Predicted: asm
|
| 95 |
+
Key: ben_10576_A_20111221_214850_016340, Target: ben, Predicted: asm
|
| 96 |
+
Key: ben_10576_A_20111221_214850_030232, Target: ben, Predicted: tel
|
| 97 |
+
Key: ben_10576_A_20111221_214850_036179, Target: ben, Predicted: asm
|
| 98 |
+
Key: ben_10569_B_20111221_201913_002481, Target: ben, Predicted: asm
|
| 99 |
+
Key: ben_10576_A_20111221_214850_050139, Target: ben, Predicted: asm
|
| 100 |
+
Key: ben_24810_B_20120114_225518_016801, Target: ben, Predicted: yue
|
| 101 |
+
Key: ben_21203_A_20120523_225358_000338, Target: ben, Predicted: asm
|
| 102 |
+
Key: ben_21203_A_20120523_225358_012402, Target: ben, Predicted: asm
|
| 103 |
+
Key: ben_27912_B_20120123_185402_005366, Target: ben, Predicted: asm
|
| 104 |
+
Key: ben_27912_B_20120123_185402_013907, Target: ben, Predicted: lao
|
| 105 |
+
Key: ben_27912_B_20120123_185402_040188, Target: ben, Predicted: asm
|
| 106 |
+
Key: ben_38382_B_20120110_013824_008463, Target: ben, Predicted: tel
|
| 107 |
+
Key: ben_38382_B_20120110_013824_009617, Target: ben, Predicted: tel
|
| 108 |
+
Key: ben_38382_B_20120110_013824_015051, Target: ben, Predicted: asm
|
| 109 |
+
Key: ben_40114_A_20120122_183602_035788, Target: ben, Predicted: asm
|
| 110 |
+
Key: ben_40114_A_20120122_183602_049079, Target: ben, Predicted: asm
|
| 111 |
+
Key: ben_44799_B_20120131_222925_044707, Target: ben, Predicted: asm
|
| 112 |
+
Key: ben_40114_B_20120122_183602_002987, Target: ben, Predicted: tel
|
| 113 |
+
Key: ben_40114_B_20120122_183602_025113, Target: ben, Predicted: asm
|
| 114 |
+
Key: ben_50583_B_20120114_233345_025147, Target: ben, Predicted: tel
|
| 115 |
+
Key: ben_50910_B_20120122_001708_020387, Target: ben, Predicted: asm
|
| 116 |
+
Key: ben_50910_B_20120122_001708_050086, Target: ben, Predicted: gug
|
| 117 |
+
Key: ben_44799_A_20120131_222925_022756, Target: ben, Predicted: asm
|
| 118 |
+
Key: ben_44799_A_20120131_222925_025503, Target: ben, Predicted: asm
|
| 119 |
+
Key: ben_44799_A_20120131_222925_031017, Target: ben, Predicted: asm
|
| 120 |
+
Key: ben_53805_B_20120126_211949_044950, Target: ben, Predicted: tel
|
| 121 |
+
Key: ben_62169_A_20120304_153842_051418, Target: ben, Predicted: asm
|
| 122 |
+
Key: ben_53805_B_20120126_211949_048578, Target: ben, Predicted: asm
|
| 123 |
+
Key: ben_53805_B_20120126_211949_054532, Target: ben, Predicted: tel
|
| 124 |
+
Key: ben_57721_A_20120531_194610_023753, Target: ben, Predicted: asm
|
| 125 |
+
Key: ben_52845_B_20120126_200807_030406, Target: ben, Predicted: asm
|
| 126 |
+
Key: ben_52845_B_20120126_200807_034210, Target: ben, Predicted: asm
|
| 127 |
+
Key: ben_62038_B_20111230_004215_016225, Target: ben, Predicted: yue
|
| 128 |
+
Key: ben_53805_A_20120126_211949_037154, Target: ben, Predicted: yue
|
| 129 |
+
Key: ben_62169_A_20120304_153842_019993, Target: ben, Predicted: asm
|
| 130 |
+
Key: ben_63220_A_20120514_232049_025353, Target: ben, Predicted: asm
|
| 131 |
+
Key: ben_62169_A_20120304_153842_024495, Target: ben, Predicted: asm
|
| 132 |
+
Key: ben_62169_A_20120304_153842_027469, Target: ben, Predicted: asm
|
| 133 |
+
Key: ben_62169_A_20120304_153842_039193, Target: ben, Predicted: asm
|
| 134 |
+
Key: ben_62169_A_20120304_153842_041729, Target: ben, Predicted: asm
|
| 135 |
+
Key: ben_63220_B_20120514_232049_020791, Target: ben, Predicted: asm
|
| 136 |
+
Key: ben_63220_B_20120514_232049_021921, Target: ben, Predicted: asm
|
| 137 |
+
Key: ben_65895_A_20120229_202918_036080, Target: ben, Predicted: tel
|
| 138 |
+
Key: ben_65895_A_20120229_202918_046912, Target: ben, Predicted: asm
|
| 139 |
+
Key: ben_66313_B_20120229_230907_037485, Target: ben, Predicted: tel
|
| 140 |
+
Key: ben_65895_A_20120229_202918_011382, Target: ben, Predicted: asm
|
| 141 |
+
Key: ben_80875_A_20120522_224314_028055, Target: ben, Predicted: asm
|
| 142 |
+
Key: ben_86207_B_20120127_145936_022109, Target: ben, Predicted: asm
|
| 143 |
+
Key: ben_80875_A_20120522_224314_031448, Target: ben, Predicted: asm
|
| 144 |
+
Key: ben_80875_A_20120522_224314_033805, Target: ben, Predicted: asm
|
| 145 |
+
Key: ben_80875_A_20120522_224314_037820, Target: ben, Predicted: asm
|
| 146 |
+
Key: ben_80875_A_20120522_224314_044450, Target: ben, Predicted: tel
|
| 147 |
+
Key: ben_80875_A_20120522_224314_050018, Target: ben, Predicted: asm
|
| 148 |
+
Key: ben_81773_B_20120101_024120_043949, Target: ben, Predicted: asm
|
| 149 |
+
Key: ben_80875_A_20120522_224314_011161, Target: ben, Predicted: tel
|
| 150 |
+
Key: ben_91275_A_20120529_195749_013758, Target: ben, Predicted: asm
|
| 151 |
+
Key: ben_91275_A_20120529_195749_014937, Target: ben, Predicted: asm
|
| 152 |
+
Key: ben_91275_A_20120529_195749_018687, Target: ben, Predicted: asm
|
| 153 |
+
Key: ben_91275_A_20120529_195749_023140, Target: ben, Predicted: tel
|
| 154 |
+
Key: ceb_15638_B_20131210_131327_018092, Target: ceb, Predicted: tgl
|
| 155 |
+
Key: ben_91275_A_20120529_195749_025289, Target: ben, Predicted: asm
|
| 156 |
+
Key: ben_93273_B_20120123_022109_041146, Target: ben, Predicted: tel
|
| 157 |
+
Key: ben_91275_A_20120529_195749_043069, Target: ben, Predicted: asm
|
| 158 |
+
Key: ben_95826_A_20120201_001701_020909, Target: ben, Predicted: yue
|
| 159 |
+
Key: ben_95826_B_20120201_001701_006650, Target: ben, Predicted: asm
|
| 160 |
+
Key: ceb_14141_B_20140118_202248_001941, Target: ceb, Predicted: lao
|
| 161 |
+
Key: ceb_14141_B_20140118_202248_008492, Target: ceb, Predicted: jav
|
| 162 |
+
Key: ceb_14141_B_20140118_202248_015284, Target: ceb, Predicted: amh
|
| 163 |
+
Key: ceb_14141_B_20140118_202248_016622, Target: ceb, Predicted: tgl
|
| 164 |
+
Key: ceb_15262_A_20131105_213812_038869, Target: ceb, Predicted: asm
|
| 165 |
+
Key: ceb_21109_A_20140102_180619_050237, Target: ceb, Predicted: tgl
|
| 166 |
+
Key: ceb_17881_B_20140122_201653_009579, Target: ceb, Predicted: tel
|
| 167 |
+
Key: ceb_22466_A_20131015_174457_021603, Target: ceb, Predicted: tgl
|
| 168 |
+
Key: ceb_17881_B_20140122_201653_034245, Target: ceb, Predicted: asm
|
| 169 |
+
Key: ceb_21109_A_20140102_180619_017721, Target: ceb, Predicted: jav
|
| 170 |
+
Key: ceb_22466_A_20131015_174457_022828, Target: ceb, Predicted: tgl
|
| 171 |
+
Key: ceb_21109_A_20140102_180619_018853, Target: ceb, Predicted: jav
|
| 172 |
+
Key: ceb_22466_A_20131015_174457_025528, Target: ceb, Predicted: tgl
|
| 173 |
+
Key: ceb_21109_A_20140102_180619_019994, Target: ceb, Predicted: jav
|
| 174 |
+
Key: ceb_22466_A_20131015_174457_031272, Target: ceb, Predicted: tgl
|
| 175 |
+
Key: ceb_22466_A_20131015_174457_033555, Target: ceb, Predicted: tgl
|
| 176 |
+
Key: ceb_21109_A_20140102_180619_024469, Target: ceb, Predicted: jav
|
| 177 |
+
Key: ceb_22466_A_20131015_174457_052722, Target: ceb, Predicted: tgl
|
| 178 |
+
Key: ceb_21109_A_20140102_180619_025655, Target: ceb, Predicted: jav
|
| 179 |
+
Key: ceb_22466_B_20131015_174457_001633, Target: ceb, Predicted: kmr
|
| 180 |
+
Key: ceb_21109_A_20140102_180619_031392, Target: ceb, Predicted: tgl
|
| 181 |
+
Key: ceb_22466_B_20131015_174457_045431, Target: ceb, Predicted: tgl
|
| 182 |
+
Key: ceb_22466_B_20131015_174457_051524, Target: ceb, Predicted: tgl
|
| 183 |
+
Key: ceb_21109_A_20140102_180619_040169, Target: ceb, Predicted: jav
|
| 184 |
+
Key: ceb_21109_A_20140102_180619_041294, Target: ceb, Predicted: jav
|
| 185 |
+
Key: ceb_21109_A_20140102_180619_042402, Target: ceb, Predicted: jav
|
| 186 |
+
Key: ceb_38340_B_20131128_145618_035396, Target: ceb, Predicted: asm
|
| 187 |
+
Key: ceb_38340_B_20131128_145618_044704, Target: ceb, Predicted: tgl
|
| 188 |
+
Key: ceb_36059_B_20140118_204512_003449, Target: ceb, Predicted: tgl
|
| 189 |
+
Key: ceb_38340_B_20131128_145618_050471, Target: ceb, Predicted: tgl
|
| 190 |
+
Key: ceb_38340_B_20131128_145618_001728, Target: ceb, Predicted: tgl
|
| 191 |
+
Key: ceb_38340_B_20131128_145618_028374, Target: ceb, Predicted: tgl
|
| 192 |
+
Key: ceb_43646_A_20131019_165638_004395, Target: ceb, Predicted: tgl
|
| 193 |
+
Key: ceb_50565_B_20131025_202729_012748, Target: ceb, Predicted: asm
|
| 194 |
+
Key: ceb_43646_A_20131019_165638_019162, Target: ceb, Predicted: tgl
|
| 195 |
+
Key: ceb_43646_A_20131019_165638_027625, Target: ceb, Predicted: asm
|
| 196 |
+
Key: ceb_51530_B_20140125_195307_042590, Target: ceb, Predicted: tgl
|
| 197 |
+
Key: ceb_51530_B_20140125_195307_043726, Target: ceb, Predicted: tgl
|
| 198 |
+
Key: ceb_51530_B_20140125_195307_055117, Target: ceb, Predicted: hat
|
| 199 |
+
Key: ceb_56370_A_20131101_175739_018773, Target: ceb, Predicted: lao
|
| 200 |
+
Key: ceb_56370_B_20131101_175739_043790, Target: ceb, Predicted: tgl
|
| 201 |
+
Key: ceb_54744_B_20131202_184432_002469, Target: ceb, Predicted: asm
|
| 202 |
+
Key: ceb_54744_B_20131202_184432_003641, Target: ceb, Predicted: asm
|
| 203 |
+
Key: ceb_60299_A_20140202_130806_026932, Target: ceb, Predicted: lao
|
| 204 |
+
Key: ceb_60299_A_20140202_130806_030919, Target: ceb, Predicted: tam
|
| 205 |
+
Key: ceb_54744_B_20131202_184432_014262, Target: ceb, Predicted: tgl
|
| 206 |
+
Key: ceb_60299_A_20140202_130806_047310, Target: ceb, Predicted: tgl
|
| 207 |
+
Key: ceb_54744_B_20131202_184432_036018, Target: ceb, Predicted: asm
|
| 208 |
+
Key: ceb_60299_A_20140202_130806_053018, Target: ceb, Predicted: asm
|
| 209 |
+
Key: ceb_54744_B_20131202_184432_044887, Target: ceb, Predicted: asm
|
| 210 |
+
Key: ceb_56370_A_20131101_175739_004673, Target: ceb, Predicted: tgl
|
| 211 |
+
Key: ceb_81427_A_20131126_151401_058032, Target: ceb, Predicted: tel
|
| 212 |
+
Key: ceb_84611_A_20131125_193454_001166, Target: ceb, Predicted: tgl
|
| 213 |
+
Key: ceb_79660_A_20140201_160331_000129, Target: ceb, Predicted: tgl
|
| 214 |
+
Key: ceb_74455_A_20140115_152935_051492, Target: ceb, Predicted: ben
|
| 215 |
+
Key: ceb_74455_B_20140115_152935_007394, Target: ceb, Predicted: tgl
|
| 216 |
+
Key: ceb_74455_B_20140115_152935_015341, Target: ceb, Predicted: tgl
|
| 217 |
+
Key: ceb_79660_A_20140201_160331_046537, Target: ceb, Predicted: vie
|
| 218 |
+
Key: ceb_86467_A_20131112_182159_030337, Target: ceb, Predicted: tgl
|
| 219 |
+
Key: ceb_86467_B_20131112_193636_008827, Target: ceb, Predicted: tgl
|
| 220 |
+
Key: ceb_86467_B_20131112_193636_017008, Target: ceb, Predicted: tgl
|
| 221 |
+
Key: ceb_85179_A_20131227_172225_003961, Target: ceb, Predicted: tgl
|
| 222 |
+
Key: ceb_96985_A_20131021_164130_003454, Target: ceb, Predicted: tgl
|
| 223 |
+
Key: ceb_96985_A_20131021_164130_042953, Target: ceb, Predicted: tam
|
| 224 |
+
Key: ceb_98489_A_20131123_233440_004829, Target: ceb, Predicted: tgl
|
| 225 |
+
Key: ceb_85179_A_20131227_172225_021268, Target: ceb, Predicted: lao
|
| 226 |
+
Key: gug_21624_A_20150222_054542_006999, Target: gug, Predicted: tel
|
| 227 |
+
Key: gug_21624_A_20150222_054542_008195, Target: gug, Predicted: tel
|
| 228 |
+
Key: gug_21624_A_20150222_054542_021054, Target: gug, Predicted: tel
|
| 229 |
+
Key: gug_21624_A_20150222_054542_023373, Target: gug, Predicted: tel
|
| 230 |
+
Key: gug_21004_B_20150217_083755_046019, Target: gug, Predicted: tpi
|
| 231 |
+
Key: gug_21004_B_20150217_083755_048475, Target: gug, Predicted: ibo
|
| 232 |
+
Key: gug_23006_A_20140807_062702_004252, Target: gug, Predicted: luo
|
| 233 |
+
Key: gug_39555_A_20141023_010258_027629, Target: gug, Predicted: lao
|
| 234 |
+
Key: gug_41685_A_20150320_083024_019491, Target: gug, Predicted: tel
|
| 235 |
+
Key: gug_41685_A_20150320_083024_050188, Target: gug, Predicted: tel
|
| 236 |
+
Key: gug_23006_B_20140807_062702_021561, Target: gug, Predicted: zul
|
| 237 |
+
Key: gug_43395_B_20150303_092614_017102, Target: gug, Predicted: vie
|
| 238 |
+
Key: gug_43395_B_20150303_092614_043917, Target: gug, Predicted: lao
|
| 239 |
+
Key: gug_50810_B_20140619_063147_011354, Target: gug, Predicted: lao
|
| 240 |
+
Key: gug_50810_B_20140619_063147_023949, Target: gug, Predicted: jav
|
| 241 |
+
Key: gug_44619_B_20140621_050143_005200, Target: gug, Predicted: lao
|
| 242 |
+
Key: gug_50810_B_20140619_063147_034848, Target: gug, Predicted: tur
|
| 243 |
+
Key: gug_50090_A_20150206_002321_001260, Target: gug, Predicted: tam
|
| 244 |
+
Key: gug_50090_B_20150206_002321_026694, Target: gug, Predicted: tur
|
| 245 |
+
Key: gug_56019_A_20150221_084856_048093, Target: gug, Predicted: tel
|
| 246 |
+
Key: gug_56019_A_20150221_084856_054819, Target: gug, Predicted: tel
|
| 247 |
+
Key: gug_56019_A_20150221_084856_012820, Target: gug, Predicted: tel
|
| 248 |
+
Key: gug_56019_A_20150221_084856_014014, Target: gug, Predicted: tel
|
| 249 |
+
Key: gug_53441_A_20140612_055846_030438, Target: gug, Predicted: tpi
|
| 250 |
+
Key: gug_56019_A_20150221_084856_016253, Target: gug, Predicted: tel
|
| 251 |
+
Key: gug_58717_A_20150201_022141_058962, Target: gug, Predicted: tur
|
| 252 |
+
Key: gug_56019_A_20150221_084856_036105, Target: gug, Predicted: tel
|
| 253 |
+
Key: gug_56019_A_20150221_084856_040658, Target: gug, Predicted: tel
|
| 254 |
+
Key: gug_78161_A_20150312_093559_034226, Target: gug, Predicted: khk
|
| 255 |
+
Key: gug_78161_A_20150312_093559_042472, Target: gug, Predicted: tel
|
| 256 |
+
Key: gug_78161_A_20150312_093559_047262, Target: gug, Predicted: tel
|
| 257 |
+
Key: gug_97911_A_20150304_082443_021658, Target: gug, Predicted: tel
|
| 258 |
+
Key: gug_97911_A_20150304_082443_026325, Target: gug, Predicted: khk
|
| 259 |
+
Key: gug_97911_A_20150304_082443_058612, Target: gug, Predicted: kmr
|
| 260 |
+
Key: gug_97911_A_20150304_082443_060424, Target: gug, Predicted: tel
|
| 261 |
+
Key: hat_14440_B_20130302_012105_008041, Target: hat, Predicted: ibo
|
| 262 |
+
Key: hat_14440_B_20130302_012105_047037, Target: hat, Predicted: kat
|
| 263 |
+
Key: hat_23983_B_20130503_023139_038952, Target: hat, Predicted: tpi
|
| 264 |
+
Key: hat_32832_A_20130430_060411_001029, Target: hat, Predicted: amh
|
| 265 |
+
Key: hat_49197_B_20130529_061436_045077, Target: hat, Predicted: nor
|
| 266 |
+
Key: hat_61357_B_20130602_030259_014385, Target: hat, Predicted: ibo
|
| 267 |
+
Key: hat_61357_B_20130602_030259_016440, Target: hat, Predicted: jav
|
| 268 |
+
Key: hat_61357_B_20130602_030259_019295, Target: hat, Predicted: lao
|
| 269 |
+
Key: hat_61357_B_20130602_030259_038622, Target: hat, Predicted: jav
|
| 270 |
+
Key: hat_61357_B_20130602_030259_044885, Target: hat, Predicted: lao
|
| 271 |
+
Key: hat_61357_B_20130602_030259_052961, Target: hat, Predicted: ibo
|
| 272 |
+
Key: hat_65640_B_20130429_103434_018865, Target: hat, Predicted: tpi
|
| 273 |
+
Key: hat_65640_B_20130429_103434_040586, Target: hat, Predicted: gug
|
| 274 |
+
Key: hat_71263_A_20130602_021725_030898, Target: hat, Predicted: ibo
|
| 275 |
+
Key: hat_77112_B_20130528_050544_000322, Target: hat, Predicted: swa
|
| 276 |
+
Key: hat_74226_B_20130303_125222_045352, Target: hat, Predicted: tpi
|
| 277 |
+
Key: hat_80881_A_20130220_022131_028792, Target: hat, Predicted: ibo
|
| 278 |
+
Key: hat_78360_B_20130430_101414_041610, Target: hat, Predicted: vie
|
| 279 |
+
Key: hat_80881_A_20130220_022131_034410, Target: hat, Predicted: gug
|
| 280 |
+
Key: hat_80881_A_20130220_022131_012911, Target: hat, Predicted: yue
|
| 281 |
+
Key: hat_79571_A_20130302_074959_009017, Target: hat, Predicted: amh
|
| 282 |
+
Key: hat_80881_A_20130220_022131_016364, Target: hat, Predicted: gug
|
| 283 |
+
Key: hat_81553_A_20130430_095301_044907, Target: hat, Predicted: gug
|
| 284 |
+
Key: ibo_13427_B_20140810_232413_045755, Target: ibo, Predicted: tpi
|
| 285 |
+
Key: ibo_19818_A_20140801_211130_040524, Target: ibo, Predicted: spa
|
| 286 |
+
Key: ibo_13427_A_20140810_232413_021572, Target: ibo, Predicted: hat
|
| 287 |
+
Key: ibo_33497_B_20140730_031414_000072, Target: ibo, Predicted: hat
|
| 288 |
+
Key: ibo_28419_B_20140606_201307_010615, Target: ibo, Predicted: luo
|
| 289 |
+
Key: ibo_35420_A_20140527_001314_003007, Target: ibo, Predicted: amh
|
| 290 |
+
Key: ibo_34197_A_20140520_215059_023638, Target: ibo, Predicted: tel
|
| 291 |
+
Key: ibo_35420_B_20140527_001314_032983, Target: ibo, Predicted: luo
|
| 292 |
+
Key: ibo_50726_A_20140521_235356_009208, Target: ibo, Predicted: tel
|
| 293 |
+
Key: ibo_50726_A_20140521_235356_011538, Target: ibo, Predicted: khk
|
| 294 |
+
Key: ibo_50726_A_20140521_235356_019433, Target: ibo, Predicted: luo
|
| 295 |
+
Key: ibo_50726_A_20140521_235356_022903, Target: ibo, Predicted: tel
|
| 296 |
+
Key: ibo_50726_A_20140521_235356_024051, Target: ibo, Predicted: kat
|
| 297 |
+
Key: ibo_50726_A_20140521_235356_031085, Target: ibo, Predicted: amh
|
| 298 |
+
Key: ibo_53842_A_20140905_005627_005670, Target: ibo, Predicted: swa
|
| 299 |
+
Key: ibo_50726_A_20140521_235356_032272, Target: ibo, Predicted: kaz
|
| 300 |
+
Key: ibo_50726_A_20140521_235356_033466, Target: ibo, Predicted: tam
|
| 301 |
+
Key: ibo_50726_A_20140521_235356_034611, Target: ibo, Predicted: kaz
|
| 302 |
+
Key: ibo_53842_A_20140905_005627_027190, Target: ibo, Predicted: tpi
|
| 303 |
+
Key: ibo_50726_A_20140521_235356_044616, Target: ibo, Predicted: kaz
|
| 304 |
+
Key: ibo_53842_A_20140905_005627_028328, Target: ibo, Predicted: zul
|
| 305 |
+
Key: ibo_52301_A_20140607_003158_025482, Target: ibo, Predicted: tel
|
| 306 |
+
Key: ibo_52301_A_20140607_003158_039732, Target: ibo, Predicted: lit
|
| 307 |
+
Key: ibo_50726_A_20140521_235356_053784, Target: ibo, Predicted: kat
|
| 308 |
+
Key: ibo_63334_A_20150216_005033_042571, Target: ibo, Predicted: hat
|
| 309 |
+
Key: ibo_63334_B_20150216_005033_011676, Target: ibo, Predicted: tpi
|
| 310 |
+
Key: ibo_63334_B_20150216_005033_016403, Target: ibo, Predicted: tpi
|
| 311 |
+
Key: ibo_58107_B_20140805_204322_048668, Target: ibo, Predicted: yue
|
| 312 |
+
Key: ibo_63334_B_20150216_005033_027618, Target: ibo, Predicted: tpi
|
| 313 |
+
Key: ibo_63334_B_20150216_005033_042556, Target: ibo, Predicted: tur
|
| 314 |
+
Key: ibo_60508_A_20140521_055301_003833, Target: ibo, Predicted: kat
|
| 315 |
+
Key: ibo_77112_B_20140609_224704_017697, Target: ibo, Predicted: swa
|
| 316 |
+
Key: ibo_77803_A_20140517_202422_000000, Target: ibo, Predicted: amh
|
| 317 |
+
Key: ibo_77803_A_20140517_202422_004727, Target: ibo, Predicted: luo
|
| 318 |
+
Key: ibo_66959_B_20141031_215547_046888, Target: ibo, Predicted: tpi
|
| 319 |
+
Key: ibo_79723_A_20150331_184104_029068, Target: ibo, Predicted: tpi
|
| 320 |
+
Key: ibo_79723_A_20150331_184104_039764, Target: ibo, Predicted: zul
|
| 321 |
+
Key: ibo_87280_A_20141026_002639_013843, Target: ibo, Predicted: lao
|
| 322 |
+
Key: ibo_87313_B_20140802_002411_026424, Target: ibo, Predicted: tpi
|
| 323 |
+
Key: ibo_94212_B_20140525_012758_040617, Target: ibo, Predicted: tgl
|
| 324 |
+
Key: jav_10184_A_20141119_194233_051384, Target: jav, Predicted: lao
|
| 325 |
+
Key: jav_10184_A_20141119_194233_017863, Target: jav, Predicted: ceb
|
| 326 |
+
Key: jav_10184_A_20141119_194233_059426, Target: jav, Predicted: lao
|
| 327 |
+
Key: jav_10184_A_20141119_194233_064088, Target: jav, Predicted: lao
|
| 328 |
+
Key: jav_15535_B_20150104_232347_044037, Target: jav, Predicted: lao
|
| 329 |
+
Key: jav_10184_A_20141119_194233_025595, Target: jav, Predicted: ceb
|
| 330 |
+
Key: jav_10184_A_20141119_194233_029118, Target: jav, Predicted: ceb
|
| 331 |
+
Key: jav_20133_B_20140911_170812_017218, Target: jav, Predicted: mlt
|
| 332 |
+
Key: jav_21581_A_20141107_151147_007012, Target: jav, Predicted: ceb
|
| 333 |
+
Key: jav_21393_B_20150304_163256_011005, Target: jav, Predicted: lao
|
| 334 |
+
Key: jav_23046_A_20141103_212247_000903, Target: jav, Predicted: msa
|
| 335 |
+
Key: jav_23505_A_20141029_003347_043611, Target: jav, Predicted: lao
|
| 336 |
+
Key: jav_23046_A_20141103_212247_032712, Target: jav, Predicted: vie
|
| 337 |
+
Key: jav_23046_A_20141103_212247_037678, Target: jav, Predicted: asm
|
| 338 |
+
Key: jav_23505_B_20141029_003347_024606, Target: jav, Predicted: luo
|
| 339 |
+
Key: jav_21807_A_20141125_194924_048994, Target: jav, Predicted: tur
|
| 340 |
+
Key: jav_27590_A_20141227_191710_047520, Target: jav, Predicted: tgl
|
| 341 |
+
Key: jav_27590_A_20141227_191710_050692, Target: jav, Predicted: cym
|
| 342 |
+
Key: jav_27590_A_20141227_191710_055289, Target: jav, Predicted: hat
|
| 343 |
+
Key: jav_36293_A_20141001_145552_001194, Target: jav, Predicted: ceb
|
| 344 |
+
Key: jav_36293_A_20141001_145552_002391, Target: jav, Predicted: ceb
|
| 345 |
+
Key: jav_36293_A_20141001_145552_003577, Target: jav, Predicted: ceb
|
| 346 |
+
Key: jav_36293_A_20141001_145552_004772, Target: jav, Predicted: ceb
|
| 347 |
+
Key: jav_36293_A_20141001_145552_005969, Target: jav, Predicted: ceb
|
| 348 |
+
Key: jav_27590_B_20141227_191710_048052, Target: jav, Predicted: tgl
|
| 349 |
+
Key: jav_36293_A_20141001_145552_012962, Target: jav, Predicted: ceb
|
| 350 |
+
Key: jav_36293_A_20141001_145552_016469, Target: jav, Predicted: lao
|
| 351 |
+
Key: jav_36293_A_20141001_145552_017668, Target: jav, Predicted: ceb
|
| 352 |
+
Key: jav_36293_A_20141001_145552_020039, Target: jav, Predicted: ceb
|
| 353 |
+
Key: jav_36505_A_20150106_201700_045871, Target: jav, Predicted: swa
|
| 354 |
+
Key: jav_36293_A_20141001_145552_022419, Target: jav, Predicted: tgl
|
| 355 |
+
Key: jav_41598_B_20150201_142509_000238, Target: jav, Predicted: tgl
|
| 356 |
+
Key: jav_36293_A_20141001_145552_023618, Target: jav, Predicted: ceb
|
| 357 |
+
Key: jav_36505_A_20150106_201700_053028, Target: jav, Predicted: lao
|
| 358 |
+
Key: jav_36293_B_20141001_145552_013357, Target: jav, Predicted: lao
|
| 359 |
+
Key: jav_36894_A_20140919_222930_000092, Target: jav, Predicted: ceb
|
| 360 |
+
Key: jav_36293_A_20141001_145552_028259, Target: jav, Predicted: ceb
|
| 361 |
+
Key: jav_36293_A_20141001_145552_030651, Target: jav, Predicted: ceb
|
| 362 |
+
Key: jav_41745_B_20141108_162338_035175, Target: jav, Predicted: sun
|
| 363 |
+
Key: jav_41745_B_20141108_162338_053557, Target: jav, Predicted: sun
|
| 364 |
+
Key: jav_41745_B_20141108_162338_055340, Target: jav, Predicted: ind
|
| 365 |
+
Key: jav_36293_A_20141001_145552_038873, Target: jav, Predicted: ceb
|
| 366 |
+
Key: jav_36505_A_20150106_201700_014759, Target: jav, Predicted: luo
|
| 367 |
+
Key: jav_36293_A_20141001_145552_043584, Target: jav, Predicted: ceb
|
| 368 |
+
Key: jav_36293_A_20141001_145552_045967, Target: jav, Predicted: ceb
|
| 369 |
+
Key: jav_36505_A_20150106_201700_033633, Target: jav, Predicted: ceb
|
| 370 |
+
Key: jav_49118_B_20150201_023112_044097, Target: jav, Predicted: tgl
|
| 371 |
+
Key: jav_52490_A_20140916_192446_040486, Target: jav, Predicted: gug
|
| 372 |
+
Key: jav_49437_B_20150112_204645_005926, Target: jav, Predicted: cym
|
| 373 |
+
Key: jav_52490_A_20140916_192446_052161, Target: jav, Predicted: kat
|
| 374 |
+
Key: jav_49437_B_20150112_204645_038219, Target: jav, Predicted: hat
|
| 375 |
+
Key: jav_52717_A_20140923_130849_023513, Target: jav, Predicted: khk
|
| 376 |
+
Key: jav_56306_A_20150103_203751_000250, Target: jav, Predicted: lao
|
| 377 |
+
Key: jav_52717_B_20140923_130849_020418, Target: jav, Predicted: lao
|
| 378 |
+
Key: jav_52717_B_20140923_130849_028193, Target: jav, Predicted: lao
|
| 379 |
+
Key: jav_65882_B_20141102_005627_039399, Target: jav, Predicted: tpi
|
| 380 |
+
Key: jav_65882_B_20141102_005627_041556, Target: jav, Predicted: tpi
|
| 381 |
+
Key: jav_64494_A_20141012_193548_027781, Target: jav, Predicted: asm
|
| 382 |
+
Key: jav_70386_B_20141116_170547_042186, Target: jav, Predicted: pus
|
| 383 |
+
Key: jav_73837_A_20141101_183259_039061, Target: jav, Predicted: khk
|
| 384 |
+
Key: jav_68289_B_20150216_010725_004241, Target: jav, Predicted: swa
|
| 385 |
+
Key: jav_68289_B_20150216_010725_012391, Target: jav, Predicted: gug
|
| 386 |
+
Key: jav_68289_B_20150216_010725_030183, Target: jav, Predicted: tgl
|
| 387 |
+
Key: jav_68289_B_20150216_010725_034736, Target: jav, Predicted: tgl
|
| 388 |
+
Key: jav_73511_A_20141226_133330_013131, Target: jav, Predicted: lao
|
| 389 |
+
Key: jav_70343_B_20150212_004248_014681, Target: jav, Predicted: khk
|
| 390 |
+
Key: jav_70386_B_20141116_170547_013684, Target: jav, Predicted: hat
|
| 391 |
+
Key: jav_78454_A_20141128_203259_000000, Target: jav, Predicted: amh
|
| 392 |
+
Key: jav_68068_B_20150119_135822_043484, Target: jav, Predicted: asm
|
| 393 |
+
Key: jav_70386_B_20141116_170547_028312, Target: jav, Predicted: asm
|
| 394 |
+
Key: jav_73837_A_20141101_183259_026069, Target: jav, Predicted: tgl
|
| 395 |
+
Key: jav_70386_B_20141116_170547_039873, Target: jav, Predicted: nep
|
| 396 |
+
Key: jav_68182_A_20150111_002528_041112, Target: jav, Predicted: vie
|
| 397 |
+
Key: jav_73837_A_20141101_183259_032808, Target: jav, Predicted: luo
|
| 398 |
+
Key: jav_86467_B_20140920_125939_040288, Target: jav, Predicted: kaz
|
| 399 |
+
Key: jav_88445_B_20141205_204305_027285, Target: jav, Predicted: tgl
|
| 400 |
+
Key: jav_82935_A_20150104_005835_023512, Target: jav, Predicted: asm
|
| 401 |
+
Key: jav_87921_B_20141225_203350_058462, Target: jav, Predicted: ces
|
| 402 |
+
Key: jav_89457_B_20141117_212710_047919, Target: jav, Predicted: tgl
|
| 403 |
+
Key: jav_89457_B_20141117_212710_051520, Target: jav, Predicted: lao
|
| 404 |
+
Key: jav_78604_A_20141031_181612_041553, Target: jav, Predicted: ceb
|
| 405 |
+
Key: jav_92176_A_20141222_021733_023517, Target: jav, Predicted: ceb
|
| 406 |
+
Key: jav_82935_B_20150104_005835_046292, Target: jav, Predicted: ceb
|
| 407 |
+
Key: jav_92176_A_20141222_021733_038411, Target: jav, Predicted: vie
|
| 408 |
+
Key: jav_92176_B_20141222_021733_005786, Target: jav, Predicted: tgl
|
| 409 |
+
Key: jav_92176_B_20141222_021733_007770, Target: jav, Predicted: tel
|
| 410 |
+
Key: jav_92176_B_20141222_021733_012229, Target: jav, Predicted: luo
|
| 411 |
+
Key: jav_92176_B_20141222_021733_035480, Target: jav, Predicted: swa
|
| 412 |
+
Key: jav_92176_B_20141222_021733_043027, Target: jav, Predicted: asm
Key: jav_92176_B_20141222_021733_046066, Target: jav, Predicted: sun
Key: jav_93632_B_20150119_150118_019742, Target: jav, Predicted: msa
Key: jav_93632_B_20150119_150118_049835, Target: jav, Predicted: tel
Key: jav_93632_B_20150119_150118_052713, Target: jav, Predicted: tgl
Key: kat_10184_A_20141107_212406_000114, Target: kat, Predicted: lit
Key: kat_17165_A_20141117_063008_033016, Target: kat, Predicted: tur
Key: kat_10184_A_20141107_212406_043262, Target: kat, Predicted: jav
Key: kat_16184_A_20141020_233508_031838, Target: kat, Predicted: hat
Key: kat_17472_A_20141201_023731_021216, Target: kat, Predicted: ceb
Key: kat_17472_A_20141201_023731_026158, Target: kat, Predicted: tur
Key: kat_17472_A_20141201_023731_033322, Target: kat, Predicted: sin
Key: kat_17472_A_20141201_023731_036303, Target: kat, Predicted: yue
Key: kat_17472_A_20141201_023731_038320, Target: kat, Predicted: tam
Key: kat_17472_A_20141201_023731_040410, Target: kat, Predicted: luo
Key: kat_18380_A_20141118_001754_037874, Target: kat, Predicted: tel
Key: kat_18380_A_20141118_001754_050009, Target: kat, Predicted: kaz
Key: kat_23239_A_20141127_054155_000001, Target: kat, Predicted: khk
Key: kat_35467_A_20141020_054030_002174, Target: kat, Predicted: ibo
Key: kat_38431_B_20141130_190122_043698, Target: kat, Predicted: khk
Key: kat_38431_B_20141130_190122_053025, Target: kat, Predicted: kaz
Key: kat_41592_A_20141117_033328_012603, Target: kat, Predicted: tpi
Key: kat_41592_A_20141117_033328_017235, Target: kat, Predicted: hat
Key: kat_41592_A_20141117_033328_024808, Target: kat, Predicted: hat
Key: kat_41592_A_20141117_033328_028265, Target: kat, Predicted: hat
Key: kat_41592_A_20141117_033328_030608, Target: kat, Predicted: vie
Key: kat_42600_A_20141029_174857_000524, Target: kat, Predicted: gug
Key: kat_41592_A_20141117_033328_037242, Target: kat, Predicted: swa
Key: kat_41592_B_20141117_033328_045799, Target: kat, Predicted: amh
Key: kat_41592_A_20141117_033328_041983, Target: kat, Predicted: pus
Key: kat_44619_A_20141028_234639_041015, Target: kat, Predicted: khk
Key: kat_44619_A_20141028_234639_055151, Target: kat, Predicted: khk
Key: kat_44619_A_20141028_234639_019716, Target: kat, Predicted: kaz
Key: kat_44619_A_20141028_234639_024432, Target: kat, Predicted: kmr
Key: kat_44619_A_20141028_234639_031498, Target: kat, Predicted: bre
Key: kat_47959_B_20141026_214447_024462, Target: kat, Predicted: ben
Key: kat_51955_A_20141024_012212_000000, Target: kat, Predicted: amh
Key: kat_56826_B_20141201_042429_027677, Target: kat, Predicted: kmr
Key: kat_61190_A_20141029_013447_018598, Target: kat, Predicted: kaz
Key: kat_61190_A_20141029_013447_034683, Target: kat, Predicted: khk
Key: kat_73757_A_20141117_025704_005504, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_008069, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_010415, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_019493, Target: kat, Predicted: kaz
Key: kat_73757_A_20141117_025704_020655, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_021743, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_028261, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_029368, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_030527, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_031627, Target: kat, Predicted: lit
Key: kat_74121_A_20141120_020705_056254, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_036336, Target: kat, Predicted: khk
Key: kat_73757_A_20141117_025704_042111, Target: kat, Predicted: kaz
Key: kat_73757_A_20141117_025704_050281, Target: kat, Predicted: tur
Key: kat_73757_A_20141117_025704_053596, Target: kat, Predicted: zul
Key: kat_81424_B_20141123_000421_002417, Target: kat, Predicted: ibo
Key: kat_80781_A_20141104_212234_029772, Target: kat, Predicted: khk
Key: kat_87298_A_20141025_213601_000000, Target: kat, Predicted: kaz
Key: kat_87313_A_20141119_014632_003672, Target: kat, Predicted: kaz
Key: kat_87313_A_20141119_014632_008923, Target: kat, Predicted: ceb
Key: kat_87298_A_20141025_213601_050430, Target: kat, Predicted: khk
Key: kat_87298_A_20141025_213601_053980, Target: kat, Predicted: tam
Key: kat_87313_A_20141119_014632_061503, Target: kat, Predicted: khk
Key: kat_87298_A_20141025_213601_055170, Target: kat, Predicted: kaz
Key: kat_87298_A_20141025_213601_056354, Target: kat, Predicted: jav
Key: kat_87313_B_20141119_014632_047555, Target: kat, Predicted: gug
Key: kat_87298_A_20141025_213601_034714, Target: kat, Predicted: khk
Key: kat_87298_A_20141025_213601_035904, Target: kat, Predicted: khk
Key: kat_88776_A_20141006_193621_056655, Target: kat, Predicted: zul
Key: kaz_20768_A_20140203_190423_002434, Target: kaz, Predicted: tur
Key: kaz_20768_A_20140203_190423_019244, Target: kaz, Predicted: tur
Key: kaz_17573_A_20140312_030325_027741, Target: kaz, Predicted: tpi
Key: kaz_20768_A_20140203_190423_026665, Target: kaz, Predicted: amh
Key: kaz_20768_A_20140203_190423_033715, Target: kaz, Predicted: tur
Key: kaz_20682_A_20140114_221052_048257, Target: kaz, Predicted: gug
Key: kaz_20768_A_20140203_190423_034895, Target: kaz, Predicted: tur
Key: kaz_20768_A_20140203_190423_035970, Target: kaz, Predicted: tur
Key: kaz_17914_A_20140126_234956_004076, Target: kaz, Predicted: tam
Key: kaz_20768_A_20140203_185125_012980, Target: kaz, Predicted: tur
Key: kaz_36669_B_20131206_164229_046083, Target: kaz, Predicted: tur
Key: kaz_33175_B_20131105_201906_003032, Target: kaz, Predicted: kmr
Key: kaz_44868_B_20131217_205716_001796, Target: kaz, Predicted: ibo
Key: kaz_44868_B_20131217_205716_004022, Target: kaz, Predicted: kmr
Key: kaz_23355_B_20140317_191841_029508, Target: kaz, Predicted: khk
Key: kaz_41174_B_20131212_200450_053004, Target: kaz, Predicted: khk
Key: kaz_23355_B_20140317_191841_053397, Target: kaz, Predicted: khk
Key: kaz_24589_A_20131129_215929_000010, Target: kaz, Predicted: lit
Key: kaz_70110_A_20131109_190313_007919, Target: kaz, Predicted: tur
Key: kaz_47156_A_20140313_011009_046718, Target: kaz, Predicted: vie
Key: kaz_72654_A_20131207_162604_000403, Target: kaz, Predicted: kmr
Key: kaz_50726_A_20131118_025621_023121, Target: kaz, Predicted: tur
Key: kaz_72654_A_20131207_162604_044645, Target: kaz, Predicted: kmr
Key: kaz_72654_B_20131207_162604_031261, Target: kaz, Predicted: asm
Key: kaz_77730_B_20131114_230511_028376, Target: kaz, Predicted: amh
Key: kaz_77730_B_20131114_230511_029459, Target: kaz, Predicted: tpi
Key: kaz_93320_B_20140218_173001_042863, Target: kaz, Predicted: tur
Key: kaz_96842_A_20140131_154710_036513, Target: kaz, Predicted: khk
Key: kaz_96842_A_20140131_154710_048679, Target: kaz, Predicted: tpi
Key: khk_12916_B_20140930_182205_051257, Target: khk, Predicted: hat
Key: khk_12916_B_20140930_182205_052432, Target: khk, Predicted: tam
Key: kaz_96842_B_20140131_154710_013611, Target: kaz, Predicted: khk
Key: kaz_96842_A_20140131_154710_015068, Target: kaz, Predicted: tur
Key: kaz_96842_B_20140131_154710_028483, Target: kaz, Predicted: khk
Key: kaz_96842_A_20140131_154710_030373, Target: kaz, Predicted: tur
Key: khk_15163_A_20141020_201846_022885, Target: khk, Predicted: spa
Key: khk_15324_A_20141031_194259_031379, Target: khk, Predicted: tam
Key: khk_15324_A_20141031_194259_037917, Target: khk, Predicted: kat
Key: khk_29208_B_20141018_152040_004635, Target: khk, Predicted: kaz
Key: khk_29208_B_20141018_152040_010368, Target: khk, Predicted: amh
Key: khk_29208_B_20141018_152040_013675, Target: khk, Predicted: zul
Key: khk_29208_B_20141018_152040_019952, Target: khk, Predicted: som
Key: khk_29208_B_20141018_152040_021125, Target: khk, Predicted: swa
Key: khk_29208_B_20141018_152040_024507, Target: khk, Predicted: ibo
Key: khk_29208_B_20141018_152040_026705, Target: khk, Predicted: hat
Key: khk_32861_B_20141112_183418_031908, Target: khk, Predicted: jav
Key: khk_32914_B_20141101_192546_000024, Target: khk, Predicted: tam
Key: khk_32914_B_20141101_192546_001223, Target: khk, Predicted: kaz
Key: khk_29208_B_20141018_152040_036119, Target: khk, Predicted: pus
Key: khk_32301_A_20140927_150237_007302, Target: khk, Predicted: ibo
Key: khk_32301_A_20140927_150237_036562, Target: khk, Predicted: nno
Key: khk_29208_B_20141018_152040_056155, Target: khk, Predicted: ibo
Key: khk_32914_B_20141101_192546_054968, Target: khk, Predicted: swa
Key: khk_41741_A_20141002_230232_018106, Target: khk, Predicted: hat
Key: khk_42243_B_20140924_154551_016645, Target: khk, Predicted: hat
Key: khk_42243_B_20140924_154551_019014, Target: khk, Predicted: hat
Key: khk_42243_B_20140924_154551_023792, Target: khk, Predicted: kmr
Key: khk_38554_A_20140917_124843_000359, Target: khk, Predicted: hat
Key: khk_42243_B_20140924_154551_028728, Target: khk, Predicted: luo
Key: khk_42243_B_20140924_154551_031039, Target: khk, Predicted: kaz
Key: khk_42243_B_20140924_154551_032234, Target: khk, Predicted: hat
Key: khk_42243_B_20140924_154551_033396, Target: khk, Predicted: hat
Key: khk_42243_B_20140924_154551_038963, Target: khk, Predicted: kaz
Key: khk_42243_B_20140924_154551_041321, Target: khk, Predicted: hat
Key: khk_42243_B_20140924_154551_043694, Target: khk, Predicted: ibo
Key: khk_42243_B_20140924_154551_044893, Target: khk, Predicted: swa
Key: khk_42243_B_20140924_154551_046082, Target: khk, Predicted: kat
Key: khk_41741_A_20141002_230232_000647, Target: khk, Predicted: lao
Key: khk_42243_B_20140924_154551_048430, Target: khk, Predicted: kat
Key: khk_42243_B_20140924_154551_008138, Target: khk, Predicted: amh
Key: khk_42243_B_20140924_154551_053068, Target: khk, Predicted: swa
Key: khk_42243_B_20140924_154551_055972, Target: khk, Predicted: kmr
Key: khk_43789_A_20141020_153059_005612, Target: khk, Predicted: amh
Key: khk_43789_A_20141020_153059_058315, Target: khk, Predicted: luo
Key: khk_44347_B_20141103_201828_003178, Target: khk, Predicted: tpi
Key: khk_61678_A_20140919_183209_007194, Target: khk, Predicted: kaz
Key: khk_48200_B_20141104_174608_001562, Target: khk, Predicted: tam
Key: khk_48200_B_20141104_174608_011814, Target: khk, Predicted: kaz
Key: khk_61678_A_20140919_183209_015134, Target: khk, Predicted: yue
Key: khk_56090_A_20140917_155639_034508, Target: khk, Predicted: yue
Key: khk_61678_A_20140919_183209_047196, Target: khk, Predicted: asm
Key: khk_61678_A_20140919_183209_053397, Target: khk, Predicted: kaz
Key: khk_61011_A_20140919_134829_037385, Target: khk, Predicted: yue
Key: khk_61678_A_20140919_183209_054582, Target: khk, Predicted: tpi
Key: khk_61678_A_20140919_183209_057627, Target: khk, Predicted: kaz
Key: khk_78544_A_20140924_155131_014754, Target: khk, Predicted: kaz
Key: kmr_14229_B_20130325_212616_027274, Target: kmr, Predicted: tur
Key: khk_87884_B_20141014_190149_034467, Target: khk, Predicted: luo
Key: kmr_16787_B_20130323_072114_020661, Target: kmr, Predicted: tur
Key: kmr_16787_B_20130323_072114_053722, Target: kmr, Predicted: tur
Key: kmr_15638_B_20130331_200208_030577, Target: kmr, Predicted: amh
Key: kmr_16056_A_20130323_010902_056031, Target: kmr, Predicted: tur
Key: kmr_22288_A_20131228_021559_008179, Target: kmr, Predicted: ceb
Key: kmr_22288_A_20131228_021559_014122, Target: kmr, Predicted: tur
Key: kmr_26206_A_20130507_004626_009278, Target: kmr, Predicted: gug
Key: kmr_20454_B_20140125_002855_001925, Target: kmr, Predicted: kaz
Key: kmr_20454_B_20140125_002855_003117, Target: kmr, Predicted: pus
Key: kmr_22288_A_20131228_021559_000000, Target: kmr, Predicted: ceb
Key: kmr_26999_A_20130414_220838_035286, Target: kmr, Predicted: urd
Key: kmr_22288_A_20131228_021559_004683, Target: kmr, Predicted: ceb
Key: kmr_26999_A_20130414_220838_053277, Target: kmr, Predicted: pus
Key: kmr_34336_A_20130325_005404_056036, Target: kmr, Predicted: tur
Key: kmr_29039_A_20130401_012825_032029, Target: kmr, Predicted: amh
Key: kmr_35069_A_20130407_023338_000875, Target: kmr, Predicted: tur
Key: kmr_31919_A_20130413_172911_000011, Target: kmr, Predicted: tur
Key: kmr_29135_B_20130303_025305_050023, Target: kmr, Predicted: tur
Key: kmr_29039_A_20130401_012825_004886, Target: kmr, Predicted: pus
Key: kmr_46535_A_20140108_201338_006589, Target: kmr, Predicted: lit
Key: kmr_35788_A_20131231_021724_026943, Target: kmr, Predicted: kaz
Key: kmr_35788_A_20131231_021724_028134, Target: kmr, Predicted: kaz
Key: kmr_46535_A_20140108_201338_011066, Target: kmr, Predicted: khk
Key: kmr_35788_A_20131231_021724_029322, Target: kmr, Predicted: kaz
Key: kmr_35788_A_20131231_021724_037053, Target: kmr, Predicted: lit
Key: kmr_35788_A_20131231_021724_057350, Target: kmr, Predicted: khk
Key: kmr_46535_A_20140108_201338_038179, Target: kmr, Predicted: kaz
Key: kmr_35788_A_20131231_021724_064099, Target: kmr, Predicted: kaz
Key: kmr_60830_A_20131223_005744_047132, Target: kmr, Predicted: khk
Key: kmr_60830_A_20131223_005744_017741, Target: kmr, Predicted: kat
Key: kmr_54735_A_20131228_012336_006164, Target: kmr, Predicted: khk
Key: kmr_54735_A_20131228_012336_067811, Target: kmr, Predicted: lit
Key: kmr_79139_A_20130621_004019_041961, Target: kmr, Predicted: gug
Key: kmr_72903_A_20131225_002056_059562, Target: kmr, Predicted: tur
Key: kmr_77225_A_20140106_235541_046086, Target: kmr, Predicted: khk
Key: kmr_77225_A_20140106_235541_047285, Target: kmr, Predicted: khk
Key: kmr_77225_A_20140106_235541_002601, Target: kmr, Predicted: khk
Key: kmr_77225_A_20140106_235541_052011, Target: kmr, Predicted: tpi
Key: kmr_77225_A_20140106_235541_054379, Target: kmr, Predicted: lit
Key: kmr_77225_A_20140106_235541_013543, Target: kmr, Predicted: khk
Key: kmr_86830_B_20130413_225657_001588, Target: kmr, Predicted: tgl
Key: kmr_77225_B_20140106_235541_031416, Target: kmr, Predicted: tpi
Key: kmr_77225_A_20140106_235541_027581, Target: kmr, Predicted: khk
Key: kmr_78360_A_20140123_011434_006644, Target: kmr, Predicted: tur
Key: lao_15042_A_20130727_173946_049342, Target: lao, Predicted: asm
Key: lao_14158_A_20130409_182411_002652, Target: lao, Predicted: yue
Key: lao_15042_A_20130727_173946_056098, Target: lao, Predicted: kaz
Key: lao_15042_A_20130727_173946_062068, Target: lao, Predicted: swe
Key: lao_14228_B_20130405_163836_016251, Target: lao, Predicted: vie
Key: lao_22466_B_20130218_191925_033283, Target: lao, Predicted: kaz
Key: lao_23681_A_20130730_162132_027917, Target: lao, Predicted: ceb
Key: lao_23681_A_20130730_162132_034911, Target: lao, Predicted: ceb
Key: lao_23995_A_20130731_195202_051372, Target: lao, Predicted: jav
Key: lao_23681_B_20130730_162132_043721, Target: lao, Predicted: tam
Key: lao_25012_A_20130814_141020_041372, Target: lao, Predicted: jav
Key: lao_23995_A_20130731_195202_055484, Target: lao, Predicted: jav
Key: lao_23995_A_20130731_195202_058875, Target: lao, Predicted: kaz
Key: lao_23995_B_20130731_195202_000006, Target: lao, Predicted: tgl
Key: lao_29765_B_20130426_185032_006590, Target: lao, Predicted: hat
Key: lao_23995_A_20130731_195202_041755, Target: lao, Predicted: kaz
Key: lao_41920_B_20130310_185621_038917, Target: lao, Predicted: luo
Key: lao_29765_B_20130426_185032_046533, Target: lao, Predicted: tam
Key: lao_41400_A_20130728_194416_033862, Target: lao, Predicted: jav
Key: lao_52025_A_20130306_143713_025120, Target: lao, Predicted: tur
Key: lao_60836_A_20130314_211014_025046, Target: lao, Predicted: gug
Key: lao_72733_A_20130731_235502_038441, Target: lao, Predicted: ceb
Key: lao_79190_A_20130714_135011_021885, Target: lao, Predicted: asm
Key: lao_84370_B_20130506_190748_025300, Target: lao, Predicted: luo
Key: lit_21581_A_20131216_220706_014319, Target: lit, Predicted: tpi
Key: lit_21581_A_20131216_220706_018756, Target: lit, Predicted: pus
Key: lit_21581_A_20131216_220706_019954, Target: lit, Predicted: lao
Key: lit_21581_A_20131216_220706_040302, Target: lit, Predicted: ibo
Key: lit_37064_A_20131129_035959_013167, Target: lit, Predicted: spa
Key: lit_46702_A_20131115_213311_054246, Target: lit, Predicted: tam
Key: lit_70110_B_20131118_222225_012085, Target: lit, Predicted: kat
Key: lit_76837_A_20131020_200525_061435, Target: lit, Predicted: kaz
Key: lit_70110_B_20131118_222225_028655, Target: lit, Predicted: kaz
Key: lit_70110_B_20131118_222225_031106, Target: lit, Predicted: luo
Key: lit_86878_B_20131129_043842_052347, Target: lit, Predicted: kat
Key: lit_86878_B_20131129_043842_056777, Target: lit, Predicted: luo
Key: lit_96934_A_20131207_231603_029039, Target: lit, Predicted: kmr
Key: luo_12220_A_20141026_204025_053435, Target: luo, Predicted: hat
Key: luo_14440_B_20141129_004855_047075, Target: luo, Predicted: swa
Key: luo_43388_A_20141028_212938_020510, Target: luo, Predicted: swa
Key: luo_25012_A_20150201_000040_003577, Target: luo, Predicted: amh
Key: luo_56090_B_20141001_220534_001800, Target: luo, Predicted: swa
Key: luo_56090_B_20141001_220534_035748, Target: luo, Predicted: tam
Key: luo_47882_B_20150131_215134_013596, Target: luo, Predicted: ibo
Key: luo_50726_B_20141015_222945_042179, Target: luo, Predicted: swa
Key: luo_45560_B_20141012_204242_000000, Target: luo, Predicted: lao
Key: luo_45697_A_20150211_181356_003721, Target: luo, Predicted: tam
Key: luo_61225_B_20141014_225524_022997, Target: luo, Predicted: hat
Key: luo_66026_A_20141207_212517_024451, Target: luo, Predicted: hat
Key: luo_66026_A_20141207_212517_026866, Target: luo, Predicted: hat
Key: luo_61225_B_20141014_225524_003458, Target: luo, Predicted: hat
Key: luo_61225_B_20141014_225524_004632, Target: luo, Predicted: swa
Key: luo_66026_A_20141207_212517_002144, Target: luo, Predicted: swa
Key: luo_72349_A_20150313_194307_008763, Target: luo, Predicted: swa
Key: luo_79820_A_20141005_212016_000020, Target: luo, Predicted: lao
Key: luo_72349_A_20150313_194307_011149, Target: luo, Predicted: lao
Key: luo_72349_A_20150313_194307_015010, Target: luo, Predicted: swa
Key: luo_79820_A_20141005_212016_025297, Target: luo, Predicted: ibo
Key: luo_72349_A_20150313_194307_043718, Target: luo, Predicted: gug
Key: luo_97264_A_20141220_220653_028177, Target: luo, Predicted: hat
Key: luo_97264_B_20141220_220653_009736, Target: luo, Predicted: swa
Key: luo_97264_B_20141220_220653_027161, Target: luo, Predicted: pus
Key: luo_99813_A_20141106_211637_001355, Target: luo, Predicted: hat
Key: pus_28102_B_20120326_171523_031756, Target: pus, Predicted: ibo
Key: pus_28102_A_20120326_171523_016544, Target: pus, Predicted: asm
Key: pus_29368_A_20120321_233801_022408, Target: pus, Predicted: kaz
Key: pus_29368_A_20120321_235133_019809, Target: pus, Predicted: amh
Key: pus_29368_A_20120321_235133_025809, Target: pus, Predicted: tpi
Key: pus_56226_B_20120205_235429_051101, Target: pus, Predicted: kmr
Key: pus_61592_B_20120126_181735_055683, Target: pus, Predicted: tam
Key: pus_82160_B_20120126_022907_039657, Target: pus, Predicted: lit
Key: pus_76812_B_20120320_180439_024771, Target: pus, Predicted: kmr
Key: pus_86680_B_20120309_181746_007085, Target: pus, Predicted: tur
Key: pus_89308_B_20120131_214111_006761, Target: pus, Predicted: kaz
Key: pus_89308_B_20120131_214111_013816, Target: pus, Predicted: khk
Key: swa_17115_A_20140218_210921_045736, Target: swa, Predicted: khk
Key: swa_17115_A_20140218_210921_046909, Target: swa, Predicted: tam
Key: swa_17115_A_20140218_210921_053530, Target: swa, Predicted: khk
Key: swa_17115_A_20140218_210921_055633, Target: swa, Predicted: lit
Key: swa_17115_A_20140218_210921_056773, Target: swa, Predicted: khk
Key: swa_17115_A_20140218_210921_057897, Target: swa, Predicted: tam
Key: swa_17115_A_20140218_210921_059104, Target: swa, Predicted: khk
Key: swa_16249_B_20131202_232723_000000, Target: swa, Predicted: amh
Key: swa_14814_A_20140205_210842_036227, Target: swa, Predicted: luo
Key: swa_24290_B_20140219_000423_029635, Target: swa, Predicted: ibo
Key: swa_17115_A_20140218_210921_008274, Target: swa, Predicted: yor
Key: swa_24290_B_20140219_000423_038218, Target: swa, Predicted: ibo
Key: swa_24290_B_20140219_000423_043494, Target: swa, Predicted: ibo
Key: swa_24239_A_20140206_191516_047532, Target: swa, Predicted: kmr
Key: swa_15420_A_20140210_010333_056109, Target: swa, Predicted: tam
Key: swa_24290_B_20140219_000423_046817, Target: swa, Predicted: luo
Key: swa_24290_B_20140219_000423_054951, Target: swa, Predicted: hat
Key: swa_34197_B_20121228_201800_025473, Target: swa, Predicted: amh
Key: swa_38588_A_20130228_211322_002708, Target: swa, Predicted: kmr
Key: swa_39893_B_20140115_023429_035762, Target: swa, Predicted: luo
Key: swa_45459_A_20131012_022245_022507, Target: swa, Predicted: amh
Key: swa_45459_A_20131012_022245_042952, Target: swa, Predicted: gug
Key: swa_45459_B_20131012_022245_051341, Target: swa, Predicted: hat
Key: swa_63084_B_20130801_015957_000093, Target: swa, Predicted: luo
Key: swa_63084_B_20130801_015957_020419, Target: swa, Predicted: luo
Key: swa_63084_B_20130801_015957_034096, Target: swa, Predicted: luo
Key: swa_59549_B_20131003_203701_010964, Target: swa, Predicted: amh
Key: swa_59549_B_20131003_203701_021584, Target: swa, Predicted: gug
Key: swa_63084_A_20130801_014407_000990, Target: swa, Predicted: luo
Key: swa_63084_A_20130801_014407_002124, Target: swa, Predicted: jav
Key: swa_63084_A_20130801_014407_003284, Target: swa, Predicted: luo
Key: swa_55042_B_20131217_033729_038274, Target: swa, Predicted: gug
Key: swa_55106_A_20131215_030617_020580, Target: swa, Predicted: luo
Key: swa_63084_A_20130801_015957_043913, Target: swa, Predicted: khk
Key: swa_55106_A_20131215_030617_036846, Target: swa, Predicted: luo
Key: swa_73819_B_20130911_163458_041264, Target: swa, Predicted: luo
Key: swa_73819_B_20130911_163458_042943, Target: swa, Predicted: hat
Key: swa_73819_B_20130927_003321_003623, Target: swa, Predicted: ibo
Key: swa_73301_A_20140226_185528_044387, Target: swa, Predicted: kaz
Key: swa_73301_A_20140226_185528_046773, Target: swa, Predicted: kaz
Key: swa_72040_B_20131002_213605_049382, Target: swa, Predicted: hat
Key: swa_73301_A_20140226_185528_048995, Target: swa, Predicted: lao
Key: swa_66822_B_20130219_222318_006840, Target: swa, Predicted: zul
Key: swa_73301_A_20140226_185528_050193, Target: swa, Predicted: tam
Key: swa_73301_A_20140226_185528_054191, Target: swa, Predicted: tam
Key: swa_73301_A_20140226_185528_058750, Target: swa, Predicted: kaz
Key: swa_73301_B_20140226_185528_034627, Target: swa, Predicted: tpi
Key: swa_76756_A_20130417_210400_018795, Target: swa, Predicted: kaz
Key: swa_77990_A_20131007_063102_055659, Target: swa, Predicted: ibo
Key: swa_90080_A_20140319_222809_027316, Target: swa, Predicted: tpi
Key: swa_88661_A_20130801_192922_004595, Target: swa, Predicted: luo
Key: swa_90080_A_20140319_222809_043606, Target: swa, Predicted: ibo
Key: swa_77990_B_20131007_063102_030607, Target: swa, Predicted: hat
Key: swa_77990_B_20131007_063102_031743, Target: swa, Predicted: gug
Key: swa_88661_A_20130801_192922_015175, Target: swa, Predicted: khk
Key: swa_90080_A_20140319_222809_052888, Target: swa, Predicted: kmr
Key: swa_77990_A_20131007_063102_018142, Target: swa, Predicted: hat
Key: swa_77990_A_20131007_063102_019854, Target: swa, Predicted: hat
Key: swa_77990_A_20131007_063102_020926, Target: swa, Predicted: luo
Key: swa_88661_A_20130801_192922_039985, Target: swa, Predicted: kat
Key: swa_77990_A_20131007_063102_022115, Target: swa, Predicted: ibo
Key: swa_77990_A_20131007_063102_026741, Target: swa, Predicted: lao
Key: swa_77990_A_20131007_063102_027903, Target: swa, Predicted: hat
Key: swa_88661_B_20130801_192922_026940, Target: swa, Predicted: luo
Key: swa_77990_A_20131007_063102_038377, Target: swa, Predicted: gug
Key: swa_90080_B_20140319_222809_051417, Target: swa, Predicted: zul
Key: swa_92740_A_20130923_235638_046188, Target: swa, Predicted: luo
Key: swa_84177_A_20131208_021104_017433, Target: swa, Predicted: luo
Key: swa_84177_A_20131208_021104_023749, Target: swa, Predicted: hau
Key: swa_92740_A_20130923_235638_051565, Target: swa, Predicted: hat
Key: swa_92740_A_20130923_235638_056635, Target: swa, Predicted: amh
Key: swa_98311_A_20130109_195922_006959, Target: swa, Predicted: zul
Key: swa_98311_B_20130109_191639_008512, Target: swa, Predicted: zul
Key: swa_98311_B_20130109_191639_019516, Target: swa, Predicted: tpi
Key: swa_98311_B_20130109_191639_020611, Target: swa, Predicted: tgl
Key: tam_20682_B_20130209_174057_019472, Target: tam, Predicted: tel
Key: swa_98311_B_20130109_195922_013350, Target: swa, Predicted: luo
Key: tam_18924_A_20130224_150538_016506, Target: tam, Predicted: ben
Key: swa_98311_B_20130109_195922_029196, Target: swa, Predicted: ibo
Key: tam_26602_A_20130215_003413_056511, Target: tam, Predicted: tel
Key: tam_28606_A_20130126_221856_016645, Target: tam, Predicted: tel
Key: tam_32287_A_20130902_231135_036249, Target: tam, Predicted: ben
Key: tam_31624_A_20130107_221428_051356, Target: tam, Predicted: ben
Key: tam_31624_B_20130107_221428_000000, Target: tam, Predicted: tel
Key: tam_32287_A_20130902_231135_045702, Target: tam, Predicted: tur
Key: tam_28606_A_20130126_221856_035560, Target: tam, Predicted: asm
Key: tam_51701_A_20130312_022556_031090, Target: tam, Predicted: tel
Key: tam_55136_A_20130705_164312_053934, Target: tam, Predicted: tel
Key: tam_47451_A_20130210_010011_028292, Target: tam, Predicted: tel
Key: tam_57935_A_20130126_234131_007506, Target: tam, Predicted: tel
Key: tam_55136_A_20130705_164312_000000, Target: tam, Predicted: asm
Key: tam_55136_A_20130705_164312_007891, Target: tam, Predicted: kat
Key: tam_55136_A_20130705_164312_026691, Target: tam, Predicted: gug
Key: tam_59747_B_20121222_160946_052528, Target: tam, Predicted: tgl
Key: tam_63484_A_20130821_005511_000000, Target: tam, Predicted: tel
Key: tam_63484_A_20130821_005511_007534, Target: tam, Predicted: tel
Key: tam_59747_B_20121222_160946_003000, Target: tam, Predicted: tel
Key: tam_64902_B_20130215_191500_019463, Target: tam, Predicted: tel
Key: tam_78161_B_20130521_152635_032561, Target: tam, Predicted: mal
Key: tam_87074_A_20130107_181209_056155, Target: tam, Predicted: tel
Key: tam_91808_A_20130603_193623_033822, Target: tam, Predicted: spa
Key: tam_91808_A_20130603_193623_035009, Target: tam, Predicted: spa
Key: tam_91808_A_20130603_193623_036170, Target: tam, Predicted: spa
Key: tam_90937_B_20130516_224543_057733, Target: tam, Predicted: tur
Key: tam_91808_A_20130603_193623_038526, Target: tam, Predicted: spa
Key: tam_91808_A_20130603_193623_040739, Target: tam, Predicted: spa
Key: tam_91808_A_20130603_193623_044312, Target: tam, Predicted: spa
Key: tam_91808_A_20130603_193623_046627, Target: tam, Predicted: spa
Key: tam_91808_A_20130603_193623_047810, Target: tam, Predicted: luo
Key: tam_91808_A_20130603_193623_050178, Target: tam, Predicted: spa
Key: tam_90937_B_20130516_224543_000698, Target: tam, Predicted: tel
Key: tam_91808_A_20130603_193623_029099, Target: tam, Predicted: nno
Key: tel_22965_A_20131114_213605_007220, Target: tel, Predicted: mal
Key: tel_22965_A_20131114_213605_021162, Target: tel, Predicted: tur
Key: tel_21029_A_20131112_180205_050248, Target: tel, Predicted: tur
Key: tel_19703_A_20131114_213952_013187, Target: tel, Predicted: pan
Key: tel_19703_A_20131114_213952_025814, Target: tel, Predicted: gug
Key: tel_34336_B_20131114_162157_016722, Target: tel, Predicted: yue
Key: tel_46333_A_20131102_160049_011357, Target: tel, Predicted: tam
Key: tel_46702_A_20131023_225137_036937, Target: tel, Predicted: lao
Key: tel_46333_A_20131102_160049_044218, Target: tel, Predicted: tam
Key: tel_39848_A_20131113_195552_021978, Target: tel, Predicted: tam
Key: tel_46333_A_20131102_160049_050427, Target: tel, Predicted: tam
Key: tel_46333_A_20131102_160049_058406, Target: tel, Predicted: asm
Key: tel_49287_A_20131115_193114_004879, Target: tel, Predicted: tam
Key: tel_56720_B_20131122_215343_034540, Target: tel, Predicted: tam
Key: tel_61167_A_20131104_210455_048458, Target: tel, Predicted: yue
Key: tel_52854_A_20131105_013802_050825, Target: tel, Predicted: tam
Key: tel_58734_A_20131109_181122_003170, Target: tel, Predicted: khk
Key: tel_64759_A_20131104_195356_000000, Target: tel, Predicted: asm
Key: tel_65370_A_20140222_225324_021275, Target: tel, Predicted: tam
Key: tel_52854_A_20131105_013802_010802, Target: tel, Predicted: ben
Key: tel_86472_B_20131204_195705_020665, Target: tel, Predicted: asm
Key: tel_86472_B_20131204_195705_038616, Target: tel, Predicted: tam
Key: tel_74280_A_20131025_160420_021789, Target: tel, Predicted: tpi
Key: tel_75064_A_20131114_174949_038514, Target: tel, Predicted: gug
Key: tel_99487_A_20131027_195100_033800, Target: tel, Predicted: tam
Key: tel_99487_A_20131027_195100_039891, Target: tel, Predicted: ben
Key: tel_99487_A_20131027_195100_041660, Target: tel, Predicted: tam
Key: tel_99487_A_20131027_195100_050799, Target: tel, Predicted: asm
Key: tel_99487_A_20131027_195100_057242, Target: tel, Predicted: asm
Key: tel_99487_A_20131027_195100_002251, Target: tel, Predicted: asm
Key: tgl_16883_A_20120219_191154_047091, Target: tgl, Predicted: hat
Key: tel_99487_A_20131027_195100_017295, Target: tel, Predicted: tam
Key: tgl_25035_A_20120213_014750_039539, Target: tgl, Predicted: tur
Key: tgl_24379_A_20120303_015051_058579, Target: tgl, Predicted: tam
Key: tgl_42766_A_20120217_003639_055563, Target: tgl, Predicted: jav
Key: tgl_47845_A_20120405_122139_002633, Target: tgl, Predicted: tur
Key: tgl_35896_A_20120302_123550_002677, Target: tgl, Predicted: asm
Key: tgl_35896_A_20120302_123550_037263, Target: tgl, Predicted: jav
Key: tgl_42766_A_20120217_003639_003845, Target: tgl, Predicted: pus
Key: tgl_42766_A_20120217_003639_013190, Target: tgl, Predicted: ceb
Key: tgl_42766_A_20120217_003639_015529, Target: tgl, Predicted: jav
Key: tgl_42766_A_20120217_003639_018270, Target: tgl, Predicted: asm
Key: tgl_42766_A_20120217_003639_035803, Target: tgl, Predicted: hat
Key: tgl_42766_A_20120217_003639_037776, Target: tgl, Predicted: ceb
Key: tgl_42766_A_20120217_003639_040807, Target: tgl, Predicted: tur
Key: tgl_42766_A_20120217_003639_041997, Target: tgl, Predicted: ceb
Key: tgl_53982_A_20120224_233136_000808, Target: tgl, Predicted: hat
Key: tgl_53982_A_20120224_233136_057579, Target: tgl, Predicted: ceb
Key: tgl_53982_A_20120224_233136_004130, Target: tgl, Predicted: vie
Key: tgl_53982_A_20120224_233136_058755, Target: tgl, Predicted: ceb
Key: tgl_53982_A_20120224_233136_010994, Target: tgl, Predicted: ceb
Key: tgl_57422_B_20120227_015422_058809, Target: tgl, Predicted: ceb
Key: tgl_53982_B_20120224_233136_034869, Target: tgl, Predicted: kmr
Key: tgl_53982_A_20120224_233136_038175, Target: tgl, Predicted: swa
Key: tgl_53982_A_20120224_233136_042693, Target: tgl, Predicted: ceb
Key: tgl_53982_B_20120224_233136_050145, Target: tgl, Predicted: asm
Key: tgl_53982_A_20120224_233136_047272, Target: tgl, Predicted: hau
|
| 857 |
+
Key: tgl_53982_A_20120224_233136_050701, Target: tgl, Predicted: ceb
|
| 858 |
+
Key: tgl_57422_B_20120227_015422_005263, Target: tgl, Predicted: asm
|
| 859 |
+
Key: tgl_65580_B_20120221_210222_019328, Target: tgl, Predicted: tam
|
| 860 |
+
Key: tgl_66026_A_20120511_112437_000000, Target: tgl, Predicted: amh
|
| 861 |
+
Key: tgl_69050_B_20120203_173053_035312, Target: tgl, Predicted: kaz
|
| 862 |
+
Key: tgl_69050_B_20120203_173053_037579, Target: tgl, Predicted: ceb
|
| 863 |
+
Key: tgl_69050_B_20120203_173053_038725, Target: tgl, Predicted: tel
|
| 864 |
+
Key: tgl_81587_B_20120309_163209_015741, Target: tgl, Predicted: pus
|
| 865 |
+
Key: tgl_83891_A_20120327_163405_052916, Target: tgl, Predicted: zul
|
| 866 |
+
Key: tgl_83255_A_20120530_214353_011677, Target: tgl, Predicted: ceb
|
| 867 |
+
Key: tgl_79698_A_20120315_223952_001345, Target: tgl, Predicted: kaz
|
| 868 |
+
Key: tgl_85617_A_20120225_212818_053793, Target: tgl, Predicted: ceb
|
| 869 |
+
Key: tgl_83891_A_20120327_163405_025856, Target: tgl, Predicted: jav
|
| 870 |
+
Key: tgl_95589_B_20120225_032340_018516, Target: tgl, Predicted: ceb
|
| 871 |
+
Key: tgl_93000_B_20120227_164805_038142, Target: tgl, Predicted: jav
|
| 872 |
+
Key: tgl_93000_B_20120227_164805_050923, Target: tgl, Predicted: tpi
|
| 873 |
+
Key: tgl_93000_B_20120227_164805_054742, Target: tgl, Predicted: hat
|
| 874 |
+
Key: tpi_14440_A_20130824_153139_000000, Target: tpi, Predicted: yue
|
| 875 |
+
Key: tpi_14440_A_20130824_153139_003448, Target: tpi, Predicted: spa
|
| 876 |
+
Key: tpi_14440_A_20130824_153139_011131, Target: tpi, Predicted: luo
|
| 877 |
+
Key: tpi_14440_B_20130824_152406_002756, Target: tpi, Predicted: hat
|
| 878 |
+
Key: tpi_14440_B_20130824_153643_009173, Target: tpi, Predicted: tur
|
| 879 |
+
Key: tpi_14875_A_20130731_170626_024438, Target: tpi, Predicted: lao
|
| 880 |
+
Key: tpi_21244_A_20131010_122553_035642, Target: tpi, Predicted: kaz
|
| 881 |
+
Key: tpi_21244_A_20131010_122553_000000, Target: tpi, Predicted: tel
|
| 882 |
+
Key: tpi_21244_A_20131010_122553_004612, Target: tpi, Predicted: lit
|
| 883 |
+
Key: tpi_29911_A_20131212_174224_044795, Target: tpi, Predicted: luo
|
| 884 |
+
Key: tpi_32708_A_20130730_130556_000000, Target: tpi, Predicted: ben
|
| 885 |
+
Key: tpi_32708_B_20130730_130556_018569, Target: tpi, Predicted: tur
|
| 886 |
+
Key: tpi_32708_B_20130730_130556_040242, Target: tpi, Predicted: tel
|
| 887 |
+
Key: tpi_32708_B_20130730_130556_044557, Target: tpi, Predicted: ibo
|
| 888 |
+
Key: tpi_32708_B_20130730_130556_056565, Target: tpi, Predicted: asm
|
| 889 |
+
Key: tpi_46535_A_20131219_223648_000000, Target: tpi, Predicted: ceb
|
| 890 |
+
Key: tpi_46535_A_20131219_223648_002331, Target: tpi, Predicted: ceb
|
| 891 |
+
Key: tpi_33175_B_20130621_162225_012734, Target: tpi, Predicted: kmr
|
| 892 |
+
Key: tpi_46535_A_20131219_223648_004702, Target: tpi, Predicted: ceb
|
| 893 |
+
Key: tpi_46535_A_20131219_223648_009377, Target: tpi, Predicted: ceb
|
| 894 |
+
Key: tpi_46535_A_20131219_223648_011760, Target: tpi, Predicted: kaz
|
| 895 |
+
Key: tpi_46535_A_20131219_223648_019923, Target: tpi, Predicted: kaz
|
| 896 |
+
Key: tpi_46535_A_20131219_223648_021109, Target: tpi, Predicted: kaz
|
| 897 |
+
Key: tpi_46535_A_20131219_223648_023410, Target: tpi, Predicted: lit
|
| 898 |
+
Key: tpi_46535_A_20131219_223648_034620, Target: tpi, Predicted: kaz
|
| 899 |
+
Key: tpi_46535_A_20131219_223648_035772, Target: tpi, Predicted: kaz
|
| 900 |
+
Key: tpi_46535_A_20131219_223648_036968, Target: tpi, Predicted: lit
|
| 901 |
+
Key: tpi_46535_A_20131219_223648_040468, Target: tpi, Predicted: kaz
|
| 902 |
+
Key: tpi_46535_A_20131219_223648_043973, Target: tpi, Predicted: kaz
|
| 903 |
+
Key: tpi_46535_A_20131219_223648_045145, Target: tpi, Predicted: lit
|
| 904 |
+
Key: tpi_46535_A_20131219_223648_046327, Target: tpi, Predicted: kaz
|
| 905 |
+
Key: tpi_46535_A_20131219_223648_048678, Target: tpi, Predicted: kaz
|
| 906 |
+
Key: tpi_46535_A_20131219_223648_055715, Target: tpi, Predicted: khk
|
| 907 |
+
Key: tpi_61963_A_20130830_141616_048803, Target: tpi, Predicted: ceb
|
| 908 |
+
Key: tpi_67213_A_20131218_185924_000007, Target: tpi, Predicted: tam
|
| 909 |
+
Key: tpi_65252_A_20131008_183014_027781, Target: tpi, Predicted: khk
|
| 910 |
+
Key: tpi_61963_A_20130830_141616_014582, Target: tpi, Predicted: tgl
|
| 911 |
+
Key: tpi_61963_A_20130830_141616_016809, Target: tpi, Predicted: ceb
|
| 912 |
+
Key: tpi_61963_A_20130830_141616_056934, Target: tpi, Predicted: yue
|
| 913 |
+
Key: tpi_67213_A_20131218_185924_009958, Target: tpi, Predicted: lao
|
| 914 |
+
Key: tpi_67213_A_20131218_185924_013351, Target: tpi, Predicted: kaz
|
| 915 |
+
Key: tpi_61963_A_20130830_141616_021279, Target: tpi, Predicted: ceb
|
| 916 |
+
Key: tpi_61963_A_20130830_141616_023535, Target: tpi, Predicted: kaz
|
| 917 |
+
Key: tpi_61963_A_20130830_141616_024661, Target: tpi, Predicted: kaz
|
| 918 |
+
Key: tpi_67213_A_20131218_185924_024779, Target: tpi, Predicted: kaz
|
| 919 |
+
Key: tpi_67213_A_20131218_185924_027069, Target: tpi, Predicted: kaz
|
| 920 |
+
Key: tpi_61963_A_20130830_141616_029312, Target: tpi, Predicted: tel
|
| 921 |
+
Key: tpi_65252_A_20131008_183014_049192, Target: tpi, Predicted: kaz
|
| 922 |
+
Key: tpi_65252_A_20131008_183014_005780, Target: tpi, Predicted: ceb
|
| 923 |
+
Key: tpi_67213_A_20131218_185924_030537, Target: tpi, Predicted: kaz
|
| 924 |
+
Key: tpi_67213_A_20131218_185924_031728, Target: tpi, Predicted: lao
|
| 925 |
+
Key: tpi_65252_A_20131008_183014_009092, Target: tpi, Predicted: kaz
|
| 926 |
+
Key: tpi_65252_A_20131008_183014_052661, Target: tpi, Predicted: kaz
|
| 927 |
+
Key: tpi_67213_A_20131218_185924_032851, Target: tpi, Predicted: kaz
|
| 928 |
+
Key: tpi_65252_A_20131008_183014_055433, Target: tpi, Predicted: kaz
|
| 929 |
+
Key: tpi_67213_A_20131218_185924_035174, Target: tpi, Predicted: lao
|
| 930 |
+
Key: tpi_65252_A_20131008_183014_057351, Target: tpi, Predicted: kaz
|
| 931 |
+
Key: tpi_61963_A_20130830_141616_040718, Target: tpi, Predicted: kaz
|
| 932 |
+
Key: tpi_65252_A_20131008_183014_016004, Target: tpi, Predicted: kaz
|
| 933 |
+
Key: tpi_67213_A_20131218_185924_040830, Target: tpi, Predicted: kaz
|
| 934 |
+
Key: tpi_61963_A_20130830_141616_043068, Target: tpi, Predicted: kaz
|
| 935 |
+
Key: tpi_67213_A_20131218_185924_042022, Target: tpi, Predicted: lit
|
| 936 |
+
Key: tpi_65252_A_20131008_183014_019897, Target: tpi, Predicted: lit
|
| 937 |
+
Key: tpi_67213_A_20131218_185924_043218, Target: tpi, Predicted: kaz
|
| 938 |
+
Key: tpi_65252_A_20131008_183014_021073, Target: tpi, Predicted: ceb
|
| 939 |
+
Key: tpi_67213_A_20131218_185924_044365, Target: tpi, Predicted: lao
|
| 940 |
+
Key: tpi_74226_B_20130828_115915_013376, Target: tpi, Predicted: ces
|
| 941 |
+
Key: tpi_67213_A_20131218_185924_046992, Target: tpi, Predicted: kaz
|
| 942 |
+
Key: tpi_67213_A_20131218_185924_048146, Target: tpi, Predicted: tel
|
| 943 |
+
Key: tpi_74226_B_20130828_115915_021609, Target: tpi, Predicted: tam
|
| 944 |
+
Key: tpi_67213_A_20131218_185924_051355, Target: tpi, Predicted: asm
|
| 945 |
+
Key: tpi_74226_B_20130828_115915_022808, Target: tpi, Predicted: tel
|
| 946 |
+
Key: tpi_67213_A_20131218_185924_052511, Target: tpi, Predicted: kaz
|
| 947 |
+
Key: tpi_67213_A_20131218_185924_054773, Target: tpi, Predicted: ceb
|
| 948 |
+
Key: tpi_74226_B_20130828_115915_028173, Target: tpi, Predicted: luo
|
| 949 |
+
Key: tpi_67213_B_20131218_185924_004053, Target: tpi, Predicted: kat
|
| 950 |
+
Key: tpi_74226_B_20130828_115915_032801, Target: tpi, Predicted: luo
|
| 951 |
+
Key: tpi_67213_B_20131218_185924_022830, Target: tpi, Predicted: tam
|
| 952 |
+
Key: tpi_67213_B_20131218_185924_048209, Target: tpi, Predicted: khk
|
| 953 |
+
Key: tpi_67213_B_20131218_185924_057759, Target: tpi, Predicted: ben
|
| 954 |
+
Key: tpi_70726_A_20131222_161540_019387, Target: tpi, Predicted: khk
|
| 955 |
+
Key: tpi_70726_A_20131222_161540_023625, Target: tpi, Predicted: lao
|
| 956 |
+
Key: tpi_74226_B_20130828_115915_005432, Target: tpi, Predicted: asm
|
| 957 |
+
Key: tpi_70726_A_20131222_161540_024767, Target: tpi, Predicted: lit
|
| 958 |
+
Key: tpi_74226_B_20130828_115915_006573, Target: tpi, Predicted: tur
|
| 959 |
+
Key: tpi_74226_B_20130828_115915_010205, Target: tpi, Predicted: tur
|
| 960 |
+
Key: tpi_76837_A_20131207_184347_043285, Target: tpi, Predicted: lit
|
| 961 |
+
Key: tpi_85179_B_20130920_130213_039435, Target: tpi, Predicted: hat
|
| 962 |
+
Key: tpi_90777_B_20130725_111134_034620, Target: tpi, Predicted: zul
|
| 963 |
+
Key: tpi_80577_B_20130930_204532_000000, Target: tpi, Predicted: kaz
|
| 964 |
+
Key: tpi_80577_B_20130930_204532_004421, Target: tpi, Predicted: kaz
|
| 965 |
+
Key: tpi_80577_B_20130930_204532_010349, Target: tpi, Predicted: khk
|
| 966 |
+
Key: tpi_80577_B_20130930_204532_023950, Target: tpi, Predicted: tel
|
| 967 |
+
Key: tpi_80577_B_20130930_204532_037734, Target: tpi, Predicted: kat
|
| 968 |
+
Key: tpi_92886_B_20130711_144627_042334, Target: tpi, Predicted: tel
|
| 969 |
+
Key: tur_21541_A_20120518_012528_006278, Target: tur, Predicted: kmr
|
| 970 |
+
Key: tur_21541_A_20120518_012528_014473, Target: tur, Predicted: kmr
|
| 971 |
+
Key: tur_21541_A_20120518_012528_018664, Target: tur, Predicted: kmr
|
| 972 |
+
Key: tur_21541_A_20120518_012528_031124, Target: tur, Predicted: kmr
|
| 973 |
+
Key: tur_11521_A_20120602_034839_041086, Target: tur, Predicted: ceb
|
| 974 |
+
Key: tur_11521_A_20120602_034839_049327, Target: tur, Predicted: kmr
|
| 975 |
+
Key: tur_32236_A_20120516_221311_019954, Target: tur, Predicted: zul
|
| 976 |
+
Key: tur_39963_A_20120209_083935_000000, Target: tur, Predicted: tgl
|
| 977 |
+
Key: tur_31256_A_20120531_015506_021282, Target: tur, Predicted: kmr
|
| 978 |
+
Key: tur_44023_A_20120530_220359_022785, Target: tur, Predicted: kmr
|
| 979 |
+
Key: tur_76372_B_20120709_015738_018584, Target: tur, Predicted: kmr
|
| 980 |
+
Key: vie_12963_B_20120509_003852_003025, Target: vie, Predicted: lao
|
| 981 |
+
Key: vie_11031_B_20120617_182613_013283, Target: vie, Predicted: gug
|
| 982 |
+
Key: vie_14769_B_20120420_013147_027012, Target: vie, Predicted: hat
|
| 983 |
+
Key: vie_14769_A_20120420_013147_000233, Target: vie, Predicted: asm
|
| 984 |
+
Key: vie_32236_B_20120505_195420_013203, Target: vie, Predicted: lao
|
| 985 |
+
Key: vie_32236_B_20120505_195420_021428, Target: vie, Predicted: tel
|
| 986 |
+
Key: vie_31538_A_20120320_202748_018919, Target: vie, Predicted: gug
|
| 987 |
+
Key: vie_31538_A_20120320_202748_020100, Target: vie, Predicted: tpi
|
| 988 |
+
Key: vie_31538_A_20120320_202748_021281, Target: vie, Predicted: hat
|
| 989 |
+
Key: vie_32236_B_20120505_195420_035087, Target: vie, Predicted: lao
|
| 990 |
+
Key: vie_32236_B_20120505_195420_039604, Target: vie, Predicted: lao
|
| 991 |
+
Key: vie_31538_A_20120320_202748_028672, Target: vie, Predicted: jav
|
| 992 |
+
Key: vie_31538_A_20120320_202748_031014, Target: vie, Predicted: hat
|
| 993 |
+
Key: vie_32236_B_20120505_195420_053886, Target: vie, Predicted: lao
|
| 994 |
+
Key: vie_32236_B_20120505_195420_058774, Target: vie, Predicted: lao
|
| 995 |
+
Key: vie_31538_A_20120320_202748_036046, Target: vie, Predicted: ibo
|
| 996 |
+
Key: vie_31538_A_20120320_202748_038410, Target: vie, Predicted: tam
|
| 997 |
+
Key: vie_31538_A_20120320_202748_040232, Target: vie, Predicted: gug
|
| 998 |
+
Key: vie_31538_A_20120320_202748_003435, Target: vie, Predicted: por
|
| 999 |
+
Key: vie_31538_A_20120320_202748_051509, Target: vie, Predicted: gug
|
| 1000 |
+
Key: vie_31538_A_20120320_202748_008124, Target: vie, Predicted: ibo
|
| 1001 |
+
Key: vie_31538_A_20120320_202748_010955, Target: vie, Predicted: gug
|
| 1002 |
+
Key: vie_31538_A_20120320_202748_053819, Target: vie, Predicted: hat
|
| 1003 |
+
Key: vie_31538_A_20120320_202748_012151, Target: vie, Predicted: hat
|
| 1004 |
+
Key: vie_35391_A_20120416_192241_046900, Target: vie, Predicted: gug
|
| 1005 |
+
Key: vie_35391_A_20120416_192241_054357, Target: vie, Predicted: jav
|
| 1006 |
+
Key: vie_45512_A_20120505_135144_053538, Target: vie, Predicted: hat
|
| 1007 |
+
Key: vie_45512_A_20120505_135144_004505, Target: vie, Predicted: ceb
|
| 1008 |
+
Key: vie_45512_A_20120505_135144_011913, Target: vie, Predicted: ibo
|
| 1009 |
+
Key: vie_63459_B_20120415_003841_021302, Target: vie, Predicted: kaz
|
| 1010 |
+
Key: vie_63459_B_20120415_003841_029357, Target: vie, Predicted: ceb
|
| 1011 |
+
Key: vie_79526_A_20120420_150504_017293, Target: vie, Predicted: gug
|
| 1012 |
+
Key: vie_85204_A_20120212_190017_002132, Target: vie, Predicted: gug
|
| 1013 |
+
Key: vie_77771_B_20120421_231323_012583, Target: vie, Predicted: tur
|
| 1014 |
+
Key: vie_85204_A_20120212_190017_028994, Target: vie, Predicted: tgl
|
| 1015 |
+
Key: vie_90202_A_20120502_194459_035522, Target: vie, Predicted: hat
|
| 1016 |
+
Key: vie_90202_A_20120502_194459_004459, Target: vie, Predicted: swa
|
| 1017 |
+
Key: vie_90202_A_20120502_194459_045611, Target: vie, Predicted: hat
|
| 1018 |
+
Key: vie_90202_A_20120502_194459_050828, Target: vie, Predicted: lao
|
| 1019 |
+
Key: vie_90202_A_20120502_194459_024874, Target: vie, Predicted: swa
|
| 1020 |
+
Key: vie_92386_A_20120322_195456_024899, Target: vie, Predicted: lao
|
| 1021 |
+
Key: vie_92386_A_20120322_195456_031300, Target: vie, Predicted: lao
|
| 1022 |
+
Key: zul_22466_B_20121130_231814_007273, Target: zul, Predicted: tgl
|
| 1023 |
+
Key: zul_28190_A_20121213_031401_032444, Target: zul, Predicted: asm
|
| 1024 |
+
Key: zul_35583_B_20130529_005600_015367, Target: zul, Predicted: ibo
|
| 1025 |
+
Key: zul_35583_B_20130529_005600_036873, Target: zul, Predicted: ibo
|
| 1026 |
+
Key: zul_35583_B_20130529_005600_053939, Target: zul, Predicted: amh
|
| 1027 |
+
Key: zul_43646_B_20121206_213819_000000, Target: zul, Predicted: asm
|
| 1028 |
+
Key: zul_42600_A_20121206_212006_003728, Target: zul, Predicted: tgl
|
| 1029 |
+
Key: zul_41100_A_20121129_003855_000513, Target: zul, Predicted: swa
|
| 1030 |
+
Key: zul_56198_A_20121128_190457_008384, Target: zul, Predicted: swa
|
| 1031 |
+
Key: zul_56198_A_20121128_190457_010740, Target: zul, Predicted: amh
|
| 1032 |
+
Key: zul_56198_A_20121128_190457_055361, Target: zul, Predicted: hat
|
| 1033 |
+
Key: zul_79858_B_20121126_013705_033065, Target: zul, Predicted: tgl
|
| 1034 |
+
Key: zul_82224_A_20130602_234038_044542, Target: zul, Predicted: ibo
|
| 1035 |
+
Key: zul_82224_A_20130602_234038_048706, Target: zul, Predicted: ssw
|
| 1036 |
+
Key: zul_84838_B_20121210_051040_025030, Target: zul, Predicted: jav
|
| 1037 |
+
Key: zul_93007_A_20130528_211314_002393, Target: zul, Predicted: luo
|
| 1038 |
+
Key: zul_93007_A_20130528_211314_017504, Target: zul, Predicted: sna
|
| 1039 |
+
Key: zul_93007_A_20130528_211314_057839, Target: zul, Predicted: sna
|
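The misclassification records above follow a fixed `Key: <utterance-id>, Target: <iso-639-3>, Predicted: <iso-639-3>` layout, so per-language confusion counts can be tallied directly from the listing. A minimal sketch (the record format is taken from the listing; `tally_confusions` is a hypothetical helper, not part of the ESPnet recipe):

```python
import re
from collections import Counter

# Each record in the results listing looks like (assumed stable):
#   Key: tel_46702_A_20131023_225137_036937, Target: tel, Predicted: lao
RECORD = re.compile(r"Key:\s*(\S+),\s*Target:\s*(\w+),\s*Predicted:\s*(\w+)")

def tally_confusions(lines):
    """Count (target, predicted) pairs from misclassified-utterance lines."""
    pairs = Counter()
    for line in lines:
        match = RECORD.search(line)
        if match:
            _key, target, predicted = match.groups()
            pairs[(target, predicted)] += 1
    return pairs

records = [
    "Key: tel_46702_A_20131023_225137_036937, Target: tel, Predicted: lao",
    "Key: tpi_46535_A_20131219_223648_011760, Target: tpi, Predicted: kaz",
    "Key: tpi_80577_B_20130930_204532_000000, Target: tpi, Predicted: kaz",
]
confusions = tally_confusions(records)
print(confusions.most_common(1))  # [(('tpi', 'kaz'), 2)]
```

Sorting the resulting counter surfaces dominant confusion pairs in this split (e.g. tpi being predicted as kaz, or tur as kmr).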
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/lid_inference_test.log
ADDED
@@ -0,0 +1,286 @@
# python3 -m espnet2.bin.lid_inference_dist --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
# Started at Mon Jun 2 02:33:14 CDT 2025
#
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_inference_dist.py --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
[gpue04] 2025-06-02 02:33:33,758 (abs_task:2406) INFO: config file: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
  torchaudio.set_audio_backend("sox_io")
/work/nvme/bbjs/qwang20/espnet/espnet2/tasks/abs_task.py:2429: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  torch.load(model_file, map_location=device),
[gpue04] 2025-06-02 02:33:45,800 (lid_inference_dist:86) INFO: Model structure:
ESPnetLIDUpstreamConditionModel(
  (frontend): S3prlFrontendCondition(
    (upstream): S3PRLUpstreamCondition(
      (upstream): UpstreamExpertCondition(
        (model): Wav2Vec2ModelCondition(
          (feature_extractor): Wav2Vec2FeatureEncoder(
            (conv_layers): ModuleList(
              (0): Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (1-4): 4 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (5-6): 2 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
            )
          )
          (feature_projection): Wav2Vec2FeatureProjection(
            (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
            (projection): Linear(in_features=512, out_features=1280, bias=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (encoder): Wav2Vec2EncoderCondition(
            (pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
              (conv): ParametrizedConv1d(
                1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
                (parametrizations): ModuleDict(
                  (weight): ParametrizationList(
                    (0): _WeightNorm()
                  )
                )
              )
              (padding): Wav2Vec2SamePadLayer()
              (activation): GELUActivation()
            )
            (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
            (layers): ModuleList(
              (0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
                (attention): Wav2Vec2SdpaAttention(
                  (k_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (v_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (q_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (out_proj): Linear(in_features=1280, out_features=1280, bias=True)
                )
                (dropout): Dropout(p=0.1, inplace=False)
                (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
                (feed_forward): Wav2Vec2FeedForward(
                  (intermediate_dropout): Dropout(p=0.0, inplace=False)
                  (intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
                  (intermediate_act_fn): GELUActivation()
                  (output_dense): Linear(in_features=5120, out_features=1280, bias=True)
                  (output_dropout): Dropout(p=0.1, inplace=False)
                )
                (final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
              )
            )
            (ecapa_encoder): ModuleDict(
              (32): IdentityEncoder()
              (36): IdentityEncoder()
              (40): IdentityEncoder()
              (44): IdentityEncoder()
            )
            (pooling): ModuleDict(
              (32): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (36): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (40): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (44): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
            )
            (projector): ModuleDict(
              (32): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (36): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (40): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (44): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
            )
            (lang2vec_head): ModuleDict(
              (32): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (36): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (40): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (44): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
            )
            (aamsoftmax_weight): ParameterDict()
            (lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
            (aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
              (ce): CrossEntropyLoss()
              (lang2vec_head): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (lang2vec_loss): MSELoss()
            )
          )
        )
      )
    )
    (featurizer): Featurizer()
  )
  (normalize): UtteranceMVN(norm_means=True, norm_vars=False)
  (encoder): EcapaTdnnEncoder(
    (conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
    (relu): ReLU()
    (bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (layer1): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer2): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer3): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
    (mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
  )
  (pooling): ChnAttnStatPooling(
    (attention): Sequential(
      (0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
      (1): ReLU()
      (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
    )
    (softmax): Softmax(dim=2)
  )
  (projector): RawNet3Projector(
    (bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (fc): Linear(in_features=3072, out_features=192, bias=True)
  )
| 256 |
+
(loss): AAMSoftmaxSCTopKLang2Vec(
|
| 257 |
+
(ce): CrossEntropyLoss()
|
| 258 |
+
(lang2vec_head): Sequential(
|
| 259 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 260 |
+
)
|
| 261 |
+
(lang2vec_loss): MSELoss()
|
| 262 |
+
)
|
| 263 |
+
)
|
| 264 |
+
|
| 265 |
+
Model summary:
|
| 266 |
+
Class Name: ESPnetLIDUpstreamConditionModel
|
| 267 |
+
Total Number of model parameters: 977.14 M
|
| 268 |
+
Number of trainable parameters: 977.14 M (100.0%)
|
| 269 |
+
Size: 3.91 GB
|
| 270 |
+
Type: torch.float32
|
| 271 |
+
/u/qwang20/miniconda3/envs/espnet2/lib/python3.11/site-packages/torch/utils/data/dataloader.py:557: UserWarning: This DataLoader will create 32 worker processes in total. Our suggested max number of worker in current system is 16, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
|
| 272 |
+
warnings.warn(_create_warning_msg(
|
| 273 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/reporter.py:321: UserWarning: The stats of the previous epoch=-1doesn't exist.
|
| 274 |
+
warnings.warn(
|
| 275 |
+
[gpue04] 2025-06-02 02:33:46,351 (lid_trainer:102) INFO: [Rank 0] Resume: 0 utterances found in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/lids0
|
| 276 |
+
[gpue04] 2025-06-02 02:34:18,099 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 0
|
| 277 |
+
[gpue04] 2025-06-02 02:34:42,593 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 1
|
| 278 |
+
[gpue04] 2025-06-02 02:35:05,422 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 2
|
| 279 |
+
[gpue04] 2025-06-02 02:35:28,092 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 3
|
| 280 |
+
[gpue04] 2025-06-02 02:35:52,743 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 4
|
| 281 |
+
[gpue04] 2025-06-02 02:36:39,290 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 5
|
| 282 |
+
[gpue04] 2025-06-02 02:37:10,073 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 6
|
| 283 |
+
[gpue04] 2025-06-02 02:37:13,207 (lid_inference_dist:200) INFO: args.save_embd_per_utt: True
|
| 284 |
+
[gpue04] 2025-06-02 02:37:13,208 (lid_inference_dist:215) INFO: args.save_tsne_plot: False
|
| 285 |
+
# Accounting: time=240 threads=1
|
| 286 |
+
# Ended (code 0) at Mon Jun 2 02:37:14 CDT 2025, elapsed time 240 seconds
|
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_dialect_ml_superb2_lang_cross_train_all_no_filter_lang/results
ADDED
@@ -0,0 +1,946 @@
Accuracy: 86.82%
Macro Accuracy: 86.92%
Accuracy per Language:
tam: 100.00%
guj: 97.89%
ell: 71.37%
eng: 93.57%
deu: 74.00%
tel: 99.01%
spa: 95.19%
ara: 64.31%
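The two headline numbers differ because the overall figure pools every utterance, while the macro figure averages the per-language accuracies so each language counts equally regardless of test-set size. A minimal sketch of that distinction (a hypothetical helper, not part of the ESPnet recipe; it assumes the results are available as `(target, predicted)` pairs like the per-utterance lines below):

```python
from collections import defaultdict


def accuracy_report(pairs):
    """Compute overall (micro) and macro accuracy, both in percent.

    Micro accuracy pools all utterances; macro accuracy is the mean of the
    per-language accuracies, so under-represented languages weigh equally.
    """
    total = correct = 0
    per_lang = defaultdict(lambda: [0, 0])  # lang -> [n_correct, n_total]
    for target, predicted in pairs:
        total += 1
        per_lang[target][1] += 1
        if predicted == target:
            correct += 1
            per_lang[target][0] += 1
    micro = 100.0 * correct / total
    macro = 100.0 * sum(c / n for c, n in per_lang.values()) / len(per_lang)
    return micro, macro


# Toy example: 3 'eng' utterances (2 correct) and 1 'tam' utterance (correct).
micro, macro = accuracy_report(
    [("eng", "eng"), ("eng", "eng"), ("eng", "deu"), ("tam", "tam")]
)
# micro = 75.00 (3/4 pooled), macro = 83.33 (mean of 66.67% and 100%)
```

With skewed per-language sizes, the two metrics can diverge substantially; here they are close (86.82% vs. 86.92%) because the eight languages are similarly represented.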
Key: ara_sada_acw_000000, Target: ara, Predicted: heb
Key: ara_sada_acw_000032, Target: ara, Predicted: mon
Key: ara_sada_acw_000064, Target: ara, Predicted: slv
Key: ara_sada_acw_000096, Target: ara, Predicted: fao
Key: ara_sada_acw_000001, Target: ara, Predicted: sot
Key: ara_sada_acw_000033, Target: ara, Predicted: pus
Key: ara_sada_acw_000065, Target: ara, Predicted: fin
Key: ara_sada_acw_000067, Target: ara, Predicted: sqi
Key: ara_sada_acw_000099, Target: ara, Predicted: amh
Key: ara_sada_acw_000004, Target: ara, Predicted: tuk
Key: ara_sada_acw_000005, Target: ara, Predicted: mon
Key: ara_sada_acw_000069, Target: ara, Predicted: slv
Key: ara_sada_acw_000101, Target: ara, Predicted: cmn
Key: ara_sada_acw_000006, Target: ara, Predicted: eng
Key: ara_sada_acw_000038, Target: ara, Predicted: sqi
Key: ara_sada_acw_000071, Target: ara, Predicted: khm
Key: ara_sada_acw_000103, Target: ara, Predicted: sqi
Key: ara_sada_acw_000009, Target: ara, Predicted: som
Key: ara_sada_acw_000041, Target: ara, Predicted: aze
Key: ara_sada_acw_000042, Target: ara, Predicted: aze
Key: ara_sada_acw_000106, Target: ara, Predicted: kaz
Key: ara_sada_acw_000011, Target: ara, Predicted: ces
Key: ara_sada_acw_000043, Target: ara, Predicted: pus
Key: ara_sada_acw_000075, Target: ara, Predicted: cmn
Key: ara_sada_acw_000044, Target: ara, Predicted: kaz
Key: ara_sada_acw_000045, Target: ara, Predicted: fas
Key: ara_sada_acw_000077, Target: ara, Predicted: deu
Key: ara_sada_acw_000014, Target: ara, Predicted: hat
Key: ara_sada_acw_000046, Target: ara, Predicted: som
Key: ara_sada_acw_000078, Target: ara, Predicted: spa
Key: ara_sada_acw_000110, Target: ara, Predicted: lin
Key: ara_sada_acw_000079, Target: ara, Predicted: aze
Key: ara_sada_acw_000111, Target: ara, Predicted: cym
Key: ara_sada_acw_000016, Target: ara, Predicted: hrv
Key: ara_sada_acw_000048, Target: ara, Predicted: eng
Key: ara_sada_acw_000080, Target: ara, Predicted: ces
Key: ara_sada_acw_000083, Target: ara, Predicted: jav
Key: ara_sada_acw_000052, Target: ara, Predicted: mlt
Key: ara_sada_acw_000116, Target: ara, Predicted: cym
Key: ara_sada_acw_000053, Target: ara, Predicted: heb
Key: ara_sada_acw_000117, Target: ara, Predicted: ben
Key: ara_sada_acw_000118, Target: ara, Predicted: slv
Key: ara_sada_acw_000023, Target: ara, Predicted: mon
Key: ara_sada_acw_000056, Target: ara, Predicted: heb
Key: ara_sada_acw_000120, Target: ara, Predicted: yid
Key: ara_sada_acw_000025, Target: ara, Predicted: snd
Key: ara_sada_acw_000057, Target: ara, Predicted: fra
Key: ara_sada_acw_000124, Target: ara, Predicted: lit
Key: ara_sada_acw_000029, Target: ara, Predicted: sin
Key: ara_sada_acw_000125, Target: ara, Predicted: yid
Key: ara_sada_acw_000094, Target: ara, Predicted: tat
Key: ara_sada_acw_000063, Target: ara, Predicted: amh
Key: ara_sada_acw_000127, Target: ara, Predicted: xty
Key: ara_sada_acw_000129, Target: ara, Predicted: pol
Key: ara_sada_afb_000018, Target: ara, Predicted: afr
Key: ara_sada_acw_000130, Target: ara, Predicted: war
Key: ara_sada_acw_000162, Target: ara, Predicted: ell
Key: ara_sada_acw_000131, Target: ara, Predicted: guj
Key: ara_sada_acw_000163, Target: ara, Predicted: sna
Key: ara_sada_acw_000132, Target: ara, Predicted: bre
Key: ara_sada_acw_000164, Target: ara, Predicted: mlt
Key: ara_sada_afb_000021, Target: ara, Predicted: ell
Key: ara_sada_afb_000054, Target: ara, Predicted: nld
Key: ara_sada_afb_000023, Target: ara, Predicted: heb
Key: ara_sada_acw_000135, Target: ara, Predicted: som
Key: ara_sada_acw_000167, Target: ara, Predicted: isl
Key: ara_sada_afb_000025, Target: ara, Predicted: mon
Key: ara_sada_afb_000057, Target: ara, Predicted: sqi
Key: ara_sada_acw_000169, Target: ara, Predicted: deu
Key: ara_sada_afb_000058, Target: ara, Predicted: hau
Key: ara_sada_acw_000170, Target: ara, Predicted: mon
Key: ara_sada_acw_000139, Target: ara, Predicted: azz
Key: ara_sada_acw_000171, Target: ara, Predicted: heb
Key: ara_sada_afb_000028, Target: ara, Predicted: amh
Key: ara_sada_acw_000140, Target: ara, Predicted: mon
Key: ara_sada_acw_000173, Target: ara, Predicted: tat
Key: ara_sada_afb_000032, Target: ara, Predicted: tuk
Key: ara_sada_afb_000002, Target: ara, Predicted: rus
Key: ara_sada_afb_000034, Target: ara, Predicted: ltz
Key: ara_sada_afb_000003, Target: ara, Predicted: hrv
Key: ara_sada_afb_000035, Target: ara, Predicted: som
Key: ara_sada_acw_000147, Target: ara, Predicted: mlt
Key: ara_sada_afb_000068, Target: ara, Predicted: fas
Key: ara_sada_afb_000037, Target: ara, Predicted: abk
Key: ara_sada_afb_000069, Target: ara, Predicted: hau
Key: ara_sada_acw_000149, Target: ara, Predicted: aze
Key: ara_sada_afb_000070, Target: ara, Predicted: pus
Key: ara_sada_afb_000007, Target: ara, Predicted: pus
Key: ara_sada_afb_000039, Target: ara, Predicted: heb
Key: ara_sada_afb_000071, Target: ara, Predicted: heb
Key: ara_sada_acw_000151, Target: ara, Predicted: heb
Key: ara_sada_afb_000008, Target: ara, Predicted: heb
Key: ara_sada_afb_000072, Target: ara, Predicted: cym
Key: ara_sada_afb_000009, Target: ara, Predicted: tat
Key: ara_sada_afb_000073, Target: ara, Predicted: mya
Key: ara_sada_acw_000153, Target: ara, Predicted: fra
Key: ara_sada_afb_000010, Target: ara, Predicted: heb
Key: ara_sada_afb_000042, Target: ara, Predicted: yid
Key: ara_sada_acw_000154, Target: ara, Predicted: som
Key: ara_sada_afb_000011, Target: ara, Predicted: nep
Key: ara_sada_afb_000043, Target: ara, Predicted: mlt
Key: ara_sada_afb_000075, Target: ara, Predicted: abk
Key: ara_sada_afb_000046, Target: ara, Predicted: nep
Key: ara_sada_afb_000016, Target: ara, Predicted: amh
Key: ara_sada_afb_000080, Target: ara, Predicted: nep
Key: ara_sada_afb_000178, Target: ara, Predicted: som
Key: ara_sada_afb_000083, Target: ara, Predicted: aze
Key: ara_sada_afb_000115, Target: ara, Predicted: som
Key: ara_sada_afb_000147, Target: ara, Predicted: fas
Key: ara_sada_afb_000084, Target: ara, Predicted: fra
Key: ara_sada_afb_000148, Target: ara, Predicted: fra
Key: ara_sada_afb_000085, Target: ara, Predicted: eng
Key: ara_sada_afb_000117, Target: ara, Predicted: glv
Key: ara_sada_afb_000181, Target: ara, Predicted: spa
Key: ara_sada_afb_000118, Target: ara, Predicted: guj
Key: ara_sada_afb_000119, Target: ara, Predicted: hun
Key: ara_sada_afb_000151, Target: ara, Predicted: fas
Key: ara_sada_afb_000183, Target: ara, Predicted: deu
Key: ara_sada_afb_000088, Target: ara, Predicted: swa
Key: ara_sada_afb_000184, Target: ara, Predicted: som
Key: ara_sada_afb_000121, Target: ara, Predicted: slv
Key: ara_sada_afb_000091, Target: ara, Predicted: tat
Key: ara_sada_afb_000155, Target: ara, Predicted: mya
Key: ara_sada_afb_000124, Target: ara, Predicted: mya
Key: ara_sada_afb_000156, Target: ara, Predicted: amh
Key: ara_sada_afb_000188, Target: ara, Predicted: mon
Key: ara_sada_afb_000094, Target: ara, Predicted: fra
Key: ara_sada_afb_000126, Target: ara, Predicted: heb
Key: ara_sada_afb_000190, Target: ara, Predicted: yor
Key: ara_sada_afb_000159, Target: ara, Predicted: heb
Key: ara_sada_afb_000160, Target: ara, Predicted: heb
Key: ara_sada_afb_000192, Target: ara, Predicted: tat
Key: ara_sada_afb_000129, Target: ara, Predicted: hau
Key: ara_sada_afb_000161, Target: ara, Predicted: som
Key: ara_sada_afb_000193, Target: ara, Predicted: hrv
Key: ara_sada_afb_000130, Target: ara, Predicted: deu
Key: ara_sada_afb_000162, Target: ara, Predicted: bre
Key: ara_sada_afb_000099, Target: ara, Predicted: afr
Key: ara_sada_afb_000131, Target: ara, Predicted: nld
Key: ara_sada_afb_000163, Target: ara, Predicted: sna
Key: ara_sada_afb_000100, Target: ara, Predicted: nep
Key: ara_sada_afb_000132, Target: ara, Predicted: bre
Key: ara_sada_afb_000164, Target: ara, Predicted: bod
Key: ara_sada_afb_000196, Target: ara, Predicted: uzb
Key: ara_sada_afb_000165, Target: ara, Predicted: heb
Key: ara_sada_afb_000197, Target: ara, Predicted: ben
Key: ara_sada_afb_000134, Target: ara, Predicted: eng
Key: ara_sada_afb_000166, Target: ara, Predicted: tgk
Key: ara_sada_afb_000167, Target: ara, Predicted: nso
Key: ara_sada_afb_000104, Target: ara, Predicted: heb
Key: ara_sada_afb_000136, Target: ara, Predicted: slv
Key: ara_sada_afb_000168, Target: ara, Predicted: swa
Key: ara_sada_afb_000137, Target: ara, Predicted: bod
Key: ara_sada_afb_000106, Target: ara, Predicted: eng
Key: ara_sada_afb_000138, Target: ara, Predicted: tat
Key: ara_sada_afb_000108, Target: ara, Predicted: eus
Key: ara_sada_afb_000140, Target: ara, Predicted: tat
Key: ara_sada_afb_000141, Target: ara, Predicted: eng
Key: ara_sada_afb_000173, Target: ara, Predicted: kat
Key: ara_sada_afb_000142, Target: ara, Predicted: nld
Key: ara_sada_afb_000208, Target: ara, Predicted: kan
Key: ara_sada_ars_000007, Target: ara, Predicted: amh
Key: ara_sada_ars_000039, Target: ara, Predicted: fra
Key: ara_sada_ars_000008, Target: ara, Predicted: som
Key: ara_sada_ars_000042, Target: ara, Predicted: ces
Key: ara_sada_ars_000043, Target: ara, Predicted: lit
Key: ara_sada_ars_000044, Target: ara, Predicted: hrv
Key: ara_sada_afb_000216, Target: ara, Predicted: heb
Key: ara_sada_ars_000016, Target: ara, Predicted: grn
Key: ara_sada_ars_000048, Target: ara, Predicted: som
Key: ara_sada_ars_000049, Target: ara, Predicted: heb
Key: ara_sada_afb_000220, Target: ara, Predicted: hye
Key: ara_sada_ars_000018, Target: ara, Predicted: bre
Key: ara_sada_afb_000221, Target: ara, Predicted: azz
Key: ara_sada_arb_000031, Target: ara, Predicted: yor
Key: ara_sada_ars_000019, Target: ara, Predicted: nld
Key: ara_sada_ars_000053, Target: ara, Predicted: cym
Key: ara_sada_ars_000022, Target: ara, Predicted: mlg
Key: ara_sada_ars_000055, Target: ara, Predicted: eng
Key: ara_sada_ars_000024, Target: ara, Predicted: som
Key: ara_sada_arb_000005, Target: ara, Predicted: heb
Key: ara_sada_ars_000058, Target: ara, Predicted: mya
Key: ara_sada_arb_000041, Target: ara, Predicted: mlg
Key: ara_sada_ars_000061, Target: ara, Predicted: isl
Key: ara_sada_ars_000030, Target: ara, Predicted: amh
Key: ara_sada_ars_000031, Target: ara, Predicted: snd
Key: ara_sada_ars_000032, Target: ara, Predicted: deu
Key: ara_sada_ars_000033, Target: ara, Predicted: bos
Key: ara_sada_ars_000037, Target: ara, Predicted: pus
Key: ara_sada_ars_000069, Target: ara, Predicted: bre
Key: ara_sada_ars_000038, Target: ara, Predicted: pol
Key: deu_swissdial_ag_000014, Target: deu, Predicted: afr
Key: ara_sada_ars_000072, Target: ara, Predicted: slk
Key: ara_sada_ars_000136, Target: ara, Predicted: nno
Key: deu_swissdial_ag_000015, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000016, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000017, Target: deu, Predicted: nld
Key: ara_sada_ars_000075, Target: ara, Predicted: deu
Key: deu_swissdial_ag_000019, Target: deu, Predicted: yid
Key: ara_sada_ars_000110, Target: ara, Predicted: pan
Key: ara_sada_ars_000111, Target: ara, Predicted: sqi
Key: deu_swissdial_ag_000023, Target: deu, Predicted: afr
Key: ara_sada_ars_000082, Target: ara, Predicted: fra
Key: ara_sada_ars_000146, Target: ara, Predicted: tuk
Key: deu_swissdial_ag_000025, Target: deu, Predicted: afr
Key: ara_sada_ars_000083, Target: ara, Predicted: pus
Key: ara_sada_ars_000115, Target: ara, Predicted: tat
Key: deu_swissdial_ag_000026, Target: deu, Predicted: cym
Key: deu_swissdial_ag_000027, Target: deu, Predicted: yid
Key: deu_swissdial_ag_000028, Target: deu, Predicted: nld
Key: ara_sada_ars_000150, Target: ara, Predicted: isl
Key: deu_swissdial_ag_000029, Target: deu, Predicted: ces
Key: ara_sada_ars_000151, Target: ara, Predicted: heb
Key: ara_sada_ars_000152, Target: ara, Predicted: amh
Key: deu_swissdial_ag_000031, Target: deu, Predicted: nld
Key: ara_sada_ars_000089, Target: ara, Predicted: bod
Key: ara_sada_ars_000121, Target: ara, Predicted: fra
Key: deu_swissdial_ag_000032, Target: deu, Predicted: afr
Key: ara_sada_ars_000090, Target: ara, Predicted: mar
Key: ara_sada_ars_000091, Target: ara, Predicted: fra
Key: ara_sada_ars_000123, Target: ara, Predicted: isl
Key: ara_sada_ars_000092, Target: ara, Predicted: cym
Key: deu_swissdial_ag_000035, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000004, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000037, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000006, Target: deu, Predicted: cym
Key: deu_swissdial_ag_000038, Target: deu, Predicted: afr
Key: ara_sada_ars_000129, Target: ara, Predicted: heb
Key: deu_swissdial_ag_000008, Target: deu, Predicted: tat
Key: deu_swissdial_ag_000040, Target: deu, Predicted: afr
Key: ara_sada_ars_000098, Target: ara, Predicted: khm
Key: ara_sada_ars_000131, Target: ara, Predicted: heb
Key: deu_swissdial_ag_000042, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000013, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000110, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000047, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000113, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000114, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000115, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000084, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000148, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000085, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000117, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000149, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000086, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000118, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000150, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000055, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000151, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000088, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000120, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000089, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000121, Target: deu, Predicted: cym
Key: deu_swissdial_ag_000058, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000122, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000059, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000091, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000092, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000124, Target: deu, Predicted: ces
Key: deu_swissdial_ag_000093, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000125, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000157, Target: deu, Predicted: cym
Key: deu_swissdial_ag_000126, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000095, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000127, Target: deu, Predicted: slv
Key: deu_swissdial_ag_000159, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000064, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000096, Target: deu, Predicted: ces
Key: deu_swissdial_ag_000128, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000097, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000099, Target: deu, Predicted: cym
Key: deu_swissdial_ag_000163, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000100, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000134, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000103, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000135, Target: deu, Predicted: cym
Key: deu_swissdial_ag_000072, Target: deu, Predicted: gle
Key: deu_swissdial_be_000004, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000138, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000107, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000139, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000076, Target: deu, Predicted: afr
Key: deu_swissdial_ag_000108, Target: deu, Predicted: ltz
Key: deu_swissdial_ag_000140, Target: deu, Predicted: nld
Key: deu_swissdial_be_000008, Target: deu, Predicted: cym
Key: deu_swissdial_ag_000077, Target: deu, Predicted: nld
Key: deu_swissdial_ag_000109, Target: deu, Predicted: gle
Key: deu_swissdial_ag_000141, Target: deu, Predicted: afr
Key: deu_swissdial_be_000042, Target: deu, Predicted: isl
Key: deu_swissdial_be_000043, Target: deu, Predicted: afr
Key: deu_swissdial_be_000075, Target: deu, Predicted: afr
Key: deu_swissdial_be_000107, Target: deu, Predicted: afr
Key: deu_swissdial_be_000044, Target: deu, Predicted: afr
Key: deu_swissdial_be_000076, Target: deu, Predicted: nld
Key: deu_swissdial_be_000108, Target: deu, Predicted: ltz
Key: deu_swissdial_be_000013, Target: deu, Predicted: afr
Key: deu_swissdial_be_000110, Target: deu, Predicted: afr
Key: deu_swissdial_be_000016, Target: deu, Predicted: nld
Key: deu_swissdial_be_000048, Target: deu, Predicted: afr
Key: deu_swissdial_be_000112, Target: deu, Predicted: afr
Key: deu_swissdial_be_000049, Target: deu, Predicted: afr
Key: deu_swissdial_be_000113, Target: deu, Predicted: afr
Key: deu_swissdial_be_000018, Target: deu, Predicted: afr
Key: deu_swissdial_be_000082, Target: deu, Predicted: afr
Key: deu_swissdial_be_000114, Target: deu, Predicted: nld
Key: deu_swissdial_be_000115, Target: deu, Predicted: afr
Key: deu_swissdial_be_000084, Target: deu, Predicted: nld
Key: deu_swissdial_be_000021, Target: deu, Predicted: ltz
Key: deu_swissdial_be_000022, Target: deu, Predicted: afr
Key: deu_swissdial_be_000054, Target: deu, Predicted: afr
Key: deu_swissdial_be_000023, Target: deu, Predicted: afr
Key: deu_swissdial_be_000087, Target: deu, Predicted: afr
Key: deu_swissdial_be_000024, Target: deu, Predicted: afr
Key: deu_swissdial_be_000120, Target: deu, Predicted: afr
Key: deu_swissdial_be_000025, Target: deu, Predicted: afr
Key: deu_swissdial_be_000089, Target: deu, Predicted: nld
Key: deu_swissdial_be_000121, Target: deu, Predicted: cym
Key: deu_swissdial_be_000059, Target: deu, Predicted: afr
Key: deu_swissdial_be_000091, Target: deu, Predicted: afr
Key: deu_swissdial_be_000123, Target: deu, Predicted: afr
Key: deu_swissdial_be_000060, Target: deu, Predicted: isl
Key: deu_swissdial_be_000124, Target: deu, Predicted: nld
Key: deu_swissdial_be_000093, Target: deu, Predicted: afr
Key: deu_swissdial_be_000125, Target: deu, Predicted: nld
Key: deu_swissdial_be_000030, Target: deu, Predicted: nld
Key: deu_swissdial_be_000062, Target: deu, Predicted: afr
Key: deu_swissdial_be_000031, Target: deu, Predicted: afr
Key: deu_swissdial_be_000095, Target: deu, Predicted: afr
Key: deu_swissdial_be_000096, Target: deu, Predicted: afr
Key: deu_swissdial_be_000033, Target: deu, Predicted: afr
Key: deu_swissdial_be_000129, Target: deu, Predicted: afr
Key: deu_swissdial_be_000034, Target: deu, Predicted: afr
Key: deu_swissdial_be_000066, Target: deu, Predicted: afr
Key: deu_swissdial_be_000035, Target: deu, Predicted: afr
Key: deu_swissdial_be_000036, Target: deu, Predicted: slv
Key: deu_swissdial_be_000039, Target: deu, Predicted: afr
Key: deu_swissdial_be_000103, Target: deu, Predicted: est
Key: deu_swissdial_be_000040, Target: deu, Predicted: afr
Key: deu_swissdial_bs_000005, Target: deu, Predicted: gle
Key: deu_swissdial_be_000105, Target: deu, Predicted: afr
Key: deu_swissdial_bs_000103, Target: deu, Predicted: ltz
Key: deu_swissdial_bs_000082, Target: deu, Predicted: nld
Key: deu_swissdial_bs_000114, Target: deu, Predicted: cym
Key: deu_swissdial_bs_000088, Target: deu, Predicted: ltz
Key: deu_swissdial_bs_000093, Target: deu, Predicted: gle
Key: deu_swissdial_bs_000031, Target: deu, Predicted: cym
|
| 358 |
+
Key: deu_swissdial_bs_000036, Target: deu, Predicted: ltz
|
| 359 |
+
Key: deu_swissdial_bs_000133, Target: deu, Predicted: nld
|
| 360 |
+
Key: deu_swissdial_gr_000088, Target: deu, Predicted: nld
|
| 361 |
+
Key: deu_swissdial_bs_000139, Target: deu, Predicted: afr
|
| 362 |
+
Key: deu_swissdial_bs_000144, Target: deu, Predicted: ltz
|
| 363 |
+
Key: deu_swissdial_gr_000064, Target: deu, Predicted: slv
|
| 364 |
+
Key: deu_swissdial_gr_000040, Target: deu, Predicted: afr
|
| 365 |
+
Key: deu_swissdial_gr_000105, Target: deu, Predicted: afr
|
| 366 |
+
Key: deu_swissdial_gr_000010, Target: deu, Predicted: afr
|
| 367 |
+
Key: deu_swissdial_gr_000114, Target: deu, Predicted: slv
|
| 368 |
+
Key: deu_swissdial_gr_000116, Target: deu, Predicted: nld
|
| 369 |
+
Key: deu_swissdial_lu_000006, Target: deu, Predicted: nld
|
| 370 |
+
Key: deu_swissdial_lu_000038, Target: deu, Predicted: ltz
|
| 371 |
+
Key: deu_swissdial_lu_000007, Target: deu, Predicted: afr
|
| 372 |
+
Key: deu_swissdial_lu_000071, Target: deu, Predicted: afr
|
| 373 |
+
Key: deu_swissdial_lu_000040, Target: deu, Predicted: ltz
|
| 374 |
+
Key: deu_swissdial_lu_000072, Target: deu, Predicted: afr
|
| 375 |
+
Key: deu_swissdial_lu_000042, Target: deu, Predicted: nld
|
| 376 |
+
Key: deu_swissdial_lu_000011, Target: deu, Predicted: nno
|
| 377 |
+
Key: deu_swissdial_lu_000043, Target: deu, Predicted: ltz
|
| 378 |
+
Key: deu_swissdial_lu_000044, Target: deu, Predicted: afr
|
| 379 |
+
Key: deu_swissdial_lu_000077, Target: deu, Predicted: afr
|
| 380 |
+
Key: deu_swissdial_lu_000014, Target: deu, Predicted: yid
|
| 381 |
+
Key: deu_swissdial_lu_000047, Target: deu, Predicted: ltz
|
| 382 |
+
Key: deu_swissdial_lu_000079, Target: deu, Predicted: afr
|
| 383 |
+
Key: deu_swissdial_lu_000016, Target: deu, Predicted: nld
|
| 384 |
+
Key: deu_swissdial_lu_000017, Target: deu, Predicted: nld
|
| 385 |
+
Key: deu_swissdial_lu_000019, Target: deu, Predicted: nld
|
| 386 |
+
Key: deu_swissdial_lu_000051, Target: deu, Predicted: cym
|
| 387 |
+
Key: deu_swissdial_lu_000052, Target: deu, Predicted: nld
|
| 388 |
+
Key: deu_swissdial_lu_000053, Target: deu, Predicted: afr
|
| 389 |
+
Key: deu_swissdial_lu_000085, Target: deu, Predicted: nld
|
| 390 |
+
Key: deu_swissdial_lu_000054, Target: deu, Predicted: afr
|
| 391 |
+
Key: deu_swissdial_lu_000023, Target: deu, Predicted: afr
|
| 392 |
+
Key: deu_swissdial_lu_000055, Target: deu, Predicted: nld
|
| 393 |
+
Key: deu_swissdial_lu_000087, Target: deu, Predicted: nld
|
| 394 |
+
Key: deu_swissdial_lu_000024, Target: deu, Predicted: ltz
|
| 395 |
+
Key: deu_swissdial_lu_000056, Target: deu, Predicted: afr
|
| 396 |
+
Key: deu_swissdial_lu_000026, Target: deu, Predicted: nld
|
| 397 |
+
Key: deu_swissdial_lu_000058, Target: deu, Predicted: nld
|
| 398 |
+
Key: deu_swissdial_lu_000090, Target: deu, Predicted: nld
|
| 399 |
+
Key: deu_swissdial_lu_000027, Target: deu, Predicted: ltz
|
| 400 |
+
Key: deu_swissdial_lu_000059, Target: deu, Predicted: ltz
|
| 401 |
+
Key: deu_swissdial_lu_000092, Target: deu, Predicted: glv
|
| 402 |
+
Key: deu_swissdial_lu_000029, Target: deu, Predicted: gle
|
| 403 |
+
Key: deu_swissdial_lu_000093, Target: deu, Predicted: cym
|
| 404 |
+
Key: deu_swissdial_lu_000031, Target: deu, Predicted: cym
|
| 405 |
+
Key: deu_swissdial_lu_000095, Target: deu, Predicted: nld
|
| 406 |
+
Key: deu_swissdial_lu_000096, Target: deu, Predicted: afr
|
| 407 |
+
Key: deu_swissdial_lu_000033, Target: deu, Predicted: nld
|
| 408 |
+
Key: deu_swissdial_lu_000065, Target: deu, Predicted: hun
|
| 409 |
+
Key: deu_swissdial_lu_000097, Target: deu, Predicted: nld
|
| 410 |
+
Key: deu_swissdial_lu_000002, Target: deu, Predicted: ltz
|
| 411 |
+
Key: deu_swissdial_lu_000034, Target: deu, Predicted: nld
|
| 412 |
+
Key: deu_swissdial_lu_000066, Target: deu, Predicted: afr
|
| 413 |
+
Key: deu_swissdial_lu_000067, Target: deu, Predicted: afr
|
| 414 |
+
Key: deu_swissdial_lu_000099, Target: deu, Predicted: ltz
|
| 415 |
+
Key: deu_swissdial_lu_000004, Target: deu, Predicted: ltz
|
| 416 |
+
Key: deu_swissdial_lu_000068, Target: deu, Predicted: afr
|
| 417 |
+
Key: deu_swissdial_lu_000069, Target: deu, Predicted: nld
|
| 418 |
+
Key: deu_swissdial_lu_000102, Target: deu, Predicted: nld
|
| 419 |
+
Key: deu_swissdial_lu_000134, Target: deu, Predicted: afr
|
| 420 |
+
Key: deu_swissdial_lu_000166, Target: deu, Predicted: nld
|
| 421 |
+
Key: deu_swissdial_lu_000103, Target: deu, Predicted: nld
|
| 422 |
+
Key: deu_swissdial_lu_000135, Target: deu, Predicted: afr
|
| 423 |
+
Key: deu_swissdial_lu_000104, Target: deu, Predicted: nld
|
| 424 |
+
Key: deu_swissdial_lu_000136, Target: deu, Predicted: nld
|
| 425 |
+
Key: deu_swissdial_lu_000168, Target: deu, Predicted: nld
|
| 426 |
+
Key: deu_swissdial_lu_000137, Target: deu, Predicted: nld
|
| 427 |
+
Key: deu_swissdial_lu_000106, Target: deu, Predicted: afr
|
| 428 |
+
Key: deu_swissdial_lu_000170, Target: deu, Predicted: nld
|
| 429 |
+
Key: deu_swissdial_lu_000108, Target: deu, Predicted: afr
|
| 430 |
+
Key: deu_swissdial_lu_000140, Target: deu, Predicted: gle
|
| 431 |
+
Key: deu_swissdial_lu_000112, Target: deu, Predicted: yid
|
| 432 |
+
Key: deu_swissdial_lu_000144, Target: deu, Predicted: nld
|
| 433 |
+
Key: deu_swissdial_lu_000113, Target: deu, Predicted: nld
|
| 434 |
+
Key: deu_swissdial_lu_000116, Target: deu, Predicted: ltz
|
| 435 |
+
Key: deu_swissdial_lu_000150, Target: deu, Predicted: afr
|
| 436 |
+
Key: deu_swissdial_lu_000151, Target: deu, Predicted: nld
|
| 437 |
+
Key: deu_swissdial_sg_000010, Target: deu, Predicted: nld
|
| 438 |
+
Key: deu_swissdial_lu_000121, Target: deu, Predicted: afr
|
| 439 |
+
Key: deu_swissdial_lu_000123, Target: deu, Predicted: nld
|
| 440 |
+
Key: deu_swissdial_lu_000155, Target: deu, Predicted: nld
|
| 441 |
+
Key: deu_swissdial_lu_000124, Target: deu, Predicted: nld
|
| 442 |
+
Key: deu_swissdial_lu_000157, Target: deu, Predicted: afr
|
| 443 |
+
Key: deu_swissdial_lu_000126, Target: deu, Predicted: afr
|
| 444 |
+
Key: deu_swissdial_lu_000158, Target: deu, Predicted: ltz
|
| 445 |
+
Key: deu_swissdial_lu_000159, Target: deu, Predicted: nld
|
| 446 |
+
Key: deu_swissdial_lu_000130, Target: deu, Predicted: nld
|
| 447 |
+
Key: deu_swissdial_lu_000162, Target: deu, Predicted: nld
|
| 448 |
+
Key: deu_swissdial_lu_000163, Target: deu, Predicted: nld
|
| 449 |
+
Key: deu_swissdial_sg_000021, Target: deu, Predicted: gle
|
| 450 |
+
Key: deu_swissdial_lu_000132, Target: deu, Predicted: nld
|
| 451 |
+
Key: deu_swissdial_lu_000164, Target: deu, Predicted: ltz
|
| 452 |
+
Key: deu_swissdial_sg_000022, Target: deu, Predicted: gle
|
| 453 |
+
Key: deu_swissdial_lu_000133, Target: deu, Predicted: nld
|
| 454 |
+
Key: deu_swissdial_vs_000006, Target: deu, Predicted: afr
|
| 455 |
+
Key: deu_swissdial_vs_000038, Target: deu, Predicted: cym
|
| 456 |
+
Key: deu_swissdial_vs_000042, Target: deu, Predicted: nld
|
| 457 |
+
Key: deu_swissdial_vs_000045, Target: deu, Predicted: nld
|
| 458 |
+
Key: deu_swissdial_vs_000047, Target: deu, Predicted: slv
|
| 459 |
+
Key: deu_swissdial_vs_000016, Target: deu, Predicted: nld
|
| 460 |
+
Key: deu_swissdial_vs_000048, Target: deu, Predicted: afr
|
| 461 |
+
Key: deu_swissdial_sg_000067, Target: deu, Predicted: cym
|
| 462 |
+
Key: deu_swissdial_vs_000053, Target: deu, Predicted: gle
|
| 463 |
+
Key: deu_swissdial_sg_000104, Target: deu, Predicted: est
|
| 464 |
+
Key: deu_swissdial_vs_000059, Target: deu, Predicted: nld
|
| 465 |
+
Key: deu_swissdial_sg_000078, Target: deu, Predicted: nld
|
| 466 |
+
Key: deu_swissdial_vs_000060, Target: deu, Predicted: afr
|
| 467 |
+
Key: deu_swissdial_vs_000030, Target: deu, Predicted: nld
|
| 468 |
+
Key: deu_swissdial_vs_000032, Target: deu, Predicted: afr
|
| 469 |
+
Key: deu_swissdial_vs_000034, Target: deu, Predicted: gle
|
| 470 |
+
Key: deu_swissdial_vs_000003, Target: deu, Predicted: afr
|
| 471 |
+
Key: deu_swissdial_vs_000004, Target: deu, Predicted: afr
|
| 472 |
+
Key: deu_swissdial_vs_000005, Target: deu, Predicted: cym
|
| 473 |
+
Key: deu_swissdial_zh_000026, Target: deu, Predicted: gle
|
| 474 |
+
Key: deu_swissdial_vs_000136, Target: deu, Predicted: afr
|
| 475 |
+
Key: deu_swissdial_vs_000073, Target: deu, Predicted: nld
|
| 476 |
+
Key: deu_swissdial_vs_000137, Target: deu, Predicted: afr
|
| 477 |
+
Key: deu_swissdial_zh_000029, Target: deu, Predicted: gle
|
| 478 |
+
Key: deu_swissdial_zh_000031, Target: deu, Predicted: yid
|
| 479 |
+
Key: deu_swissdial_vs_000076, Target: deu, Predicted: afr
|
| 480 |
+
Key: deu_swissdial_zh_000000, Target: deu, Predicted: nld
|
| 481 |
+
Key: deu_swissdial_zh_000032, Target: deu, Predicted: nld
|
| 482 |
+
Key: deu_swissdial_vs_000078, Target: deu, Predicted: gle
|
| 483 |
+
Key: deu_swissdial_vs_000082, Target: deu, Predicted: dan
|
| 484 |
+
Key: deu_swissdial_vs_000114, Target: deu, Predicted: afr
|
| 485 |
+
Key: deu_swissdial_zh_000006, Target: deu, Predicted: ltz
|
| 486 |
+
Key: deu_swissdial_vs_000083, Target: deu, Predicted: afr
|
| 487 |
+
Key: deu_swissdial_zh_000007, Target: deu, Predicted: afr
|
| 488 |
+
Key: deu_swissdial_zh_000008, Target: deu, Predicted: ltz
|
| 489 |
+
Key: deu_swissdial_zh_000011, Target: deu, Predicted: cym
|
| 490 |
+
Key: deu_swissdial_vs_000088, Target: deu, Predicted: afr
|
| 491 |
+
Key: deu_swissdial_vs_000120, Target: deu, Predicted: afr
|
| 492 |
+
Key: deu_swissdial_zh_000045, Target: deu, Predicted: est
|
| 493 |
+
Key: deu_swissdial_vs_000090, Target: deu, Predicted: afr
|
| 494 |
+
Key: deu_swissdial_vs_000124, Target: deu, Predicted: nld
|
| 495 |
+
Key: deu_swissdial_zh_000048, Target: deu, Predicted: afr
|
| 496 |
+
Key: deu_swissdial_zh_000017, Target: deu, Predicted: afr
|
| 497 |
+
Key: deu_swissdial_vs_000127, Target: deu, Predicted: afr
|
| 498 |
+
Key: deu_swissdial_zh_000020, Target: deu, Predicted: gle
|
| 499 |
+
Key: deu_swissdial_vs_000129, Target: deu, Predicted: gle
|
| 500 |
+
Key: deu_swissdial_zh_000025, Target: deu, Predicted: ltz
|
| 501 |
+
Key: deu_swissdial_zh_000122, Target: deu, Predicted: afr
|
| 502 |
+
Key: deu_swissdial_zh_000059, Target: deu, Predicted: nld
|
| 503 |
+
Key: deu_swissdial_zh_000091, Target: deu, Predicted: ltz
|
| 504 |
+
| 505 | ell_cretan_cre_000013 | ell | ukr |
| 506 | deu_swissdial_zh_000060 | deu | afr |
| 507 | deu_swissdial_zh_000124 | deu | afr |
| 508 | ell_cretan_cre_000014 | ell | pus |
| 509 | deu_swissdial_zh_000094 | deu | afr |
| 510 | deu_swissdial_zh_000127 | deu | nld |
| 511 | ell_cretan_cre_000017 | ell | swa |
| 512 | deu_swissdial_zh_000129 | deu | nld |
| 513 | deu_swissdial_zh_000066 | deu | nld |
| 514 | deu_swissdial_zh_000098 | deu | cym |
| 515 | ell_cretan_cre_000020 | ell | sqi |
| 516 | deu_swissdial_zh_000133 | deu | nld |
| 517 | deu_swissdial_zh_000134 | deu | gle |
| 518 | deu_swissdial_zh_000103 | deu | afr |
| 519 | deu_swissdial_zh_000072 | deu | nld |
| 520 | deu_swissdial_zh_000105 | deu | afr |
| 521 | deu_swissdial_zh_000141 | deu | gle |
| 522 | deu_swissdial_zh_000078 | deu | afr |
| 523 | deu_swissdial_zh_000080 | deu | gle |
| 524 | deu_swissdial_zh_000112 | deu | ltz |
| 525 | ell_cretan_cre_000002 | ell | swa |
| 526 | ell_cretan_cre_000034 | ell | bel |
| 527 | deu_swissdial_zh_000081 | deu | afr |
| 528 | ell_cretan_cre_000003 | ell | mlg |
| 529 | ell_cretan_cre_000035 | ell | hrv |
| 530 | deu_swissdial_zh_000082 | deu | nld |
| 531 | ell_cretan_cre_000004 | ell | mkd |
| 532 | ell_cretan_cre_000036 | ell | rus |
| 533 | ell_cretan_cre_000037 | ell | bel |
| 534 | deu_swissdial_zh_000117 | deu | afr |
| 535 | ell_cretan_cre_000007 | ell | srp |
| 536 | ell_cretan_cre_000039 | ell | tam |
| 537 | ell_cretan_cre_000040 | ell | sqi |
| 538 | ell_cretan_cre_000041 | ell | hrv |
| 539 | deu_swissdial_zh_000120 | deu | afr |
| 540 | ell_cretan_cre_000011 | ell | rus |
| 541 | ell_cretan_cre_000076 | ell | mkd |
| 542 | ell_cretan_cre_000108 | ell | ron |
| 543 | ell_cretan_cre_000140 | ell | hrv |
| 544 | ell_cretan_cre_000109 | ell | sqi |
| 545 | ell_cretan_cre_000110 | ell | bul |
| 546 | ell_cretan_cre_000048 | ell | rus |
| 547 | ell_cretan_cre_000112 | ell | lav |
| 548 | ell_cretan_cre_000081 | ell | slv |
| 549 | ell_cretan_cre_000050 | ell | por |
| 550 | ell_cretan_cre_000115 | ell | ita |
| 551 | ell_cretan_cre_000147 | ell | swa |
| 552 | ell_cretan_cre_000052 | ell | azz |
| 553 | ell_cretan_cre_000084 | ell | rus |
| 554 | ell_cretan_cre_000053 | ell | luo |
| 555 | ell_cretan_cre_000117 | ell | bel |
| 556 | ell_cretan_cre_000149 | ell | ita |
| 557 | ell_cretan_cre_000054 | ell | ita |
| 558 | ell_cretan_cre_000150 | ell | sqi |
| 559 | ell_cretan_cre_000055 | ell | ron |
| 560 | ell_cretan_cre_000088 | ell | srp |
| 561 | ell_cretan_cre_000120 | ell | por |
| 562 | ell_cretan_cre_000153 | ell | mkd |
| 563 | ell_cretan_cre_000090 | ell | ita |
| 564 | ell_cretan_cre_000059 | ell | mlg |
| 565 | ell_cretan_cre_000091 | ell | swa |
| 566 | ell_cretan_cre_000155 | ell | hrv |
| 567 | ell_cretan_cre_000092 | ell | ukr |
| 568 | ell_cretan_cre_000156 | ell | lao |
| 569 | ell_cretan_cre_000157 | ell | ita |
| 570 | ell_cretan_cre_000094 | ell | grn |
| 571 | ell_cretan_cre_000126 | ell | azz |
| 572 | ell_cretan_cre_000064 | ell | swa |
| 573 | ell_cretan_cre_000128 | ell | ita |
| 574 | ell_cretan_cre_000160 | ell | hrv |
| 575 | ell_cretan_cre_000065 | ell | hrv |
| 576 | ell_cretan_cre_000097 | ell | rus |
| 577 | ell_cretan_cre_000161 | ell | sqi |
| 578 | ell_cretan_cre_000066 | ell | ukr |
| 579 | ell_cretan_cre_000098 | ell | pus |
| 580 | ell_cretan_cre_000067 | ell | pus |
| 581 | ell_cretan_cre_000099 | ell | xty |
| 582 | ell_cretan_cre_000163 | ell | swa |
| 583 | ell_cretan_cre_000101 | ell | grn |
| 584 | ell_cretan_cre_000071 | ell | guj |
| 585 | ell_cretan_cre_000103 | ell | ina |
| 586 | ell_cretan_cre_000167 | ell | mkd |
| 587 | ell_cretan_cre_000168 | ell | ita |
| 588 | ell_cretan_cre_000073 | ell | sqi |
| 589 | ell_cretan_cre_000170 | ell | sot |
| 590 | ell_cretan_cre_000107 | ell | ukr |
| 591 | ell_cretan_cre_000139 | ell | rus |
| 592 | ell_cretan_cre_000171 | ell | hrv |
| 593 | ell_cretan_cre_000172 | ell | mkd |
| 594 | ell_cretan_cre_000268 | ell | pol |
| 595 | ell_cretan_cre_000237 | ell | sqi |
| 596 | ell_cretan_cre_000174 | ell | bel |
| 597 | ell_cretan_cre_000206 | ell | aze |
| 598 | ell_cretan_cre_000270 | ell | ron |
| 599 | ell_cretan_cre_000239 | ell | ukr |
| 600 | ell_cretan_cre_000240 | ell | ron |
| 601 | ell_cretan_cre_000177 | ell | lit |
| 602 | ell_cretan_cre_000209 | ell | azz |
| 603 | ell_cretan_cre_000241 | ell | lit |
| 604 | ell_cretan_cre_000242 | ell | abk |
| 605 | ell_cretan_cre_000274 | ell | slv |
| 606 | ell_cretan_cre_000275 | ell | ukr |
| 607 | ell_cretan_cre_000181 | ell | azz |
| 608 | ell_cretan_cre_000245 | ell | por |
| 609 | ell_cretan_cre_000248 | ell | tuk |
| 610 | ell_cretan_cre_000185 | ell | guj |
| 611 | ell_cretan_cre_000249 | ell | guj |
| 612 | ell_cretan_cre_000250 | ell | ron |
| 613 | ell_cretan_cre_000282 | ell | ron |
| 614 | ell_cretan_cre_000187 | ell | por |
| 615 | ell_cretan_cre_000219 | ell | nep |
| 616 | ell_cretan_cre_000251 | ell | sqi |
| 617 | ell_cretan_cre_000252 | ell | pol |
| 618 | ell_cretan_cre_000189 | ell | rus |
| 619 | ell_cretan_cre_000285 | ell | bel |
| 620 | ell_cretan_cre_000286 | ell | grn |
| 621 | ell_cretan_cre_000288 | ell | mlg |
| 622 | ell_cretan_cre_000225 | ell | sot |
| 623 | ell_cretan_cre_000226 | ell | ita |
| 624 | ell_cretan_cre_000258 | ell | lit |
| 625 | ell_cretan_cre_000195 | ell | por |
| 626 | ell_cretan_cre_000227 | ell | xho |
| 627 | ell_cretan_cre_000259 | ell | swa |
| 628 | ell_cretan_cre_000228 | ell | bel |
| 629 | ell_cretan_cre_000197 | ell | por |
| 630 | ell_cretan_cre_000261 | ell | ben |
| 631 | ell_cretan_cre_000198 | ell | hrv |
| 632 | ell_cretan_cre_000230 | ell | sna |
| 633 | ell_messenian_mes_000005 | ell | heb |
| 634 | ell_cretan_cre_000200 | ell | swa |
| 635 | ell_cretan_cre_000232 | ell | hrv |
| 636 | ell_cretan_cre_000264 | ell | sun |
| 637 | ell_cretan_cre_000265 | ell | ces |
| 638 | ell_cretan_cre_000202 | ell | mkd |
| 639 | ell_cretan_cre_000266 | ell | ukr |
| 640 | ell_cretan_cre_000203 | ell | mkd |
| 641 | ell_cretan_cre_000235 | ell | mkd |
| 642 | ell_cretan_cre_000267 | ell | azz |
| 643 | ell_messenian_mes_000009 | ell | cym |
| 644 | ell_messenian_mes_000011 | ell | cat |
| 645 | ell_messenian_mes_000043 | ell | hrv |
| 646 | ell_messenian_mes_000077 | ell | ces |
| 647 | ell_messenian_mes_000079 | ell | cym |
| 648 | ell_messenian_mes_000112 | ell | mkd |
| 649 | ell_messenian_mes_000085 | ell | cym |
| 650 | ell_messenian_mes_000087 | ell | heb |
| 651 | ell_messenian_mes_000056 | ell | nno |
| 652 | ell_messenian_mes_000089 | ell | sqi |
| 653 | ell_messenian_mes_000062 | ell | cym |
| 654 | ell_messenian_mes_000099 | ell | bul |
| 655 | ell_messenian_mes_000136 | ell | cym |
| 656 | ell_messenian_mes_000139 | ell | hrv |
| 657 | ell_messenian_mes_000141 | ell | mkd |
| 658 | ell_messenian_mes_000143 | ell | slv |
| 659 | eng_globe_aus_000018 | eng | sqi |
| 660 | ell_messenian_mes_000155 | ell | nno |
| 661 | ell_messenian_mes_000156 | ell | heb |
| 662 | eng_globe_aus_000000 | eng | gle |
| 663 | ell_messenian_mes_000161 | ell | ces |
| 664 | ell_messenian_mes_000164 | ell | cym |
| 665 | eng_globe_aus_000143 | eng | tam |
| 666 | eng_globe_aus_000082 | eng | sqi |
| 667 | eng_globe_aus_000118 | eng | tgl |
| 668 | eng_globe_bre_000034 | eng | hun |
| 669 | eng_globe_bre_000100 | eng | cym |
| 670 | eng_globe_bre_000133 | eng | nor |
| 671 | eng_globe_bre_000116 | eng | gle |
| 672 | eng_globe_bre_000124 | eng | azz |
| 673 | eng_globe_bre_000130 | eng | cym |
| 674 | eng_globe_bre_000099 | eng | deu |
| 675 | eng_globe_can_000087 | eng | deu |
| 676 | eng_globe_can_000063 | eng | kor |
| 677 | eng_globe_can_000098 | eng | glv |
| 678 | eng_globe_fil_000016 | eng | ces |
| 679 | eng_globe_fil_000000 | eng | gle |
| 680 | eng_globe_fil_000070 | eng | nld |
| 681 | eng_globe_fil_000008 | eng | glv |
| 682 | eng_globe_gle_000007 | eng | cym |
| 683 | eng_globe_fil_000146 | eng | tgl |
| 684 | eng_globe_gle_000030 | eng | snd |
| 685 | eng_globe_gle_000069 | eng | gle |
| 686 | eng_globe_gle_000104 | eng | gle |
| 687 | eng_globe_gle_000137 | eng | gle |
| 688 | eng_globe_gle_000055 | eng | gle |
| 689 | eng_globe_gle_000087 | eng | cym |
| 690 | eng_globe_gle_000154 | eng | tel |
| 691 | eng_globe_gle_000126 | eng | gle |
| 692 | eng_globe_gle_000167 | eng | hin |
| 693 | eng_globe_nze_000066 | eng | mri |
| 694 | eng_globe_nze_000108 | eng | urd |
| 695 | eng_globe_sae_000008 | eng | ben |
| 696 | eng_globe_sae_000011 | eng | tam |
| 697 | eng_globe_sae_000015 | eng | urd |
| 698 | eng_globe_sae_000062 | eng | ben |
| 699 | eng_globe_sae_000063 | eng | xho |
| 700 | eng_globe_sae_000033 | eng | tel |
| 701 | eng_globe_sae_000067 | eng | tgl |
| 702 | eng_globe_sae_000168 | eng | cym |
| 703 | eng_globe_sae_000169 | eng | tam |
| 704 | eng_globe_sae_000140 | eng | nep |
| 705 | eng_globe_sae_000109 | eng | glv |
| 706 | eng_globe_sae_000143 | eng | gle |
| 707 | eng_globe_sae_000115 | eng | tel |
| 708 | eng_globe_sco_000009 | eng | cym |
| 709 | eng_globe_sae_000160 | eng | ltz |
| 710 | eng_globe_sae_000161 | eng | tam |
| 711 | eng_globe_sae_000165 | eng | deu |
| 712 | eng_globe_sco_000093 | eng | glv |
| 713 | eng_globe_sco_000062 | eng | cym |
| 714 | eng_globe_sco_000140 | eng | nld |
| 715 | eng_globe_sco_000078 | eng | cym |
| 716 | eng_globe_sco_000110 | eng | gle |
| 717 | eng_globe_sco_000047 | eng | cym |
| 718 | eng_globe_sco_000144 | eng | gle |
| 719 | eng_globe_sco_000053 | eng | gle |
| 720 | eng_globe_use_000098 | eng | nep |
| 721 | eng_globe_use_000038 | eng | mri |
| 722 | eng_globe_use_000102 | eng | deu |
| 723 | eng_globe_use_000044 | eng | ltz |
| 724 | eng_globe_use_000076 | eng | sot |
| 725 | eng_globe_use_000113 | eng | glv |
| 726 | eng_globe_use_000115 | eng | oci |
| 727 | eng_globe_use_000020 | eng | cym |
| 728 | eng_globe_use_000023 | eng | msa |
| 729 | eng_globe_use_000090 | eng | cym |
| 730 | eng_globe_use_000059 | eng | deu |
| 731 | eng_l2arctic_ara_000029 | eng | sqi |
| 732 | eng_globe_use_000176 | eng | cym |
| 733 | eng_l2arctic_ara_000002 | eng | ara |
| 734 | eng_l2arctic_ara_000003 | eng | ara |
| 735 | eng_l2arctic_ara_000069 | eng | ara |
| 736 | eng_l2arctic_ara_000038 | eng | ara |
| 737 | eng_l2arctic_ara_000070 | eng | fao |
| 738 | eng_l2arctic_ara_000040 | eng | glv |
| 739 | eng_l2arctic_ara_000077 | eng | ces |
| 740 | eng_l2arctic_ara_000046 | eng | ara |
| 741 | eng_l2arctic_ara_000079 | eng | fra |
| 742 | eng_l2arctic_ara_000080 | eng | nld |
| 743 | eng_l2arctic_ara_000146 | eng | pus |
| 744 | eng_l2arctic_cmn_000011 | eng | cmn |
| 745 | eng_l2arctic_ara_000122 | eng | hye |
| 746 | eng_l2arctic_ara_000123 | eng | kat |
| 747 | eng_l2arctic_ara_000127 | eng | ara |
| 748 | eng_l2arctic_ara_000128 | eng | som |
| 749 | eng_l2arctic_ara_000160 | eng | deu |
| 750 | eng_l2arctic_ara_000129 | eng | ara |
| 751 | eng_l2arctic_ara_000130 | eng | tgk |
| 752 | eng_l2arctic_ara_000164 | eng | ara |
| 753 | eng_l2arctic_ara_000133 | eng | nld |
| 754 | eng_l2arctic_ara_000135 | eng | hun |
| 755 | eng_l2arctic_ara_000169 | eng | ara |
| 756 | eng_l2arctic_ara_000108 | eng | ara |
| 757 | eng_l2arctic_ara_000140 | eng | ara |
| 758 | eng_l2arctic_ara_000109 | eng | ckb |
| 759 | eng_l2arctic_cmn_000135 | eng | cmn |
| 760 | eng_l2arctic_cmn_000138 | eng | cmn |
| 761 | eng_l2arctic_cmn_000107 | eng | cmn |
| 762 | eng_l2arctic_cmn_000139 | eng | cmn |
| 763 | eng_l2arctic_cmn_000108 | eng | cmn |
| 764 | eng_l2arctic_cmn_000141 | eng | lao |
| 765 | eng_l2arctic_cmn_000078 | eng | cmn |
| 766 | eng_l2arctic_cmn_000047 | eng | ron |
| 767 | eng_l2arctic_cmn_000113 | eng | cmn |
| 768 | eng_l2arctic_hin_000001 | eng | hin |
| 769 | eng_l2arctic_cmn_000118 | eng | por |
| 770 | eng_l2arctic_cmn_000125 | eng | mya |
| 771 | eng_l2arctic_hin_000010 | eng | hin |
| 772 | eng_l2arctic_cmn_000094 | eng | hun |
| 773 | eng_l2arctic_cmn_000126 | eng | cmn |
| 774 | eng_l2arctic_cmn_000097 | eng | bod |
| 775 | eng_l2arctic_cmn_000100 | eng | cmn |
| 776 | eng_l2arctic_cmn_000133 | eng | fas |
| 777 | eng_l2arctic_hin_000119 | eng | tam |
| 778 | eng_l2arctic_hin_000025 | eng | urd |
| 779 | eng_l2arctic_hin_000122 | eng | pus |
| 780 | eng_l2arctic_hin_000123 | eng | kan |
| 781 | eng_l2arctic_hin_000061 | eng | mar |
| 782 | eng_l2arctic_hin_000126 | eng | ben |
| 783 | eng_l2arctic_hin_000127 | eng | tam |
| 784 | eng_l2arctic_hin_000096 | eng | hin |
| 785 | eng_l2arctic_hin_000128 | eng | tam |
| 786 | eng_l2arctic_hin_000131 | eng | tel |
| 787 | eng_l2arctic_hin_000132 | eng | tam |
| 788 | eng_l2arctic_hin_000069 | eng | deu |
| 789 | eng_l2arctic_hin_000101 | eng | hin |
| 790 | eng_l2arctic_hin_000133 | eng | kan |
| 791 | eng_l2arctic_hin_000102 | eng | pan |
| 792 | eng_l2arctic_hin_000104 | eng | tel |
| 793 | eng_l2arctic_hin_000106 | eng | mal |
| 794 | eng_l2arctic_hin_000108 | eng | tam |
| 795 | eng_l2arctic_hin_000140 | eng | tam |
| 796 | eng_l2arctic_hin_000141 | eng | mar |
| 797 | eng_l2arctic_hin_000110 | eng | tam |
| 798 | eng_l2arctic_hin_000144 | eng | tam |
| 799 | eng_l2arctic_hin_000147 | eng | tam |
| 800 | eng_l2arctic_hin_000149 | eng | guj |
| 801 | eng_l2arctic_hin_000152 | eng | tam |
| 802 | eng_l2arctic_hin_000154 | eng | tel |
| 803 | eng_l2arctic_hin_000188 | eng | tel |
| 804 | eng_l2arctic_hin_000157 | eng | mal |
| 805 | eng_l2arctic_hin_000159 | eng | tam |
| 806 | eng_l2arctic_hin_000191 | eng | ben |
| 807 | eng_l2arctic_hin_000193 | eng | tam |
| 808 | eng_l2arctic_hin_000163 | eng | tam |
| 809 | eng_l2arctic_hin_000195 | eng | tam |
| 810 | eng_l2arctic_hin_000164 | eng | nep |
| 811 | eng_l2arctic_hin_000196 | eng | guj |
| 812 | eng_l2arctic_hin_000166 | eng | slv |
| 813 | eng_l2arctic_kor_000022 | eng | ltz |
| 814 | eng_l2arctic_hin_000167 | eng | tam |
| 815 | eng_l2arctic_hin_000199 | eng | urd |
| 816 | eng_l2arctic_hin_000169 | eng | cym |
| 817 | eng_l2arctic_hin_000203 | eng | pan |
| 818 | eng_l2arctic_hin_000205 | eng | tel |
| 819 | eng_l2arctic_hin_000175 | eng | pan |
| 820 | eng_l2arctic_kor_000069 | eng | dan |
| 821 | eng_l2arctic_kor_000165 | eng | slv |
|
| 821 |
+
Key: eng_l2arctic_kor_000142, Target: eng, Predicted: xho
|
| 822 |
+
Key: eng_l2arctic_kor_000118, Target: eng, Predicted: bod
|
| 823 |
+
Key: eng_l2arctic_kor_000157, Target: eng, Predicted: hun
|
| 824 |
+
Key: eng_l2arctic_spa_000014, Target: eng, Predicted: heb
|
| 825 |
+
Key: eng_l2arctic_spa_000067, Target: eng, Predicted: ita
|
| 826 |
+
Key: eng_l2arctic_spa_000036, Target: eng, Predicted: azz
|
| 827 |
+
Key: eng_l2arctic_spa_000102, Target: eng, Predicted: ron
|
| 828 |
+
Key: eng_l2arctic_spa_000149, Target: eng, Predicted: fin
|
| 829 |
+
Key: eng_l2arctic_vie_000034, Target: eng, Predicted: deu
|
| 830 |
+
Key: eng_l2arctic_vie_000004, Target: eng, Predicted: tgl
|
| 831 |
+
Key: eng_l2arctic_vie_000069, Target: eng, Predicted: mri
|
| 832 |
+
Key: eng_l2arctic_vie_000106, Target: eng, Predicted: lao
|
| 833 |
+
Key: eng_l2arctic_vie_000044, Target: eng, Predicted: xho
|
| 834 |
+
Key: eng_l2arctic_vie_000109, Target: eng, Predicted: cym
|
| 835 |
+
Key: eng_l2arctic_vie_000047, Target: eng, Predicted: xho
|
| 836 |
+
Key: eng_l2arctic_vie_000016, Target: eng, Predicted: lat
|
| 837 |
+
Key: eng_l2arctic_vie_000022, Target: eng, Predicted: pus
|
| 838 |
+
Key: eng_openslr83_nor_000017, Target: eng, Predicted: gle
|
| 839 |
+
Key: eng_openslr83_mid_000076, Target: eng, Predicted: glv
|
| 840 |
+
Key: eng_openslr83_mid_000078, Target: eng, Predicted: cym
|
| 841 |
+
Key: eng_openslr83_nor_000031, Target: eng, Predicted: cym
|
| 842 |
+
Key: eng_openslr83_nor_000069, Target: eng, Predicted: cym
|
| 843 |
+
Key: eng_openslr83_nor_000070, Target: eng, Predicted: cym
|
| 844 |
+
Key: eng_openslr83_nor_000074, Target: eng, Predicted: cym
|
| 845 |
+
Key: eng_openslr83_nor_000050, Target: eng, Predicted: cym
|
| 846 |
+
Key: eng_openslr83_nor_000083, Target: eng, Predicted: gle
|
| 847 |
+
Key: eng_openslr83_nor_000092, Target: eng, Predicted: cym
|
| 848 |
+
Key: eng_openslr83_sou_000065, Target: eng, Predicted: cym
|
| 849 |
+
Key: eng_openslr83_sco_000086, Target: eng, Predicted: gle
|
| 850 |
+
Key: eng_openslr83_sco_000091, Target: eng, Predicted: gle
|
| 851 |
+
Key: eng_openslr83_wel_000002, Target: eng, Predicted: cym
|
| 852 |
+
Key: eng_openslr83_wel_000003, Target: eng, Predicted: cym
|
| 853 |
+
Key: eng_openslr83_wel_000035, Target: eng, Predicted: cym
|
| 854 |
+
Key: eng_openslr83_wel_000067, Target: eng, Predicted: cym
|
| 855 |
+
Key: eng_openslr83_wel_000068, Target: eng, Predicted: cym
|
| 856 |
+
Key: eng_openslr83_wel_000037, Target: eng, Predicted: cym
|
| 857 |
+
Key: eng_openslr83_wel_000069, Target: eng, Predicted: cym
|
| 858 |
+
Key: eng_openslr83_wel_000006, Target: eng, Predicted: cym
|
| 859 |
+
Key: eng_openslr83_wel_000038, Target: eng, Predicted: cym
|
| 860 |
+
Key: eng_openslr83_wel_000040, Target: eng, Predicted: dan
|
| 861 |
+
Key: eng_openslr83_wel_000072, Target: eng, Predicted: cym
|
| 862 |
+
Key: eng_openslr83_wel_000009, Target: eng, Predicted: cym
|
| 863 |
+
Key: eng_openslr83_wel_000073, Target: eng, Predicted: gle
|
| 864 |
+
Key: eng_openslr83_wel_000010, Target: eng, Predicted: cym
|
| 865 |
+
Key: eng_openslr83_wel_000042, Target: eng, Predicted: cym
|
| 866 |
+
Key: eng_openslr83_wel_000074, Target: eng, Predicted: cym
|
| 867 |
+
Key: eng_openslr83_wel_000011, Target: eng, Predicted: cym
|
| 868 |
+
Key: eng_openslr83_wel_000075, Target: eng, Predicted: cym
|
| 869 |
+
Key: eng_openslr83_wel_000012, Target: eng, Predicted: cym
|
| 870 |
+
Key: eng_openslr83_wel_000076, Target: eng, Predicted: cym
|
| 871 |
+
Key: eng_openslr83_wel_000013, Target: eng, Predicted: cym
|
| 872 |
+
Key: eng_openslr83_wel_000077, Target: eng, Predicted: cym
|
| 873 |
+
Key: eng_openslr83_wel_000014, Target: eng, Predicted: cym
|
| 874 |
+
Key: eng_openslr83_wel_000046, Target: eng, Predicted: cym
|
| 875 |
+
Key: eng_openslr83_wel_000015, Target: eng, Predicted: cym
|
| 876 |
+
Key: eng_openslr83_wel_000079, Target: eng, Predicted: cym
|
| 877 |
+
Key: eng_openslr83_wel_000080, Target: eng, Predicted: cym
|
| 878 |
+
Key: eng_openslr83_wel_000081, Target: eng, Predicted: cym
|
| 879 |
+
Key: eng_openslr83_wel_000082, Target: eng, Predicted: cym
|
| 880 |
+
Key: eng_openslr83_wel_000052, Target: eng, Predicted: cym
|
| 881 |
+
Key: eng_openslr83_wel_000084, Target: eng, Predicted: cym
|
| 882 |
+
Key: eng_openslr83_wel_000053, Target: eng, Predicted: cym
|
| 883 |
+
Key: eng_openslr83_wel_000085, Target: eng, Predicted: cym
|
| 884 |
+
Key: eng_openslr83_wel_000022, Target: eng, Predicted: cym
|
| 885 |
+
Key: eng_openslr83_wel_000086, Target: eng, Predicted: cym
|
| 886 |
+
Key: eng_openslr83_wel_000087, Target: eng, Predicted: cym
|
| 887 |
+
Key: eng_openslr83_wel_000056, Target: eng, Predicted: cym
|
| 888 |
+
Key: eng_openslr83_wel_000088, Target: eng, Predicted: cym
|
| 889 |
+
Key: eng_openslr83_wel_000057, Target: eng, Predicted: cym
|
| 890 |
+
Key: eng_openslr83_wel_000026, Target: eng, Predicted: cym
|
| 891 |
+
Key: eng_openslr83_wel_000058, Target: eng, Predicted: cym
|
| 892 |
+
Key: eng_openslr83_wel_000090, Target: eng, Predicted: cym
|
| 893 |
+
Key: eng_openslr83_wel_000060, Target: eng, Predicted: cym
|
| 894 |
+
Key: eng_openslr83_wel_000092, Target: eng, Predicted: cym
|
| 895 |
+
Key: eng_openslr83_wel_000031, Target: eng, Predicted: cym
|
| 896 |
+
Key: eng_openslr83_wel_000063, Target: eng, Predicted: cym
|
| 897 |
+
Key: eng_openslr83_wel_000032, Target: eng, Predicted: glv
|
| 898 |
+
Key: eng_openslr83_wel_000065, Target: eng, Predicted: cym
|
| 899 |
+
Key: eng_openslr83_wel_000034, Target: eng, Predicted: gle
|
| 900 |
+
Key: eng_openslr83_wel_000066, Target: eng, Predicted: cym
|
| 901 |
+
Key: eng_voxpopuli_est_000017, Target: eng, Predicted: deu
|
| 902 |
+
Key: eng_voxpopuli_est_000008, Target: eng, Predicted: est
|
| 903 |
+
Key: eng_voxpopuli_hun_000015, Target: eng, Predicted: hun
|
| 904 |
+
Key: eng_voxpopuli_hun_000057, Target: eng, Predicted: hun
|
| 905 |
+
Key: eng_voxpopuli_pol_000021, Target: eng, Predicted: pol
|
| 906 |
+
Key: eng_voxpopuli_pol_000024, Target: eng, Predicted: pol
|
| 907 |
+
Key: eng_voxpopuli_nld_000034, Target: eng, Predicted: nld
|
| 908 |
+
Key: eng_voxpopuli_pol_000038, Target: eng, Predicted: pol
|
| 909 |
+
Key: eng_voxpopuli_ron_000015, Target: eng, Predicted: ron
|
| 910 |
+
Key: eng_voxpopuli_ron_000017, Target: eng, Predicted: ron
|
| 911 |
+
Key: guj_ms_speech_guj_000038, Target: guj, Predicted: mar
|
| 912 |
+
Key: spa_openslr_spa_arg_000015, Target: spa, Predicted: por
|
| 913 |
+
Key: spa_openslr_spa_arg_000018, Target: spa, Predicted: mlt
|
| 914 |
+
Key: guj_ms_speech_guj_000083, Target: guj, Predicted: hin
|
| 915 |
+
Key: spa_openslr_spa_arg_000022, Target: spa, Predicted: est
|
| 916 |
+
Key: spa_openslr_spa_arg_000062, Target: spa, Predicted: hau
|
| 917 |
+
Key: spa_openslr_spa_arg_000031, Target: spa, Predicted: isl
|
| 918 |
+
Key: spa_openslr_spa_arg_000003, Target: spa, Predicted: sot
|
| 919 |
+
Key: spa_openslr_spa_arg_000006, Target: spa, Predicted: ita
|
| 920 |
+
Key: spa_openslr_spa_arg_000079, Target: spa, Predicted: cym
|
| 921 |
+
Key: spa_openslr_spa_chi_000049, Target: spa, Predicted: ell
|
| 922 |
+
Key: spa_openslr_spa_arg_000114, Target: spa, Predicted: eus
|
| 923 |
+
Key: spa_openslr_spa_chi_000022, Target: spa, Predicted: grn
|
| 924 |
+
Key: spa_openslr_spa_chi_000037, Target: spa, Predicted: ita
|
| 925 |
+
Key: spa_openslr_spa_chi_000076, Target: spa, Predicted: sot
|
| 926 |
+
Key: spa_openslr_spa_col_000001, Target: spa, Predicted: por
|
| 927 |
+
Key: spa_openslr_spa_col_000007, Target: spa, Predicted: ita
|
| 928 |
+
Key: spa_openslr_spa_col_000013, Target: spa, Predicted: ita
|
| 929 |
+
Key: spa_openslr_spa_col_000081, Target: spa, Predicted: ron
|
| 930 |
+
Key: spa_openslr_spa_col_000056, Target: spa, Predicted: ita
|
| 931 |
+
Key: spa_openslr_spa_col_000099, Target: spa, Predicted: epo
|
| 932 |
+
Key: spa_openslr_spa_per_000018, Target: spa, Predicted: glg
|
| 933 |
+
Key: spa_openslr_spa_per_000063, Target: spa, Predicted: ina
|
| 934 |
+
Key: spa_openslr_spa_per_000033, Target: spa, Predicted: ina
|
| 935 |
+
Key: spa_openslr_spa_per_000065, Target: spa, Predicted: glg
|
| 936 |
+
Key: spa_openslr_spa_per_000035, Target: spa, Predicted: ita
|
| 937 |
+
Key: spa_openslr_spa_pue_000060, Target: spa, Predicted: ita
|
| 938 |
+
Key: spa_openslr_spa_pue_000094, Target: spa, Predicted: eus
|
| 939 |
+
Key: spa_openslr_spa_pue_000003, Target: spa, Predicted: ita
|
| 940 |
+
Key: spa_openslr_spa_pue_000071, Target: spa, Predicted: ita
|
| 941 |
+
Key: spa_openslr_spa_pue_000014, Target: spa, Predicted: gug
|
| 942 |
+
Key: spa_openslr_spa_pue_000051, Target: spa, Predicted: ron
|
| 943 |
+
Key: spa_openslr_spa_ven_000112, Target: spa, Predicted: eus
|
| 944 |
+
Key: spa_openslr_spa_ven_000116, Target: spa, Predicted: isl
|
| 945 |
+
Key: spa_openslr_spa_ven_000118, Target: spa, Predicted: ina
|
| 946 |
+
Key: tel_ms_speech_tel_000014, Target: tel, Predicted: mal
|
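The per-utterance records above all share one line format, so summary statistics (for instance, how strongly the eng→cym confusion dominates the openslr83 Welsh-accent subset) can be recovered with a few lines of parsing. A minimal sketch, assuming the records are available as plain text lines; `tally_errors` and `LINE_RE` are illustrative helpers, not part of the ESPnet toolkit:

```python
# Tally (target, predicted) confusion pairs from result lines shaped like
# "Key: <utt_id>, Target: <lang>, Predicted: <lang>".
import re
from collections import Counter

LINE_RE = re.compile(r"Key: (\S+), Target: (\w+), Predicted: (\w+)")

def tally_errors(lines):
    """Return a Counter over (target, predicted) pairs for misclassified utterances."""
    pairs = Counter()
    for line in lines:
        m = LINE_RE.match(line.strip())
        if m:
            _, target, predicted = m.groups()
            if target != predicted:  # only count actual errors
                pairs[(target, predicted)] += 1
    return pairs

# Sample lines copied from the records above.
sample = [
    "Key: eng_openslr83_wel_000002, Target: eng, Predicted: cym",
    "Key: eng_openslr83_wel_000003, Target: eng, Predicted: cym",
    "Key: spa_openslr_spa_col_000007, Target: spa, Predicted: ita",
]
print(tally_errors(sample).most_common(1))  # [(('eng', 'cym'), 2)]
```

The same helper can be pointed at the full `results` file to rank confusion pairs across all evaluation sets.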
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_ml_superb2_lang_cross_train_all_no_filter_lang/lid_inference_test.log
ADDED
@@ -0,0 +1,302 @@
# python3 -m espnet2.bin.lid_inference_dist --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_ml_superb2_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_ml_superb2_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_ml_superb2_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
# Started at Mon Jun 2 02:17:42 CDT 2025
#
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_inference_dist.py --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_ml_superb2_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_ml_superb2_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_ml_superb2_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
[gpue04] 2025-06-02 02:18:18,542 (abs_task:2406) INFO: config file: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
  torchaudio.set_audio_backend("sox_io")
/work/nvme/bbjs/qwang20/espnet/espnet2/tasks/abs_task.py:2429: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  torch.load(model_file, map_location=device),
[gpue04] 2025-06-02 02:18:35,714 (lid_inference_dist:86) INFO: Model structure:
ESPnetLIDUpstreamConditionModel(
  (frontend): S3prlFrontendCondition(
    (upstream): S3PRLUpstreamCondition(
      (upstream): UpstreamExpertCondition(
        (model): Wav2Vec2ModelCondition(
          (feature_extractor): Wav2Vec2FeatureEncoder(
            (conv_layers): ModuleList(
              (0): Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (1-4): 4 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (5-6): 2 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
            )
          )
          (feature_projection): Wav2Vec2FeatureProjection(
            (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
            (projection): Linear(in_features=512, out_features=1280, bias=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (encoder): Wav2Vec2EncoderCondition(
            (pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
              (conv): ParametrizedConv1d(
                1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
                (parametrizations): ModuleDict(
                  (weight): ParametrizationList(
                    (0): _WeightNorm()
                  )
                )
              )
              (padding): Wav2Vec2SamePadLayer()
              (activation): GELUActivation()
            )
            (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
            (layers): ModuleList(
              (0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
                (attention): Wav2Vec2SdpaAttention(
                  (k_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (v_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (q_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (out_proj): Linear(in_features=1280, out_features=1280, bias=True)
                )
                (dropout): Dropout(p=0.1, inplace=False)
                (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
                (feed_forward): Wav2Vec2FeedForward(
                  (intermediate_dropout): Dropout(p=0.0, inplace=False)
                  (intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
                  (intermediate_act_fn): GELUActivation()
                  (output_dense): Linear(in_features=5120, out_features=1280, bias=True)
                  (output_dropout): Dropout(p=0.1, inplace=False)
                )
                (final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
              )
            )
            (ecapa_encoder): ModuleDict(
              (32): IdentityEncoder()
              (36): IdentityEncoder()
              (40): IdentityEncoder()
              (44): IdentityEncoder()
            )
            (pooling): ModuleDict(
              (32): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (36): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (40): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (44): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
            )
            (projector): ModuleDict(
              (32): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (36): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (40): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (44): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
            )
            (lang2vec_head): ModuleDict(
              (32): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (36): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (40): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (44): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
            )
            (aamsoftmax_weight): ParameterDict()
            (lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
            (aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
              (ce): CrossEntropyLoss()
              (lang2vec_head): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (lang2vec_loss): MSELoss()
            )
          )
        )
      )
    )
    (featurizer): Featurizer()
  )
  (normalize): UtteranceMVN(norm_means=True, norm_vars=False)
  (encoder): EcapaTdnnEncoder(
    (conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
    (relu): ReLU()
    (bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (layer1): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer2): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer3): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
    (mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
  )
  (pooling): ChnAttnStatPooling(
    (attention): Sequential(
      (0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
      (1): ReLU()
      (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
    )
    (softmax): Softmax(dim=2)
  )
  (projector): RawNet3Projector(
    (bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (fc): Linear(in_features=3072, out_features=192, bias=True)
  )
  (loss): AAMSoftmaxSCTopKLang2Vec(
    (ce): CrossEntropyLoss()
    (lang2vec_head): Sequential(
      (0): Linear(in_features=192, out_features=299, bias=True)
    )
    (lang2vec_loss): MSELoss()
  )
)

Model summary:
    Class Name: ESPnetLIDUpstreamConditionModel
    Total Number of model parameters: 977.14 M
    Number of trainable parameters: 977.14 M (100.0%)
    Size: 3.91 GB
    Type: torch.float32
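The two summary figures above are mutually consistent: 977.14 M float32 parameters at 4 bytes each come to about 3.91 GB, so the log's "GB" is evidently decimal gigabytes (10^9 bytes), not GiB. A quick sanity check; the helper name is illustrative, not part of ESPnet:

```python
# Cross-check the reported model size against the parameter count.
# Each torch.float32 parameter occupies 4 bytes; the log's "GB" matches
# decimal gigabytes (1e9 bytes) rather than GiB (2**30 bytes).
def model_size_gb(n_params: float, bytes_per_param: int = 4) -> float:
    """Approximate size in decimal GB for a model with a uniform parameter dtype."""
    return n_params * bytes_per_param / 1e9

print(round(model_size_gb(977.14e6), 2))  # 3.91
```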
| 271 |
+
/u/qwang20/miniconda3/envs/espnet2/lib/python3.11/site-packages/torch/utils/data/dataloader.py:557: UserWarning: This DataLoader will create 32 worker processes in total. Our suggested max number of worker in current system is 16, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
|
| 272 |
+
warnings.warn(_create_warning_msg(
|
| 273 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/reporter.py:321: UserWarning: The stats of the previous epoch=-1doesn't exist.
|
| 274 |
+
warnings.warn(
|
| 275 |
+
[gpue04] 2025-06-02 02:18:36,278 (lid_trainer:102) INFO: [Rank 0] Resume: 0 utterances found in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_ml_superb2_lang_cross_train_all_no_filter_lang/lids0
|
| 276 |
+
[gpue04] 2025-06-02 02:19:16,352 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 0
|
| 277 |
+
[gpue04] 2025-06-02 02:19:50,349 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 1
|
| 278 |
+
[gpue04] 2025-06-02 02:20:29,160 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 2
[gpue04] 2025-06-02 02:21:04,869 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 3
[gpue04] 2025-06-02 02:21:41,255 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 4
[gpue04] 2025-06-02 02:22:12,467 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 5
[gpue04] 2025-06-02 02:22:52,319 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 6
[gpue04] 2025-06-02 02:23:31,893 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 7
[gpue04] 2025-06-02 02:24:09,283 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 8
[gpue04] 2025-06-02 02:24:49,938 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 9
[gpue04] 2025-06-02 02:25:28,601 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 10
[gpue04] 2025-06-02 02:26:08,097 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 11
[gpue04] 2025-06-02 02:26:47,335 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 12
[gpue04] 2025-06-02 02:27:25,789 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 13
[gpue04] 2025-06-02 02:27:56,679 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 14
[gpue04] 2025-06-02 02:28:40,096 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 15
[gpue04] 2025-06-02 02:29:19,221 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 16
[gpue04] 2025-06-02 02:29:55,785 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 17
[gpue04] 2025-06-02 02:30:34,585 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 18
[gpue04] 2025-06-02 02:31:08,302 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 19
[gpue04] 2025-06-02 02:31:41,628 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 20
[gpue04] 2025-06-02 02:32:15,682 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 21
[gpue04] 2025-06-02 02:32:50,538 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 22
[gpue04] 2025-06-02 02:33:12,317 (lid_inference_dist:200) INFO: args.save_embd_per_utt: True
[gpue04] 2025-06-02 02:33:12,318 (lid_inference_dist:215) INFO: args.save_tsne_plot: False
# Accounting: time=931 threads=1
# Ended (code 0) at Mon Jun 2 02:33:13 CDT 2025, elapsed time 931 seconds
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_ml_superb2_lang_cross_train_all_no_filter_lang/results
ADDED
The diff for this file is too large to render. See raw diff.

exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_voxlingua107_lang_cross_train_all_no_filter_lang/lid_inference_test.log
ADDED
@@ -0,0 +1,280 @@
# python3 -m espnet2.bin.lid_inference_dist --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_voxlingua107_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_voxlingua107_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_voxlingua107_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
# Started at Mon Jun 2 00:34:26 CDT 2025
#
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_inference_dist.py --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_voxlingua107_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_voxlingua107_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_voxlingua107_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
[gpue04] 2025-06-02 00:34:58,673 (abs_task:2406) INFO: config file: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
  torchaudio.set_audio_backend("sox_io")
/work/nvme/bbjs/qwang20/espnet/espnet2/tasks/abs_task.py:2429: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  torch.load(model_file, map_location=device),
[gpue04] 2025-06-02 00:35:15,310 (lid_inference_dist:86) INFO: Model structure:
ESPnetLIDUpstreamConditionModel(
  (frontend): S3prlFrontendCondition(
    (upstream): S3PRLUpstreamCondition(
      (upstream): UpstreamExpertCondition(
        (model): Wav2Vec2ModelCondition(
          (feature_extractor): Wav2Vec2FeatureEncoder(
            (conv_layers): ModuleList(
              (0): Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (1-4): 4 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (5-6): 2 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
            )
          )
          (feature_projection): Wav2Vec2FeatureProjection(
            (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
            (projection): Linear(in_features=512, out_features=1280, bias=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (encoder): Wav2Vec2EncoderCondition(
            (pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
              (conv): ParametrizedConv1d(
                1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
                (parametrizations): ModuleDict(
                  (weight): ParametrizationList(
                    (0): _WeightNorm()
                  )
                )
              )
              (padding): Wav2Vec2SamePadLayer()
              (activation): GELUActivation()
            )
            (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
            (layers): ModuleList(
              (0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
                (attention): Wav2Vec2SdpaAttention(
                  (k_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (v_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (q_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (out_proj): Linear(in_features=1280, out_features=1280, bias=True)
                )
                (dropout): Dropout(p=0.1, inplace=False)
                (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
                (feed_forward): Wav2Vec2FeedForward(
                  (intermediate_dropout): Dropout(p=0.0, inplace=False)
                  (intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
                  (intermediate_act_fn): GELUActivation()
                  (output_dense): Linear(in_features=5120, out_features=1280, bias=True)
                  (output_dropout): Dropout(p=0.1, inplace=False)
                )
                (final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
              )
            )
            (ecapa_encoder): ModuleDict(
              (32): IdentityEncoder()
              (36): IdentityEncoder()
              (40): IdentityEncoder()
              (44): IdentityEncoder()
            )
            (pooling): ModuleDict(
              (32): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (36): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (40): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (44): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
            )
            (projector): ModuleDict(
              (32): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (36): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (40): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (44): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
            )
            (lang2vec_head): ModuleDict(
              (32): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (36): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (40): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (44): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
            )
            (aamsoftmax_weight): ParameterDict()
            (lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
            (aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
              (ce): CrossEntropyLoss()
              (lang2vec_head): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (lang2vec_loss): MSELoss()
            )
          )
        )
      )
    )
    (featurizer): Featurizer()
  )
  (normalize): UtteranceMVN(norm_means=True, norm_vars=False)
  (encoder): EcapaTdnnEncoder(
    (conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
    (relu): ReLU()
    (bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (layer1): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer2): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer3): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
    (mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
  )
  (pooling): ChnAttnStatPooling(
    (attention): Sequential(
      (0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
      (1): ReLU()
      (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
    )
    (softmax): Softmax(dim=2)
  )
  (projector): RawNet3Projector(
    (bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (fc): Linear(in_features=3072, out_features=192, bias=True)
  )
  (loss): AAMSoftmaxSCTopKLang2Vec(
    (ce): CrossEntropyLoss()
    (lang2vec_head): Sequential(
      (0): Linear(in_features=192, out_features=299, bias=True)
    )
    (lang2vec_loss): MSELoss()
  )
)

Model summary:
    Class Name: ESPnetLIDUpstreamConditionModel
    Total Number of model parameters: 977.14 M
    Number of trainable parameters: 977.14 M (100.0%)
    Size: 3.91 GB
    Type: torch.float32
/u/qwang20/miniconda3/envs/espnet2/lib/python3.11/site-packages/torch/utils/data/dataloader.py:557: UserWarning: This DataLoader will create 32 worker processes in total. Our suggested max number of worker in current system is 16, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
  warnings.warn(_create_warning_msg(
/work/nvme/bbjs/qwang20/espnet/espnet2/train/reporter.py:321: UserWarning: The stats of the previous epoch=-1doesn't exist.
  warnings.warn(
[gpue04] 2025-06-02 00:35:15,911 (lid_trainer:102) INFO: [Rank 0] Resume: 0 utterances found in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_voxlingua107_lang_cross_train_all_no_filter_lang/lids0
[gpue04] 2025-06-02 00:36:12,151 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 0
[gpue04] 2025-06-02 00:36:44,813 (lid_inference_dist:200) INFO: args.save_embd_per_utt: True
[gpue04] 2025-06-02 00:36:44,813 (lid_inference_dist:215) INFO: args.save_tsne_plot: False
# Accounting: time=139 threads=1
# Ended (code 0) at Mon Jun 2 00:36:45 CDT 2025, elapsed time 139 seconds
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/dev_voxlingua107_lang_cross_train_all_no_filter_lang/results
ADDED
@@ -0,0 +1,126 @@
Accuracy: 94.41%
Macro Accuracy: 93.10%
Accuracy per Language:
slv: 100.00%
aze: 91.18%
cmn: 95.65%
srp: 71.43%
hun: 100.00%
fas: 89.00%
hye: 96.00%
urd: 80.77%
spa: 92.73%
est: 94.12%
tur: 98.61%
rus: 100.00%
ita: 100.00%
ara: 95.00%
nor: 54.17%
lav: 98.95%
ukr: 100.00%
swe: 96.00%
deu: 93.90%
ell: 80.00%
isl: 100.00%
nno: 100.00%
pol: 100.00%
hrv: 75.00%
dan: 93.94%
fra: 97.00%
nld: 95.00%
mkd: 100.00%
jpn: 97.62%
por: 100.00%
fin: 97.85%
eng: 96.25%
lit: 92.31%
Key: ara_cwvnYGInNNg__U__S229---0661.830-0670.560.wav, Target: ara, Predicted: heb
Key: aze_3UUShvAQxQY__U__S199---1315.800-1322.250.wav, Target: aze, Predicted: tur
Key: aze_3qOGhbHQuAc__U__S157---1061.380-1066.120.wav, Target: aze, Predicted: tur
Key: ara_AfS6C1PXAdQ__U__S20---0104.730-0111.410.wav, Target: ara, Predicted: hau
Key: ara_TPWwuy20K_c__U__S70---0466.380-0472.600.wav, Target: ara, Predicted: hau
Key: ara_XplxxijLuFI__U__S0---0372.560-0375.230.wav, Target: ara, Predicted: heb
Key: aze_bYKK1m78ecE__U__S91---0592.500-0596.130.wav, Target: aze, Predicted: fao
Key: ara_tl39W93P0r4__U__S32---0282.970-0286.530.wav, Target: ara, Predicted: dan
Key: dan_Nyl6CuW6Qfk__U__S26---0557.690-0560.120.wav, Target: dan, Predicted: nor
Key: dan_ONZC1wL5hBw__U__S100---1407.470-1417.260.wav, Target: dan, Predicted: nno
Key: cmn_ZUzq_TIfYL4__U__S39---0442.690-0454.380.wav, Target: cmn, Predicted: yue
Key: dan_SbE2FKexCW4__U__S62---0546.280-0551.260.wav, Target: dan, Predicted: isl
Key: dan_E3vuA0Mqk1Q__U__S13---0072.140-0083.530.wav, Target: dan, Predicted: nno
Key: dan_ZZD1qu4ScPg__U__S14---0166.700-0176.010.wav, Target: dan, Predicted: hat
Key: dan_yEEcGssW0Qg__U__S112---1016.050-1020.110.wav, Target: dan, Predicted: deu
Key: eng_4y7p9R2No-4__U__S12---0266.390-0268.460.wav, Target: eng, Predicted: gle
Key: deu_4zCzyVjLkcc__U__S0---0123.750-0127.540.wav, Target: deu, Predicted: ltz
Key: deu_8L3k8XNTtNA__U__S100---2689.380-2692.180.wav, Target: deu, Predicted: fin
Key: deu_9O2haSYzftE__U__S0---0000.000-0004.200.wav, Target: deu, Predicted: yid
Key: deu_cMZO2zXTBv8__U__S100---0341.910-0344.350.wav, Target: deu, Predicted: yid
Key: ell_bw_mDLVdgtY__U__S18---0119.750-0127.200.wav, Target: ell, Predicted: isl
Key: deu_eyZqRcgGkiY__U__S126---1155.890-1162.390.wav, Target: deu, Predicted: nor
Key: eng_K977aQQpAVk__U__S106---0393.230-0397.100.wav, Target: eng, Predicted: cym
Key: est_EtWRBtavckY__U__S116---1906.220-1908.810.wav, Target: est, Predicted: fin
Key: eng_eQXHc-tJMXM__U__S11---1066.230-1077.360.wav, Target: eng, Predicted: cym
Key: est_gTl2GSJBxNw__U__S0---0000.000-0008.420.wav, Target: est, Predicted: tur
Key: est_5gWpxiFOouQ__U__S2---1635.950-1646.620.wav, Target: est, Predicted: tel
Key: est_7vZIuc9qumg__U__S21---0145.690-0153.320.wav, Target: est, Predicted: dan
Key: est_E05LlgvSMg0__U__S156---1171.030-1172.780.wav, Target: est, Predicted: fin
Key: fas_SMcjja_krx4__U__S2---0012.190-0021.730.wav, Target: fas, Predicted: tgk
Key: fas_4sboRMmC2TM__U__S212---1293.790-1307.370.wav, Target: fas, Predicted: tgk
Key: fas_XUGZwtXgvRA__U__S154---0993.540-0997.340.wav, Target: fas, Predicted: san
Key: fas_9k1oVW4Ynyw__U__S15---0097.430-0101.630.wav, Target: fas, Predicted: sqi
Key: fas_nPts67VQKRQ__U__S250---1629.010-1632.750.wav, Target: fas, Predicted: hat
Key: fas_EjSRRddYuc4__U__S58---0355.980-0359.590.wav, Target: fas, Predicted: lat
Key: fas_pt166R7v8kU__U__S13---0267.910-0272.370.wav, Target: fas, Predicted: tgk
Key: fin_C4H2GlJRkNU__U__S100---1604.910-1610.210.wav, Target: fin, Predicted: est
Key: fas_x_Di4cq4ixM__U__S100---1353.580-1358.390.wav, Target: fas, Predicted: pus
Key: fas_gLoBPMrad3E__U__S14---0097.650-0102.010.wav, Target: fas, Predicted: yid
Key: fas_zZCjOs-WwKo__U__S195---1357.430-1377.010.wav, Target: fas, Predicted: aze
Key: fas_QYwCDYVxjpo__U__S68---0428.220-0432.740.wav, Target: fas, Predicted: pus
Key: fin_S_VWbBtBey4__U__S0---0308.380-0310.650.wav, Target: fin, Predicted: glv
Key: fra_SLfpp704KI8__U__S57---0368.470-0372.910.wav, Target: fra, Predicted: rus
Key: fra_Lo_JX-8KHEw__U__S151---0284.430-0299.020.wav, Target: fra, Predicted: lin
Key: hrv_Jntmbw5_vOI__U__S291---0379.300-0383.970.wav, Target: hrv, Predicted: srp
Key: fra_jjEvNgbuptE__U__S103---0990.080-0997.340.wav, Target: fra, Predicted: hat
Key: hye_PcLE4N63O9M__U__S352---2333.340-2337.540.wav, Target: hye, Predicted: yid
Key: hye_Qmo3P38Ytek__U__S32---0245.460-0249.320.wav, Target: hye, Predicted: jav
Key: hye_qkMM0rYsa0c__U__S276---1611.690-1615.350.wav, Target: hye, Predicted: sqi
Key: hye_um6xT5Gjgus__U__S194---1224.460-1234.130.wav, Target: hye, Predicted: lat
Key: jpn_rQPhM6wNQwc__U__S47---0317.270-0323.120.wav, Target: jpn, Predicted: est
Key: lit_3svAywrL0_I__U__S149---0461.370-0464.980.wav, Target: lit, Predicted: por
Key: nld_0LhAXOxz-JU__U__S32---0243.280-0247.880.wav, Target: nld, Predicted: afr
Key: nld_0LhAXOxz-JU__U__S396---2475.670-2488.950.wav, Target: nld, Predicted: afr
Key: nld_2C5HehL-Fx0__U__S101---1125.890-1131.720.wav, Target: nld, Predicted: ltz
Key: lav_DWPBBIdz0Mo__U__S52---0339.380-0356.600.wav, Target: lav, Predicted: ukr
Key: nld_QflBX7-rF9c__U__S106---0919.840-0926.610.wav, Target: nld, Predicted: afr
Key: nor_HW_49WuFloM__U__S106---0621.590-0626.130.wav, Target: nor, Predicted: nno
Key: nor_I1vUI8va8Yc__U__S49---0294.940-0302.560.wav, Target: nor, Predicted: nno
Key: nld_7AZTxaq_37U__U__S29---0226.530-0237.250.wav, Target: nld, Predicted: afr
Key: nor_UxHL_uql05E__U__S118---0587.340-0598.790.wav, Target: nor, Predicted: nno
Key: nor_XC4Ffj9XDls__U__S105---0636.770-0655.460.wav, Target: nor, Predicted: nno
Key: nor_tV3Le8SUz_0__U__S276---1831.870-1841.550.wav, Target: nor, Predicted: nno
Key: nor_xVNA15ifyIw__U__S494---0311.220-0317.160.wav, Target: nor, Predicted: nno
Key: nor_ySVkmT8SgNM__U__S345---2245.790-2255.930.wav, Target: nor, Predicted: nno
Key: nor_0eQvHBz2Zb0__U__S0---0000.000-0018.430.wav, Target: nor, Predicted: nno
Key: nor_1KFP5wVtthQ__U__S130---0511.650-0521.160.wav, Target: nor, Predicted: nno
Key: nor_41P9Uue3YbQ__U__S38---0255.810-0264.280.wav, Target: nor, Predicted: nno
Key: nor_97e9pEtHAxg__U__S32---0201.830-0210.250.wav, Target: nor, Predicted: nno
Key: spa_BApoyHcbdls__U__S286---1705.860-1722.550.wav, Target: spa, Predicted: ast
Key: swe_CizHFWTDSnU__U__S113---0867.420-0878.560.wav, Target: swe, Predicted: nor
Key: spa_z5b-CjOOhK8__U__S251---1701.420-1708.280.wav, Target: spa, Predicted: glg
Key: srp_8dvIaAOLlGA__U__S216---1326.410-1335.490.wav, Target: srp, Predicted: hrv
Key: spa_UYBcNrx8kvQ__U__S186---2292.670-2299.590.wav, Target: spa, Predicted: kor
Key: srp_rkQhxxO5Qt4__U__S109---0820.610-0826.190.wav, Target: srp, Predicted: bos
Key: spa_Y0mzNQqBR3A__U__S151---1160.320-1166.480.wav, Target: spa, Predicted: glg
Key: swe_0x4xb4AaTy0__U__S0---0301.450-0319.780.wav, Target: swe, Predicted: nno
Key: tur_4C-efpD-DlM__U__S7---0050.890-0055.080.wav, Target: tur, Predicted: war
Key: swe_ilhngbAuxvs__U__S14---2441.740-2445.720.wav, Target: swe, Predicted: nno
Key: swe_wMAAiJhj0VA__U__S100---0564.840-0568.420.wav, Target: swe, Predicted: nno
Key: urd_o3awRytwrUY__U__S1---0290.850-0306.430.wav, Target: urd, Predicted: fas
Key: urd_pYId2x4cutY__U__S151---1539.990-1542.820.wav, Target: urd, Predicted: hin
Key: urd_J7RizO2mvm4__U__S3---0042.600-0051.180.wav, Target: urd, Predicted: san
Key: urd_ySjOb5uaA-U__U__S107---0336.690-0353.110.wav, Target: urd, Predicted: cym
Key: urd_N59t4A1mxfA__U__S101---0715.390-0720.460.wav, Target: urd, Predicted: snd
Key: urd_Tj2pngm_vuA__U__S1---0070.400-0089.660.wav, Target: urd, Predicted: hin
Key: urd_U_h8Bgywxrc__U__S0---0222.420-0227.610.wav, Target: urd, Predicted: hin
Key: urd_eTfyAm6CFB0__U__S25---0439.860-0446.680.wav, Target: urd, Predicted: hin
Key: urd_8VDrhDx37OA__U__S12---0094.320-0106.080.wav, Target: urd, Predicted: hin
Key: urd_n3l7PavcOFk__U__S0---0379.380-0397.930.wav, Target: urd, Predicted: hin
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/test_fleurs_lang_cross_train_all_no_filter_lang/lid_inference_test.log
ADDED
@@ -0,0 +1,356 @@
| 1 |
+
# python3 -m espnet2.bin.lid_inference_dist --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/test_fleurs_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/test_fleurs_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/test_fleurs_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
|
| 2 |
+
# Started at Mon Jun 2 00:54:21 CDT 2025
|
| 3 |
+
#
|
| 4 |
+
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_inference_dist.py --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/test_fleurs_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/test_fleurs_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/test_fleurs_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
|
| 5 |
+
[gpue04] 2025-06-02 00:54:40,331 (abs_task:2406) INFO: config file: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml
|
| 6 |
+
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
|
| 7 |
+
torchaudio.set_audio_backend("sox_io")
|
| 8 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/tasks/abs_task.py:2429: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
|
| 9 |
+
torch.load(model_file, map_location=device),
|
| 10 |
+
[gpue04] 2025-06-02 00:54:51,981 (lid_inference_dist:86) INFO: Model structure:
|
| 11 |
+
ESPnetLIDUpstreamConditionModel(
|
| 12 |
+
(frontend): S3prlFrontendCondition(
|
| 13 |
+
(upstream): S3PRLUpstreamCondition(
|
| 14 |
+
(upstream): UpstreamExpertCondition(
|
| 15 |
+
(model): Wav2Vec2ModelCondition(
|
| 16 |
+
(feature_extractor): Wav2Vec2FeatureEncoder(
|
| 17 |
+
(conv_layers): ModuleList(
|
| 18 |
+
(0): Wav2Vec2LayerNormConvLayer(
|
| 19 |
+
(conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
|
| 20 |
+
(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
|
| 21 |
+
(activation): GELUActivation()
|
| 22 |
+
)
|
| 23 |
+
(1-4): 4 x Wav2Vec2LayerNormConvLayer(
|
| 24 |
+
(conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
|
| 25 |
+
(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
|
| 26 |
+
(activation): GELUActivation()
|
| 27 |
+
)
|
| 28 |
+
(5-6): 2 x Wav2Vec2LayerNormConvLayer(
|
| 29 |
+
(conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
|
| 30 |
+
(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
|
| 31 |
+
(activation): GELUActivation()
|
| 32 |
+
)
|
| 33 |
+
)
|
| 34 |
+
)
|
| 35 |
+
(feature_projection): Wav2Vec2FeatureProjection(
|
| 36 |
+
(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
|
| 37 |
+
(projection): Linear(in_features=512, out_features=1280, bias=True)
|
| 38 |
+
(dropout): Dropout(p=0.1, inplace=False)
|
| 39 |
+
)
|
| 40 |
+
(encoder): Wav2Vec2EncoderCondition(
|
| 41 |
+
(pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
|
| 42 |
+
(conv): ParametrizedConv1d(
|
| 43 |
+
1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
|
| 44 |
+
(parametrizations): ModuleDict(
|
| 45 |
+
(weight): ParametrizationList(
|
| 46 |
+
(0): _WeightNorm()
|
| 47 |
+
)
|
| 48 |
+
)
|
| 49 |
+
)
|
| 50 |
+
(padding): Wav2Vec2SamePadLayer()
|
| 51 |
+
(activation): GELUActivation()
|
| 52 |
+
)
|
| 53 |
+
(layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
|
| 54 |
+
(dropout): Dropout(p=0.1, inplace=False)
|
| 55 |
+
(layers): ModuleList(
|
| 56 |
+
(0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
|
| 57 |
+
(attention): Wav2Vec2SdpaAttention(
|
| 58 |
+
(k_proj): Linear(in_features=1280, out_features=1280, bias=True)
|
| 59 |
+
(v_proj): Linear(in_features=1280, out_features=1280, bias=True)
|
| 60 |
+
(q_proj): Linear(in_features=1280, out_features=1280, bias=True)
|
| 61 |
+
(out_proj): Linear(in_features=1280, out_features=1280, bias=True)
|
| 62 |
+
)
|
| 63 |
+
(dropout): Dropout(p=0.1, inplace=False)
|
| 64 |
+
(layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
|
| 65 |
+
(feed_forward): Wav2Vec2FeedForward(
|
| 66 |
+
(intermediate_dropout): Dropout(p=0.0, inplace=False)
|
| 67 |
+
(intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
|
| 68 |
+
(intermediate_act_fn): GELUActivation()
|
| 69 |
+
(output_dense): Linear(in_features=5120, out_features=1280, bias=True)
|
| 70 |
+
(output_dropout): Dropout(p=0.1, inplace=False)
|
| 71 |
+
)
|
| 72 |
+
(final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
|
| 73 |
+
)
|
| 74 |
+
)
|
| 75 |
+
(ecapa_encoder): ModuleDict(
|
| 76 |
+
(32): IdentityEncoder()
|
| 77 |
+
(36): IdentityEncoder()
|
| 78 |
+
(40): IdentityEncoder()
|
| 79 |
+
(44): IdentityEncoder()
|
| 80 |
+
)
|
| 81 |
+
(pooling): ModuleDict(
|
| 82 |
+
(32): ChnAttnStatPooling(
|
| 83 |
+
(attention): Sequential(
|
| 84 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 85 |
+
(1): ReLU()
|
| 86 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 87 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 88 |
+
)
|
| 89 |
+
(softmax): Softmax(dim=2)
|
| 90 |
+
)
|
| 91 |
+
(36): ChnAttnStatPooling(
|
| 92 |
+
(attention): Sequential(
|
| 93 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 94 |
+
(1): ReLU()
|
| 95 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 96 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 97 |
+
)
|
| 98 |
+
(softmax): Softmax(dim=2)
|
| 99 |
+
)
|
| 100 |
+
(40): ChnAttnStatPooling(
|
| 101 |
+
(attention): Sequential(
|
| 102 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 103 |
+
(1): ReLU()
|
| 104 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 105 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 106 |
+
)
|
| 107 |
+
(softmax): Softmax(dim=2)
|
| 108 |
+
)
|
| 109 |
+
(44): ChnAttnStatPooling(
|
| 110 |
+
(attention): Sequential(
|
| 111 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 112 |
+
(1): ReLU()
|
| 113 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 114 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 115 |
+
)
|
| 116 |
+
(softmax): Softmax(dim=2)
|
| 117 |
+
)
|
| 118 |
+
)
|
| 119 |
+
(projector): ModuleDict(
|
| 120 |
+
(32): RawNet3Projector(
|
| 121 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 122 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 123 |
+
)
|
| 124 |
+
(36): RawNet3Projector(
|
| 125 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 126 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 127 |
+
)
|
| 128 |
+
(40): RawNet3Projector(
|
| 129 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 130 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 131 |
+
)
|
| 132 |
+
(44): RawNet3Projector(
|
| 133 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 134 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 135 |
+
)
|
| 136 |
+
)
|
| 137 |
+
(lang2vec_head): ModuleDict(
|
| 138 |
+
(32): Sequential(
|
| 139 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 140 |
+
)
|
| 141 |
+
(36): Sequential(
|
| 142 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 143 |
+
)
|
| 144 |
+
(40): Sequential(
|
| 145 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 146 |
+
)
|
| 147 |
+
(44): Sequential(
|
| 148 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 149 |
+
)
|
| 150 |
+
)
|
| 151 |
+
(aamsoftmax_weight): ParameterDict()
|
| 152 |
+
(lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
|
| 153 |
+
(aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
|
| 154 |
+
(ce): CrossEntropyLoss()
|
| 155 |
+
(lang2vec_head): Sequential(
|
| 156 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 157 |
+
)
|
| 158 |
+
(lang2vec_loss): MSELoss()
|
| 159 |
+
)
|
| 160 |
+
)
|
| 161 |
+
)
|
| 162 |
+
)
|
| 163 |
+
)
|
| 164 |
+
(featurizer): Featurizer()
|
| 165 |
+
)
|
| 166 |
+
(normalize): UtteranceMVN(norm_means=True, norm_vars=False)
|
| 167 |
+
(encoder): EcapaTdnnEncoder(
|
| 168 |
+
(conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
|
| 169 |
+
(relu): ReLU()
|
| 170 |
+
(bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 171 |
+
(layer1): EcapaBlock(
|
| 172 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 173 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 174 |
+
(convs): ModuleList(
|
| 175 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
|
| 176 |
+
)
|
| 177 |
+
(bns): ModuleList(
|
| 178 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 179 |
+
)
|
| 180 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 181 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 182 |
+
(relu): ReLU()
|
| 183 |
+
(se): SEModule(
|
| 184 |
+
(se): Sequential(
|
| 185 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 186 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 187 |
+
(2): ReLU()
|
| 188 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 189 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 190 |
+
(5): Sigmoid()
|
| 191 |
+
)
|
| 192 |
+
)
|
| 193 |
+
)
|
| 194 |
+
(layer2): EcapaBlock(
|
| 195 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 196 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 197 |
+
(convs): ModuleList(
|
| 198 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
|
| 199 |
+
)
|
| 200 |
+
(bns): ModuleList(
|
| 201 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 202 |
+
)
|
| 203 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 204 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 205 |
+
(relu): ReLU()
|
| 206 |
+
(se): SEModule(
|
| 207 |
+
(se): Sequential(
|
| 208 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 209 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 210 |
+
(2): ReLU()
|
| 211 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 212 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 213 |
+
(5): Sigmoid()
|
| 214 |
+
)
|
| 215 |
+
)
|
| 216 |
+
)
|
| 217 |
+
(layer3): EcapaBlock(
|
| 218 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 219 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 220 |
+
(convs): ModuleList(
|
| 221 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
|
| 222 |
+
)
|
| 223 |
+
(bns): ModuleList(
|
| 224 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 225 |
+
)
|
| 226 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 227 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 228 |
+
(relu): ReLU()
|
| 229 |
+
(se): SEModule(
|
| 230 |
+
(se): Sequential(
|
| 231 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 232 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 233 |
+
(2): ReLU()
|
| 234 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 235 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 236 |
+
(5): Sigmoid()
|
| 237 |
+
)
|
| 238 |
+
)
|
| 239 |
+
)
|
| 240 |
+
(layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
|
| 241 |
+
(mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
|
| 242 |
+
)
|
| 243 |
+
(pooling): ChnAttnStatPooling(
|
| 244 |
+
(attention): Sequential(
|
| 245 |
+
(0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
|
| 246 |
+
(1): ReLU()
|
| 247 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 248 |
+
(3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
|
| 249 |
+
)
|
| 250 |
+
(softmax): Softmax(dim=2)
|
| 251 |
+
)
|
| 252 |
+
(projector): RawNet3Projector(
|
| 253 |
+
(bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 254 |
+
(fc): Linear(in_features=3072, out_features=192, bias=True)
|
| 255 |
+
)
|
| 256 |
+
(loss): AAMSoftmaxSCTopKLang2Vec(
|
| 257 |
+
(ce): CrossEntropyLoss()
|
| 258 |
+
(lang2vec_head): Sequential(
|
| 259 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 260 |
+
)
|
| 261 |
+
(lang2vec_loss): MSELoss()
|
| 262 |
+
)
|
| 263 |
+
)
|
| 264 |
+
|
| 265 |
+
Model summary:
|
| 266 |
+
Class Name: ESPnetLIDUpstreamConditionModel
|
| 267 |
+
Total Number of model parameters: 977.14 M
|
| 268 |
+
Number of trainable parameters: 977.14 M (100.0%)
|
| 269 |
+
Size: 3.91 GB
|
| 270 |
+
Type: torch.float32
|
| 271 |
+
/u/qwang20/miniconda3/envs/espnet2/lib/python3.11/site-packages/torch/utils/data/dataloader.py:557: UserWarning: This DataLoader will create 32 worker processes in total. Our suggested max number of worker in current system is 16, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
|
| 272 |
+
warnings.warn(_create_warning_msg(
|
| 273 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/reporter.py:321: UserWarning: The stats of the previous epoch=-1doesn't exist.
|
| 274 |
+
warnings.warn(
|
| 275 |
+
[gpue04] 2025-06-02 00:54:52,516 (lid_trainer:102) INFO: [Rank 0] Resume: 0 utterances found in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/test_fleurs_lang_cross_train_all_no_filter_lang/lids0
|
| 276 |
+
[gpue04] 2025-06-02 00:55:50,135 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 0
|
| 277 |
+
[gpue04] 2025-06-02 00:56:51,199 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 1
|
| 278 |
+
[gpue04] 2025-06-02 00:57:39,967 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 2
|
| 279 |
+
[gpue04] 2025-06-02 00:58:38,305 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 3
|
| 280 |
+
[gpue04] 2025-06-02 00:59:50,779 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 4
|
| 281 |
+
[gpue04] 2025-06-02 01:00:54,111 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 5
|
| 282 |
+
[gpue04] 2025-06-02 01:01:48,017 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 6
|
| 283 |
+
[gpue04] 2025-06-02 01:02:41,062 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 7
|
| 284 |
+
[gpue04] 2025-06-02 01:03:46,378 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 8
|
| 285 |
+
[gpue04] 2025-06-02 01:04:44,682 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 9
|
| 286 |
+
[gpue04] 2025-06-02 01:05:39,283 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 10
|
| 287 |
+
[gpue04] 2025-06-02 01:06:43,893 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 11
|
| 288 |
+
[gpue04] 2025-06-02 01:07:51,015 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 12
|
| 289 |
+
[gpue04] 2025-06-02 01:08:54,020 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 13
|
| 290 |
+
[gpue04] 2025-06-02 01:09:52,678 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 14
|
| 291 |
+
[gpue04] 2025-06-02 01:10:43,497 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 15
|
| 292 |
+
[gpue04] 2025-06-02 01:11:46,070 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 16
|
| 293 |
+
[gpue04] 2025-06-02 01:13:04,592 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 17
|
| 294 |
+
[gpue04] 2025-06-02 01:14:19,368 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 18
|
| 295 |
+
[gpue04] 2025-06-02 01:15:13,278 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 19
|
| 296 |
+
[gpue04] 2025-06-02 01:16:21,130 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 20
|
| 297 |
+
[gpue04] 2025-06-02 01:17:22,122 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 21
|
| 298 |
+
[gpue04] 2025-06-02 01:18:11,762 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 22
|
| 299 |
+
[gpue04] 2025-06-02 01:19:30,749 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 23
|
| 300 |
+
[gpue04] 2025-06-02 01:20:24,887 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 24
|
| 301 |
+
[gpue04] 2025-06-02 01:21:15,131 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 25
|
| 302 |
+
[gpue04] 2025-06-02 01:22:09,711 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 26
|
| 303 |
+
[gpue04] 2025-06-02 01:23:05,444 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 27
|
| 304 |
+
[gpue04] 2025-06-02 01:24:41,530 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 28
|
| 305 |
+
[gpue04] 2025-06-02 01:25:45,696 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 29
|
| 306 |
+
[gpue04] 2025-06-02 01:26:55,280 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 30
|
| 307 |
+
[gpue04] 2025-06-02 01:27:56,405 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 31
|
| 308 |
+
[gpue04] 2025-06-02 01:29:15,643 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 32
|
| 309 |
+
[gpue04] 2025-06-02 01:30:14,980 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 33
|
| 310 |
+
[gpue04] 2025-06-02 01:31:18,218 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 34
|
| 311 |
+
[gpue04] 2025-06-02 01:32:29,009 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 35
|
| 312 |
+
[gpue04] 2025-06-02 01:33:37,784 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 36
|
| 313 |
+
[gpue04] 2025-06-02 01:34:36,080 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 37
|
| 314 |
+
[gpue04] 2025-06-02 01:35:38,726 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 38
|
| 315 |
+
[gpue04] 2025-06-02 01:36:47,523 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 39
|
| 316 |
+
[gpue04] 2025-06-02 01:37:52,504 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 40
|
| 317 |
+
[gpue04] 2025-06-02 01:38:45,251 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 41
|
| 318 |
+
[gpue04] 2025-06-02 01:40:00,452 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 42
|
| 319 |
+
[gpue04] 2025-06-02 01:41:11,257 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 43
|
| 320 |
+
[gpue04] 2025-06-02 01:42:19,878 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 44
|
| 321 |
+
[gpue04] 2025-06-02 01:43:18,828 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 45
|
| 322 |
+
[gpue04] 2025-06-02 01:44:21,603 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 46
|
| 323 |
+
[gpue04] 2025-06-02 01:45:16,599 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 47
|
| 324 |
+
[gpue04] 2025-06-02 01:47:00,696 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 48
|
| 325 |
+
[gpue04] 2025-06-02 01:48:00,085 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 49
|
| 326 |
+
[gpue04] 2025-06-02 01:49:05,199 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 50
|
| 327 |
+
[gpue04] 2025-06-02 01:49:59,112 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 51
|
| 328 |
+
[gpue04] 2025-06-02 01:51:30,912 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 52
|
| 329 |
+
[gpue04] 2025-06-02 01:52:53,232 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 53
|
| 330 |
+
[gpue04] 2025-06-02 01:54:03,035 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 54
|
| 331 |
+
[gpue04] 2025-06-02 01:55:01,946 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 55
|
| 332 |
+
[gpue04] 2025-06-02 01:55:51,439 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 56
|
| 333 |
+
[gpue04] 2025-06-02 01:56:50,653 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 57
|
| 334 |
+
[gpue04] 2025-06-02 01:57:42,745 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 58
|
| 335 |
+
[gpue04] 2025-06-02 01:58:36,260 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 59
|
| 336 |
+
[gpue04] 2025-06-02 01:59:29,171 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 60
|
| 337 |
+
[gpue04] 2025-06-02 02:00:27,515 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 61
|
| 338 |
+
[gpue04] 2025-06-02 02:01:31,325 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 62
|
| 339 |
+
[gpue04] 2025-06-02 02:02:34,747 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 63
|
| 340 |
+
[gpue04] 2025-06-02 02:03:36,231 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 64
|
| 341 |
+
[gpue04] 2025-06-02 02:04:29,492 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 65
|
| 342 |
+
[gpue04] 2025-06-02 02:05:28,729 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 66
|
| 343 |
+
[gpue04] 2025-06-02 02:06:31,995 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 67
|
| 344 |
+
[gpue04] 2025-06-02 02:07:34,329 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 68
|
| 345 |
+
[gpue04] 2025-06-02 02:08:36,453 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 69
|
| 346 |
+
[gpue04] 2025-06-02 02:09:32,760 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 70
|
| 347 |
+
[gpue04] 2025-06-02 02:11:06,537 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 71
|
| 348 |
+
[gpue04] 2025-06-02 02:12:03,962 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 72
|
| 349 |
+
[gpue04] 2025-06-02 02:13:08,461 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 73
|
| 350 |
+
[gpue04] 2025-06-02 02:14:18,704 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 74
|
| 351 |
+
[gpue04] 2025-06-02 02:15:32,453 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 75
|
| 352 |
+
[gpue04] 2025-06-02 02:16:33,681 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 76
|
| 353 |
+
[gpue04] 2025-06-02 02:17:40,005 (lid_inference_dist:200) INFO: args.save_embd_per_utt: True
|
| 354 |
+
[gpue04] 2025-06-02 02:17:40,006 (lid_inference_dist:215) INFO: args.save_tsne_plot: False
|
| 355 |
+
# Accounting: time=5000 threads=1
|
| 356 |
+
# Ended (code 0) at Mon Jun 2 02:17:41 CDT 2025, elapsed time 5000 seconds
|
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/test_fleurs_lang_cross_train_all_no_filter_lang/results
ADDED
The diff for this file is too large to render. See raw diff
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/test_voxpopuli_lang_cross_train_all_no_filter_lang/lid_inference_test.log
ADDED
@@ -0,0 +1,295 @@
+# python3 -m espnet2.bin.lid_inference_dist --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/test_voxpopuli_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/test_voxpopuli_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/test_voxpopuli_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
+# Started at Mon Jun 2 00:36:46 CDT 2025
+#
+/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_inference_dist.py --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/test_voxpopuli_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/test_voxpopuli_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/test_voxpopuli_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
+[gpue04] 2025-06-02 00:37:06,533 (abs_task:2406) INFO: config file: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml
+/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
+torchaudio.set_audio_backend("sox_io")
+/work/nvme/bbjs/qwang20/espnet/espnet2/tasks/abs_task.py:2429: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
+torch.load(model_file, map_location=device),
+[gpue04] 2025-06-02 00:37:18,559 (lid_inference_dist:86) INFO: Model structure:
+ESPnetLIDUpstreamConditionModel(
+(frontend): S3prlFrontendCondition(
+(upstream): S3PRLUpstreamCondition(
+(upstream): UpstreamExpertCondition(
+(model): Wav2Vec2ModelCondition(
+(feature_extractor): Wav2Vec2FeatureEncoder(
+(conv_layers): ModuleList(
+(0): Wav2Vec2LayerNormConvLayer(
+(conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
+(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+(activation): GELUActivation()
+)
+(1-4): 4 x Wav2Vec2LayerNormConvLayer(
+(conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
+(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+(activation): GELUActivation()
+)
+(5-6): 2 x Wav2Vec2LayerNormConvLayer(
+(conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
+(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+(activation): GELUActivation()
+)
+)
+)
+(feature_projection): Wav2Vec2FeatureProjection(
+(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
+(projection): Linear(in_features=512, out_features=1280, bias=True)
+(dropout): Dropout(p=0.1, inplace=False)
+)
+(encoder): Wav2Vec2EncoderCondition(
+(pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
+(conv): ParametrizedConv1d(
+1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
+(parametrizations): ModuleDict(
+(weight): ParametrizationList(
+(0): _WeightNorm()
+)
+)
+)
+(padding): Wav2Vec2SamePadLayer()
+(activation): GELUActivation()
+)
+(layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
+(dropout): Dropout(p=0.1, inplace=False)
+(layers): ModuleList(
+(0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
+(attention): Wav2Vec2SdpaAttention(
+(k_proj): Linear(in_features=1280, out_features=1280, bias=True)
+(v_proj): Linear(in_features=1280, out_features=1280, bias=True)
+(q_proj): Linear(in_features=1280, out_features=1280, bias=True)
+(out_proj): Linear(in_features=1280, out_features=1280, bias=True)
+)
+(dropout): Dropout(p=0.1, inplace=False)
+(layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
+(feed_forward): Wav2Vec2FeedForward(
+(intermediate_dropout): Dropout(p=0.0, inplace=False)
|
| 67 |
+
(intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
|
| 68 |
+
(intermediate_act_fn): GELUActivation()
|
| 69 |
+
(output_dense): Linear(in_features=5120, out_features=1280, bias=True)
|
| 70 |
+
(output_dropout): Dropout(p=0.1, inplace=False)
|
| 71 |
+
)
|
| 72 |
+
(final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
|
| 73 |
+
)
|
| 74 |
+
)
|
| 75 |
+
(ecapa_encoder): ModuleDict(
|
| 76 |
+
(32): IdentityEncoder()
|
| 77 |
+
(36): IdentityEncoder()
|
| 78 |
+
(40): IdentityEncoder()
|
| 79 |
+
(44): IdentityEncoder()
|
| 80 |
+
)
|
| 81 |
+
(pooling): ModuleDict(
|
| 82 |
+
(32): ChnAttnStatPooling(
|
| 83 |
+
(attention): Sequential(
|
| 84 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 85 |
+
(1): ReLU()
|
| 86 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 87 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 88 |
+
)
|
| 89 |
+
(softmax): Softmax(dim=2)
|
| 90 |
+
)
|
| 91 |
+
(36): ChnAttnStatPooling(
|
| 92 |
+
(attention): Sequential(
|
| 93 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 94 |
+
(1): ReLU()
|
| 95 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 96 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 97 |
+
)
|
| 98 |
+
(softmax): Softmax(dim=2)
|
| 99 |
+
)
|
| 100 |
+
(40): ChnAttnStatPooling(
|
| 101 |
+
(attention): Sequential(
|
| 102 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 103 |
+
(1): ReLU()
|
| 104 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 105 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 106 |
+
)
|
| 107 |
+
(softmax): Softmax(dim=2)
|
| 108 |
+
)
|
| 109 |
+
(44): ChnAttnStatPooling(
|
| 110 |
+
(attention): Sequential(
|
| 111 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 112 |
+
(1): ReLU()
|
| 113 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 114 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 115 |
+
)
|
| 116 |
+
(softmax): Softmax(dim=2)
|
| 117 |
+
)
|
| 118 |
+
)
|
| 119 |
+
(projector): ModuleDict(
|
| 120 |
+
(32): RawNet3Projector(
|
| 121 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 122 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 123 |
+
)
|
| 124 |
+
(36): RawNet3Projector(
|
| 125 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 126 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 127 |
+
)
|
| 128 |
+
(40): RawNet3Projector(
|
| 129 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 130 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 131 |
+
)
|
| 132 |
+
(44): RawNet3Projector(
|
| 133 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 134 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 135 |
+
)
|
| 136 |
+
)
|
| 137 |
+
(lang2vec_head): ModuleDict(
|
| 138 |
+
(32): Sequential(
|
| 139 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 140 |
+
)
|
| 141 |
+
(36): Sequential(
|
| 142 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 143 |
+
)
|
| 144 |
+
(40): Sequential(
|
| 145 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 146 |
+
)
|
| 147 |
+
(44): Sequential(
|
| 148 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 149 |
+
)
|
| 150 |
+
)
|
| 151 |
+
(aamsoftmax_weight): ParameterDict()
|
| 152 |
+
(lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
|
| 153 |
+
(aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
|
| 154 |
+
(ce): CrossEntropyLoss()
|
| 155 |
+
(lang2vec_head): Sequential(
|
| 156 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 157 |
+
)
|
| 158 |
+
(lang2vec_loss): MSELoss()
|
| 159 |
+
)
|
| 160 |
+
)
|
| 161 |
+
)
|
| 162 |
+
)
|
| 163 |
+
)
|
| 164 |
+
(featurizer): Featurizer()
|
| 165 |
+
)
|
| 166 |
+
(normalize): UtteranceMVN(norm_means=True, norm_vars=False)
|
| 167 |
+
(encoder): EcapaTdnnEncoder(
|
| 168 |
+
(conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
|
| 169 |
+
(relu): ReLU()
|
| 170 |
+
(bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 171 |
+
(layer1): EcapaBlock(
|
| 172 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 173 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 174 |
+
(convs): ModuleList(
|
| 175 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
|
| 176 |
+
)
|
| 177 |
+
(bns): ModuleList(
|
| 178 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 179 |
+
)
|
| 180 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 181 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 182 |
+
(relu): ReLU()
|
| 183 |
+
(se): SEModule(
|
| 184 |
+
(se): Sequential(
|
| 185 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 186 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 187 |
+
(2): ReLU()
|
| 188 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 189 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 190 |
+
(5): Sigmoid()
|
| 191 |
+
)
|
| 192 |
+
)
|
| 193 |
+
)
|
| 194 |
+
(layer2): EcapaBlock(
|
| 195 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 196 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 197 |
+
(convs): ModuleList(
|
| 198 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
|
| 199 |
+
)
|
| 200 |
+
(bns): ModuleList(
|
| 201 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 202 |
+
)
|
| 203 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 204 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 205 |
+
(relu): ReLU()
|
| 206 |
+
(se): SEModule(
|
| 207 |
+
(se): Sequential(
|
| 208 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 209 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 210 |
+
(2): ReLU()
|
| 211 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 212 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 213 |
+
(5): Sigmoid()
|
| 214 |
+
)
|
| 215 |
+
)
|
| 216 |
+
)
|
| 217 |
+
(layer3): EcapaBlock(
|
| 218 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 219 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 220 |
+
(convs): ModuleList(
|
| 221 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
|
| 222 |
+
)
|
| 223 |
+
(bns): ModuleList(
|
| 224 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 225 |
+
)
|
| 226 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 227 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 228 |
+
(relu): ReLU()
|
| 229 |
+
(se): SEModule(
|
| 230 |
+
(se): Sequential(
|
| 231 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 232 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 233 |
+
(2): ReLU()
|
| 234 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 235 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 236 |
+
(5): Sigmoid()
|
| 237 |
+
)
|
| 238 |
+
)
|
| 239 |
+
)
|
| 240 |
+
(layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
|
| 241 |
+
(mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
|
| 242 |
+
)
|
| 243 |
+
(pooling): ChnAttnStatPooling(
|
| 244 |
+
(attention): Sequential(
|
| 245 |
+
(0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
|
| 246 |
+
(1): ReLU()
|
| 247 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 248 |
+
(3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
|
| 249 |
+
)
|
| 250 |
+
(softmax): Softmax(dim=2)
|
| 251 |
+
)
|
| 252 |
+
(projector): RawNet3Projector(
|
| 253 |
+
(bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 254 |
+
(fc): Linear(in_features=3072, out_features=192, bias=True)
|
| 255 |
+
)
|
| 256 |
+
(loss): AAMSoftmaxSCTopKLang2Vec(
|
| 257 |
+
(ce): CrossEntropyLoss()
|
| 258 |
+
(lang2vec_head): Sequential(
|
| 259 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 260 |
+
)
|
| 261 |
+
(lang2vec_loss): MSELoss()
|
| 262 |
+
)
|
| 263 |
+
)
|
| 264 |
+
|
| 265 |
+
Model summary:
|
| 266 |
+
Class Name: ESPnetLIDUpstreamConditionModel
|
| 267 |
+
Total Number of model parameters: 977.14 M
|
| 268 |
+
Number of trainable parameters: 977.14 M (100.0%)
|
| 269 |
+
Size: 3.91 GB
|
| 270 |
+
Type: torch.float32
|
| 271 |
+
/u/qwang20/miniconda3/envs/espnet2/lib/python3.11/site-packages/torch/utils/data/dataloader.py:557: UserWarning: This DataLoader will create 32 worker processes in total. Our suggested max number of worker in current system is 16, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
|
| 272 |
+
warnings.warn(_create_warning_msg(
|
| 273 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/reporter.py:321: UserWarning: The stats of the previous epoch=-1doesn't exist.
|
| 274 |
+
warnings.warn(
|
| 275 |
+
[gpue04] 2025-06-02 00:37:19,091 (lid_trainer:102) INFO: [Rank 0] Resume: 0 utterances found in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/test_voxpopuli_lang_cross_train_all_no_filter_lang/lids0
|
| 276 |
+
[gpue04] 2025-06-02 00:38:18,371 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 0
|
| 277 |
+
[gpue04] 2025-06-02 00:39:13,446 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 1
|
| 278 |
+
[gpue04] 2025-06-02 00:40:09,054 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 2
|
| 279 |
+
[gpue04] 2025-06-02 00:41:05,743 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 3
|
| 280 |
+
[gpue04] 2025-06-02 00:42:03,918 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 4
|
| 281 |
+
[gpue04] 2025-06-02 00:43:01,470 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 5
|
| 282 |
+
[gpue04] 2025-06-02 00:44:00,210 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 6
|
| 283 |
+
[gpue04] 2025-06-02 00:45:05,238 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 7
|
| 284 |
+
[gpue04] 2025-06-02 00:46:04,557 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 8
|
| 285 |
+
[gpue04] 2025-06-02 00:47:20,175 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 9
|
| 286 |
+
[gpue04] 2025-06-02 00:48:16,561 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 10
|
| 287 |
+
[gpue04] 2025-06-02 00:49:08,874 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 11
|
| 288 |
+
[gpue04] 2025-06-02 00:50:08,094 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 12
|
| 289 |
+
[gpue04] 2025-06-02 00:51:11,746 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 13
|
| 290 |
+
[gpue04] 2025-06-02 00:52:10,238 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 14
|
| 291 |
+
[gpue04] 2025-06-02 00:53:15,344 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 15
|
| 292 |
+
[gpue04] 2025-06-02 00:54:19,104 (lid_inference_dist:200) INFO: args.save_embd_per_utt: True
|
| 293 |
+
[gpue04] 2025-06-02 00:54:19,105 (lid_inference_dist:215) INFO: args.save_tsne_plot: False
|
| 294 |
+
# Accounting: time=1054 threads=1
|
| 295 |
+
# Ended (code 0) at Mon Jun 2 00:54:20 CDT 2025, elapsed time 1054 seconds
|
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/inference/valid.accuracy.best/test_voxpopuli_lang_cross_train_all_no_filter_lang/results
ADDED
@@ -0,0 +1,197 @@
Accuracy: 98.95%
Macro Accuracy: 99.09%
Accuracy per Language:
hrv: 99.40%
ces: 98.75%
spa: 97.64%
hun: 98.83%
pol: 98.31%
slk: 97.35%
nld: 99.12%
eng: 99.51%
est: 100.00%
ron: 99.49%
slv: 99.35%
ita: 99.24%
lit: 100.00%
fra: 99.31%
deu: 99.29%
fin: 99.79%
Key: ces_20110609-0900-PLENARY-4-cs_20110609-11:20:13_0, Target: ces, Predicted: cym
Key: ces_20130610-0900-PLENARY-15-cs_20130610-20:51:10_5, Target: ces, Predicted: slk
Key: ces_20141126-0900-PLENARY-14-cs_20141126-18:28:01_1, Target: ces, Predicted: slk
Key: ces_20150209-0900-PLENARY-11-cs_20150209-21:09:35_2, Target: ces, Predicted: hun
Key: ces_20170403-0900-PLENARY-17-cs_20170403-20:24:45_0, Target: ces, Predicted: deu
Key: ces_20180614-0900-PLENARY-5-cs_20180614-11:09:11_0, Target: ces, Predicted: pol
Key: ces_20180614-0900-PLENARY-cs_20180614-11:09:11_0, Target: ces, Predicted: pol
Key: ces_20180612-0900-PLENARY-14-cs_20180612-17:15:35_13, Target: ces, Predicted: pol
Key: deu_20090202-0900-PLENARY-13-de_20090202-22:12:47_16, Target: deu, Predicted: ces
Key: ces_20180912-0900-PLENARY-widetrim-cs_20180912-16:32:19_1, Target: ces, Predicted: fra
Key: ces_20180912-0900-PLENARY-widetrim-cs_20180912-16:32:19_2, Target: ces, Predicted: fra
Key: ces_20180912-0900-PLENARY-widetrim-cs_20180912-16:32:19_3, Target: ces, Predicted: fra
Key: ces_20180912-0900-PLENARY-widetrim-cs_20180912-16:32:19_5, Target: ces, Predicted: fra
Key: ces_20180912-0900-PLENARY-widetrim-cs_20180912-16:32:19_6, Target: ces, Predicted: fra
Key: ces_20180612-0900-PLENARY-cs_20180612-17:15:35_14, Target: ces, Predicted: pol
Key: deu_20111116-0900-PLENARY-3-de_20111116-11:38:53_0, Target: deu, Predicted: ltz
Key: deu_20111024-0900-PLENARY-10-de_20111024-17:46:08_0, Target: deu, Predicted: eng
Key: deu_20131021-0900-PLENARY-10-de_20131021-19:11:07_0, Target: deu, Predicted: ell
Key: deu_20131022-0900-PLENARY-4-de_20131022-09:24:30_14, Target: deu, Predicted: hrv
Key: deu_20131022-0900-PLENARY-4-de_20131022-08:42:26_10, Target: deu, Predicted: nld
Key: deu_20160511-0900-PLENARY-14-de_20160511-15:48:52_1, Target: deu, Predicted: fra
Key: deu_20170314-0900-PLENARY-13-de_20170314-20:56:04_2, Target: deu, Predicted: nld
Key: deu_20170613-0900-PLENARY-20-de_20170613-22:55:01_13, Target: deu, Predicted: slv
Key: deu_20171025-0900-PLENARY-21-de_20171025-19:19:40_0, Target: deu, Predicted: ina
Key: deu_20180611-0900-PLENARY-11-de_20180611-18:10:02_0, Target: deu, Predicted: ron
Key: deu_20180912-0900-PLENARY-widetrim-de_20180912-16:34:37_1, Target: deu, Predicted: ell
Key: deu_20180912-0900-PLENARY-widetrim-de_20180912-19:37:22_2, Target: deu, Predicted: fra
Key: deu_20180912-0900-PLENARY-widetrim-de_20180912-20:47:17_1, Target: deu, Predicted: slk
Key: eng_20110310-0900-PLENARY-5-en_20110310-10:53:26_3, Target: eng, Predicted: hun
Key: eng_20120912-0900-PLENARY-9-en_20120912-16:27:37_7, Target: eng, Predicted: slv
Key: eng_20131022-0900-PLENARY-20-en_20131022-22:05:54_6, Target: eng, Predicted: nld
Key: eng_20131023-0900-PLENARY-11-en_20131023-17:16:39_6, Target: eng, Predicted: ces
Key: eng_20171114-0900-PLENARY-14-en_20171114-15:46:05_9, Target: eng, Predicted: ron
Key: eng_20180911-0900-PLENARY-witholdRO-en_20180911-18:37:21_2, Target: eng, Predicted: deu
Key: eng_20180613-0900-PLENARY-15-en_20180613-15:21:04_16, Target: eng, Predicted: deu
Key: eng_20180613-0900-PLENARY-15-en_20180613-15:21:04_6, Target: eng, Predicted: deu
Key: eng_20200914-0900-PLENARY-en_20200914-21:39:43_1, Target: eng, Predicted: slv
Key: fin_20140313-0900-PLENARY-14-fi_20140313-13:36:53_0, Target: fin, Predicted: ell
Key: fra_20111130-0900-PLENARY-11-fr_20111130-16:35:45_18, Target: fra, Predicted: nld
Key: fra_20111130-0900-PLENARY-11-fr_20111130-16:35:45_19, Target: fra, Predicted: pol
Key: fra_20131022-0900-PLENARY-14-fr_20131022-16:32:57_5, Target: fra, Predicted: ron
Key: fra_20140225-0900-PLENARY-11-fr_20140225-15:56:55_0, Target: fra, Predicted: deu
Key: fra_20140312-0900-PLENARY-15-fr_20140312-20:54:27_9, Target: fra, Predicted: ell
Key: fra_20160704-0900-PLENARY-13-fr_20160704-20:03:29_0, Target: fra, Predicted: nno
Key: fra_20170912-0900-PLENARY-21-fr_20170912-20:09:57_0, Target: fra, Predicted: deu
Key: fra_20180530-0900-PLENARY-3-fr_20180530-11:02:02_4, Target: fra, Predicted: ron
Key: fra_20180912-0900-PLENARY-widetrim-fr_20180912-19:32:09_2, Target: fra, Predicted: slk
Key: fra_20180912-0900-PLENARY-widetrim-fr_20180912-19:32:09_3, Target: fra, Predicted: fin
Key: fra_20180912-0900-PLENARY-widetrim-fr_20180912-19:32:09_5, Target: fra, Predicted: fin
Key: fra_20201019-0900-PLENARY-fr_20201019-19:35:21_8, Target: fra, Predicted: ita
Key: hrv_20140114-0900-PLENARY-6-hr_20140114-13:40:47_0, Target: hrv, Predicted: eng
Key: hrv_20151216-0900-PLENARY-16-hr_20151216-20:01:08_3, Target: hrv, Predicted: ita
Key: hrv_20170213-0900-PLENARY-18-hr_20170213-22:14:46_3, Target: hrv, Predicted: nld
Key: hun_20090203-0900-PLENARY-13-hu_20090203-21:55:15_8, Target: hun, Predicted: slv
Key: hun_20090204-0900-PLENARY-3-hu_20090204-10:53:37_0, Target: hun, Predicted: fra
Key: hrv_20181022-0900-PLENARY-hr_20181022-22:55:28_7, Target: hrv, Predicted: bos
Key: hun_20110117-0900-PLENARY-14-hu_20110117-21:41:35_0, Target: hun, Predicted: ita
Key: hun_20120313-0900-PLENARY-10-hu_20120313-17:33:54_0, Target: hun, Predicted: isl
Key: hun_20120313-0900-PLENARY-6-hu_20120313-12:45:53_0, Target: hun, Predicted: ita
Key: hun_20160414-0900-PLENARY-10-hu_20160414-13:26:41_0, Target: hun, Predicted: fin
Key: hun_20171212-0900-PLENARY-15-hu_20171212-16:18:30_0, Target: hun, Predicted: ron
Key: hun_20170705-0900-PLENARY-8-hu_20170705-12:19:50_0, Target: hun, Predicted: fra
Key: ita_20130521-0900-PLENARY-10-it_20130521-17:59:38_11, Target: ita, Predicted: spa
Key: hun_20180912-0900-PLENARY-widetrim-hu_20180912-22:41:34_2, Target: hun, Predicted: eng
Key: hun_20180912-0900-PLENARY-widetrim-hu_20180912-22:41:34_3, Target: hun, Predicted: eng
Key: hun_20180912-0900-PLENARY-widetrim-hu_20180912-22:41:34_4, Target: hun, Predicted: eng
Key: hun_20180912-0900-PLENARY-widetrim-hu_20180912-22:41:34_5, Target: hun, Predicted: eng
Key: hun_20180912-0900-PLENARY-widetrim-hu_20180912-22:41:34_6, Target: hun, Predicted: eng
Key: ita_20140702-0900-PLENARY-12-it_20140702-16:59:58_1, Target: ita, Predicted: spa
Key: ita_20140225-0900-PLENARY-6-it_20140225-13:50:00_0, Target: ita, Predicted: fra
Key: ita_20140226-0900-PLENARY-3-it_20140226-09:26:22_1, Target: ita, Predicted: fra
Key: ita_20140226-0900-PLENARY-3-it_20140226-09:26:22_7, Target: ita, Predicted: spa
Key: ita_20141021-0900-PLENARY-4-it_20141021-09:37:26_4, Target: ita, Predicted: ces
Key: ita_20151125-0900-PLENARY-7-it_20151125-12:02:07_15, Target: ita, Predicted: deu
Key: ita_20151125-0900-PLENARY-7-it_20151125-12:02:07_12, Target: ita, Predicted: ces
Key: ita_20180613-0900-PLENARY-17-it_20180613-17:05:13_2, Target: ita, Predicted: spa
Key: nld_20090311-0900-PLENARY-20-nl_20090311-21:13:31_16, Target: nld, Predicted: ces
Key: nld_20101019-0900-PLENARY-11-nl_20101019-18:14:25_20, Target: nld, Predicted: azz
Key: nld_20100120-0900-PLENARY-13-nl_20100120-21:54:24_0, Target: nld, Predicted: ell
Key: nld_20140116-0900-PLENARY-7-nl_20140116-12:49:10_0, Target: nld, Predicted: hun
Key: nld_20141021-0900-PLENARY-11-nl_20141021-17:19:09_13, Target: nld, Predicted: fra
Key: nld_20141021-0900-PLENARY-16-nl_20141021-22:35:16_6, Target: nld, Predicted: swe
Key: nld_20170613-0900-PLENARY-13-nl_20170613-16:02:56_0, Target: nld, Predicted: fra
Key: nld_20180315-0900-PLENARY-3-nl_20180315-09:28:48_0, Target: nld, Predicted: hun
Key: nld_20180912-0900-PLENARY-widetrim-nl_20180912-16:04:33_1, Target: nld, Predicted: deu
Key: nld_20180912-0900-PLENARY-widetrim-nl_20180912-16:04:33_3, Target: nld, Predicted: spa
Key: pol_20090324-0900-PLENARY-3-pl_20090324-09:54:57_10, Target: pol, Predicted: slk
Key: pol_20091124-0900-PLENARY-19-pl_20091124-23:31:52_0, Target: pol, Predicted: ukr
Key: pol_20091124-0900-PLENARY-19-pl_20091124-23:31:52_1, Target: pol, Predicted: ukr
Key: pol_20091124-0900-PLENARY-19-pl_20091124-23:31:52_2, Target: pol, Predicted: ukr
Key: pol_20091124-0900-PLENARY-19-pl_20091124-23:31:52_4, Target: pol, Predicted: ukr
Key: pol_20110512-0900-PLENARY-3-pl_20110512-11:04:51_2, Target: pol, Predicted: bel
Key: pol_20110512-0900-PLENARY-3-pl_20110512-11:04:51_3, Target: pol, Predicted: bel
Key: pol_20110705-0900-PLENARY-5-pl_20110705-12:17:25_0, Target: pol, Predicted: ita
Key: pol_20110706-0900-PLENARY-4-pl_20110706-13:12:02_0, Target: pol, Predicted: ita
Key: pol_20110915-0900-PLENARY-3-pl_20110915-09:28:56_3, Target: pol, Predicted: ukr
Key: pol_20111116-0900-PLENARY-9-pl_20111116-17:01:22_0, Target: pol, Predicted: eng
Key: pol_20111026-0900-PLENARY-15-pl_20111026-18:51:55_8, Target: pol, Predicted: ces
Key: pol_20111213-0900-PLENARY-10-pl_20111213-16:58:12_1, Target: pol, Predicted: bel
Key: pol_20111213-0900-PLENARY-5-pl_20111213-11:07:05_3, Target: pol, Predicted: ukr
Key: pol_20111213-0900-PLENARY-5-pl_20111213-11:07:05_4, Target: pol, Predicted: ukr
Key: pol_20120201-0900-PLENARY-13-pl_20120201-20:56:53_1, Target: pol, Predicted: bel
Key: pol_20120201-0900-PLENARY-13-pl_20120201-20:56:53_2, Target: pol, Predicted: bel
Key: pol_20120201-0900-PLENARY-13-pl_20120201-20:56:53_3, Target: pol, Predicted: bel
Key: pol_20120201-0900-PLENARY-13-pl_20120201-20:56:53_5, Target: pol, Predicted: bel
Key: pol_20120201-0900-PLENARY-13-pl_20120201-20:56:53_6, Target: pol, Predicted: ukr
Key: pol_20120611-0900-PLENARY-13-pl_20120611-17:23:16_0, Target: pol, Predicted: aze
Key: pol_20120614-0900-PLENARY-5-pl_20120614-11:12:48_3, Target: pol, Predicted: ukr
Key: pol_20131008-0900-PLENARY-3-pl_20131008-10:35:17_0, Target: pol, Predicted: nld
Key: pol_20131021-0900-PLENARY-12-pl_20131021-20:34:35_0, Target: pol, Predicted: eng
Key: pol_20151125-0900-PLENARY-16-pl_20151125-16:59:57_0, Target: pol, Predicted: hun
Key: pol_20170313-0900-PLENARY-11-pl_20170313-19:41:51_0, Target: pol, Predicted: deu
Key: pol_20180115-0900-PLENARY-11-pl_20180115-19:09:08_0, Target: pol, Predicted: ell
Key: pol_20180207-0900-PLENARY-9-pl_20180207-13:13:22_0, Target: pol, Predicted: deu
Key: pol_20171115-0900-PLENARY-4-pl_20171115-09:24:40_56, Target: pol, Predicted: fra
Key: pol_20180313-0900-PLENARY-18-pl_20180313-21:30:24_0, Target: pol, Predicted: spa
Key: pol_20180704-0900-PLENARY-pl_20180704-11:34:24_0, Target: pol, Predicted: sna
Key: ron_20090309-0900-PLENARY-14-ro_20090309-21:35:00_0, Target: ron, Predicted: deu
Key: ron_20090310-0900-PLENARY-19-ro_20090310-21:35:23_0, Target: ron, Predicted: eng
Key: ron_20130312-0900-PLENARY-5-ro_20130312-10:44:27_4, Target: ron, Predicted: slv
Key: ron_20140416-0900-PLENARY-4-ro_20140416-11:26:09_0, Target: ron, Predicted: hun
Key: ron_20180207-0900-PLENARY-17-ro_20180207-18:27:19_0, Target: ron, Predicted: ell
Key: ron_20180207-0900-PLENARY-17-ro_20180207-17:49:18_13, Target: ron, Predicted: ell
Key: ron_20180613-0900-PLENARY-6-ro_20180613-12:38:55_0, Target: ron, Predicted: ita
Key: slk_20090310-0900-PLENARY-9-sk_20090310-13:41:29_0, Target: slk, Predicted: eng
Key: slk_20091124-0900-PLENARY-19-sk_20091124-23:19:02_13, Target: slk, Predicted: ces
Key: slk_20091124-0900-PLENARY-19-sk_20091124-23:19:02_14, Target: slk, Predicted: ces
Key: slk_20130312-0900-PLENARY-11-sk_20130312-14:03:05_5, Target: slk, Predicted: ces
Key: slk_20091124-0900-PLENARY-19-sk_20091124-23:19:02_2, Target: slk, Predicted: ces
Key: slk_20090421-0900-PLENARY-23-sk_20090421-23:31:18_16, Target: slk, Predicted: ces
Key: slk_20091124-0900-PLENARY-19-sk_20091124-23:19:02_9, Target: slk, Predicted: ces
Key: slk_20131210-0900-PLENARY-11-sk_20131210-14:35:10_1, Target: slk, Predicted: ces
Key: slk_20131120-0900-PLENARY-12-sk_20131120-14:39:04_0, Target: slk, Predicted: ita
Key: slk_20150908-0900-PLENARY-12-sk_20150908-16:31:31_0, Target: slk, Predicted: nno
Key: slk_20150211-0900-PLENARY-10-sk_20150211-16:15:34_12, Target: slk, Predicted: ces
Key: slk_20151124-0900-PLENARY-13-sk_20151124-20:29:15_5, Target: slk, Predicted: ces
Key: slk_20180312-0900-PLENARY-20-sk_20180312-22:35:26_1, Target: slk, Predicted: ces
Key: slk_20180612-0900-PLENARY-8-sk_20180612-13:22:33_3, Target: slk, Predicted: ces
Key: slk_20180612-0900-PLENARY-sk_20180612-13:22:33_4, Target: slk, Predicted: ces
Key: slk_20201021-0900-PLENARY-sk_20201021-16:00:55_13, Target: slk, Predicted: ces
Key: slv_20171114-0900-PLENARY-14-sl_20171114-16:22:58_11, Target: slv, Predicted: hrv
Key: slv_20170704-0900-PLENARY-22-sl_20170704-23:03:46_0, Target: slv, Predicted: deu
Key: spa_20090203-0900-PLENARY-14-es_20090203-22:21:14_10, Target: spa, Predicted: ron
Key: spa_20090505-0900-PLENARY-3-es_20090505-09:56:00_7, Target: spa, Predicted: slv
Key: spa_20091215-0900-PLENARY-14-es_20091215-22:05:17_2, Target: spa, Predicted: ita
Key: spa_20100120-0900-PLENARY-5-es_20100120-12:42:38_2, Target: spa, Predicted: lit
Key: spa_20100615-0900-PLENARY-14-es_20100615-21:30:17_28, Target: spa, Predicted: ita
Key: spa_20140114-0900-PLENARY-6-es_20140114-13:41:54_0, Target: spa, Predicted: ita
Key: spa_20140114-0900-PLENARY-6-es_20140114-13:41:54_2, Target: spa, Predicted: ita
Key: spa_20141126-0900-PLENARY-13-es_20141126-16:38:25_2, Target: spa, Predicted: eng
Key: spa_20141126-0900-PLENARY-13-es_20141126-16:38:25_3, Target: spa, Predicted: eng
Key: spa_20141126-0900-PLENARY-13-es_20141126-16:38:25_4, Target: spa, Predicted: eng
Key: spa_20151007-0900-PLENARY-6-es_20151007-12:04:24_3, Target: spa, Predicted: fra
Key: spa_20151007-0900-PLENARY-6-es_20151007-12:04:24_5, Target: spa, Predicted: deu
Key: spa_20151007-0900-PLENARY-7-es_20151007-12:04:24_3, Target: spa, Predicted: deu
Key: spa_20151007-0900-PLENARY-7-es_20151007-12:04:24_5, Target: spa, Predicted: deu
Key: spa_20170216-0900-PLENARY-3-es_20170216-09:41:17_6, Target: spa, Predicted: ces
Key: spa_20170216-0900-PLENARY-5-es_20170216-10:42:11_1, Target: spa, Predicted: ces
Key: spa_20170404-0900-PLENARY-18-es_20170404-18:36:59_2, Target: spa, Predicted: deu
Key: spa_20170704-0900-PLENARY-21-es_20170704-21:50:24_2, Target: spa, Predicted: ita
Key: spa_20170704-0900-PLENARY-21-es_20170704-21:50:24_3, Target: spa, Predicted: ita
Key: spa_20170704-0900-PLENARY-21-es_20170704-21:50:24_4, Target: spa, Predicted: ita
Key: spa_20170704-0900-PLENARY-21-es_20170704-21:50:24_5, Target: spa, Predicted: ita
Key: spa_20171004-0900-PLENARY-3-es_20171004-10:28:48_0, Target: spa, Predicted: deu
Key: spa_20180529-0900-PLENARY-18-es_20180529-16:25:55_8, Target: spa, Predicted: ces
Key: spa_20180912-0900-PLENARY-widetrim-es_20180912-22:22:18_2, Target: spa, Predicted: deu
Key: spa_20180912-0900-PLENARY-widetrim-es_20180912-22:22:18_3, Target: spa, Predicted: deu
Key: spa_20180912-0900-PLENARY-widetrim-es_20180912-22:22:18_4, Target: spa, Predicted: deu
Key: spa_20180612-0900-PLENARY-14-es_20180612-17:34:51_0, Target: spa, Predicted: ita
Key: spa_20180912-0900-PLENARY-widetrim-es_20180912-22:22:18_5, Target: spa, Predicted: deu
Key: spa_20180612-0900-PLENARY-14-es_20180612-17:34:51_3, Target: spa, Predicted: ita
Key: spa_20180612-0900-PLENARY-14-es_20180612-17:34:51_4, Target: spa, Predicted: ita
Key: spa_20180612-0900-PLENARY-14-es_20180612-17:34:51_5, Target: spa, Predicted: ita
Key: spa_20180612-0900-PLENARY-es_20180612-17:34:51_0, Target: spa, Predicted: ita
Key: spa_20180612-0900-PLENARY-es_20180612-17:34:51_3, Target: spa, Predicted: ita
Key: spa_20180612-0900-PLENARY-es_20180612-17:34:51_4, Target: spa, Predicted: ita
Key: spa_20180612-0900-PLENARY-es_20180612-17:34:51_5, Target: spa, Predicted: ita
Key: spa_20180912-0900-PLENARY-widetrim-es_20180912-22:22:18_1, Target: spa, Predicted: deu
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/lid_inference_test.log
ADDED
@@ -0,0 +1,300 @@
# python3 -m espnet2.bin.lid_inference_dist --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
# Started at Mon Jun 2 02:37:15 CDT 2025
#
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_inference_dist.py --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang --dtype float32 --data_path_and_name_and_type dump/raw/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/wav.scp,speech,sound --data_path_and_name_and_type dump/raw/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/utt2spk,lid_labels,text --valid_batch_size 4 --lid_train_config exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml --lid_model_file exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/valid.accuracy.best.pth --use_preprocessor true --fix_duration false --num_workers 32 --extract_embd false --save_every 1000 --resume true --save_embd_per_utt true --save_embd_avg_lang true --save_tsne_plot false --ngpu 1 --multiprocessing_distributed True
[gpue04] 2025-06-02 02:37:35,038 (abs_task:2406) INFO: config file: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
  torchaudio.set_audio_backend("sox_io")
/work/nvme/bbjs/qwang20/espnet/espnet2/tasks/abs_task.py:2429: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  torch.load(model_file, map_location=device),
[gpue04] 2025-06-02 02:37:46,607 (lid_inference_dist:86) INFO: Model structure:
ESPnetLIDUpstreamConditionModel(
  (frontend): S3prlFrontendCondition(
    (upstream): S3PRLUpstreamCondition(
      (upstream): UpstreamExpertCondition(
        (model): Wav2Vec2ModelCondition(
          (feature_extractor): Wav2Vec2FeatureEncoder(
            (conv_layers): ModuleList(
              (0): Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (1-4): 4 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (5-6): 2 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
            )
          )
          (feature_projection): Wav2Vec2FeatureProjection(
            (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
            (projection): Linear(in_features=512, out_features=1280, bias=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (encoder): Wav2Vec2EncoderCondition(
            (pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
              (conv): ParametrizedConv1d(
                1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
                (parametrizations): ModuleDict(
                  (weight): ParametrizationList(
                    (0): _WeightNorm()
                  )
                )
              )
              (padding): Wav2Vec2SamePadLayer()
              (activation): GELUActivation()
            )
            (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
            (layers): ModuleList(
              (0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
                (attention): Wav2Vec2SdpaAttention(
                  (k_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (v_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (q_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (out_proj): Linear(in_features=1280, out_features=1280, bias=True)
                )
                (dropout): Dropout(p=0.1, inplace=False)
                (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
                (feed_forward): Wav2Vec2FeedForward(
                  (intermediate_dropout): Dropout(p=0.0, inplace=False)
                  (intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
                  (intermediate_act_fn): GELUActivation()
                  (output_dense): Linear(in_features=5120, out_features=1280, bias=True)
                  (output_dropout): Dropout(p=0.1, inplace=False)
                )
                (final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
              )
            )
            (ecapa_encoder): ModuleDict(
              (32): IdentityEncoder()
              (36): IdentityEncoder()
              (40): IdentityEncoder()
              (44): IdentityEncoder()
            )
            (pooling): ModuleDict(
              (32): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (36): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (40): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (44): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
            )
            (projector): ModuleDict(
              (32): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (36): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (40): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (44): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
            )
            (lang2vec_head): ModuleDict(
              (32): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (36): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (40): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (44): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
            )
            (aamsoftmax_weight): ParameterDict()
            (lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
            (aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
              (ce): CrossEntropyLoss()
              (lang2vec_head): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (lang2vec_loss): MSELoss()
            )
          )
        )
      )
    )
    (featurizer): Featurizer()
  )
  (normalize): UtteranceMVN(norm_means=True, norm_vars=False)
  (encoder): EcapaTdnnEncoder(
    (conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
    (relu): ReLU()
    (bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (layer1): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer2): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer3): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
    (mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
  )
  (pooling): ChnAttnStatPooling(
    (attention): Sequential(
      (0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
      (1): ReLU()
      (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
    )
    (softmax): Softmax(dim=2)
  )
  (projector): RawNet3Projector(
    (bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (fc): Linear(in_features=3072, out_features=192, bias=True)
  )
  (loss): AAMSoftmaxSCTopKLang2Vec(
    (ce): CrossEntropyLoss()
    (lang2vec_head): Sequential(
      (0): Linear(in_features=192, out_features=299, bias=True)
    )
    (lang2vec_loss): MSELoss()
  )
)

Model summary:
    Class Name: ESPnetLIDUpstreamConditionModel
    Total Number of model parameters: 977.14 M
    Number of trainable parameters: 977.14 M (100.0%)
    Size: 3.91 GB
    Type: torch.float32
/u/qwang20/miniconda3/envs/espnet2/lib/python3.11/site-packages/torch/utils/data/dataloader.py:557: UserWarning: This DataLoader will create 32 worker processes in total. Our suggested max number of worker in current system is 16, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
  warnings.warn(_create_warning_msg(
/work/nvme/bbjs/qwang20/espnet/espnet2/train/reporter.py:321: UserWarning: The stats of the previous epoch=-1doesn't exist.
  warnings.warn(
[gpue04] 2025-06-02 02:37:47,156 (lid_trainer:102) INFO: [Rank 0] Resume: 0 utterances found in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/inference/valid.accuracy.best/dev_babel_over_10s_lang_cross_train_all_no_filter_lang/lids0
[gpue04] 2025-06-02 02:38:41,828 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 0
[gpue04] 2025-06-02 02:39:27,483 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 1
[gpue04] 2025-06-02 02:40:15,909 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 2
[gpue04] 2025-06-02 02:41:08,571 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 3
[gpue04] 2025-06-02 02:41:56,182 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 4
[gpue04] 2025-06-02 02:42:40,736 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 5
[gpue04] 2025-06-02 02:43:27,814 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 6
[gpue04] 2025-06-02 02:44:10,740 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 7
[gpue04] 2025-06-02 02:44:52,065 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 8
[gpue04] 2025-06-02 02:45:40,635 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 9
[gpue04] 2025-06-02 02:46:28,394 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 10
[gpue04] 2025-06-02 02:47:09,502 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 11
[gpue04] 2025-06-02 02:47:59,978 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 12
[gpue04] 2025-06-02 02:48:52,866 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 13
[gpue04] 2025-06-02 02:49:41,279 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 14
[gpue04] 2025-06-02 02:50:32,817 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 15
[gpue04] 2025-06-02 02:51:20,444 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 16
[gpue04] 2025-06-02 02:52:09,714 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 17
[gpue04] 2025-06-02 02:52:55,108 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 18
[gpue04] 2025-06-02 02:53:50,212 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 19
[gpue04] 2025-06-02 02:54:31,533 (lid_trainer:207) INFO: [Rank 0] Saved 1000 utts at step 20
[gpue04] 2025-06-02 02:55:19,223 (lid_inference_dist:200) INFO: args.save_embd_per_utt: True
[gpue04] 2025-06-02 02:55:19,224 (lid_inference_dist:215) INFO: args.save_tsne_plot: False
# Accounting: time=1085 threads=1
# Ended (code 0) at Mon Jun 2 02:55:20 CDT 2025, elapsed time 1085 seconds
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.1.log
ADDED
@@ -0,0 +1,390 @@
# python3 -m espnet2.bin.lid_train --use_preprocessor true --resume true --ignore_init_mismatch false --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/wav.scp,speech,sound --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/utt2spk,lid_labels,text --train_shape_file exp_all_no_filter_raw/spk_stats_16k/train/speech_shape --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/wav.scp,speech,sound --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/utt2spk,lid_labels,text --spk2utt dump/raw/train_all_no_filter_lang/spk2utt --spk_num 157 --fold_length 120000 --valid_shape_file exp_all_no_filter_raw/spk_stats_16k/valid/speech_shape --config /work/nvme/bbjs/qwang20/espnet/egs2/lid_delta/lid1/conf/mms_1b_all_no_filter_balanced_dataset/mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch.yaml --use_wandb true --wandb_project lid --wandb_entity qingzhew-carnegie-mellon-university --ngpu 1 --multiprocessing_distributed True
# Started at Wed Jun 4 20:24:52 CDT 2025
#
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_train.py --use_preprocessor true --resume true --ignore_init_mismatch false --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/wav.scp,speech,sound --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/utt2spk,lid_labels,text --train_shape_file exp_all_no_filter_raw/spk_stats_16k/train/speech_shape --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/wav.scp,speech,sound --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/utt2spk,lid_labels,text --spk2utt dump/raw/train_all_no_filter_lang/spk2utt --spk_num 157 --fold_length 120000 --valid_shape_file exp_all_no_filter_raw/spk_stats_16k/valid/speech_shape --config /work/nvme/bbjs/qwang20/espnet/egs2/lid_delta/lid1/conf/mms_1b_all_no_filter_balanced_dataset/mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch.yaml --use_wandb true --wandb_project lid --wandb_entity qingzhew-carnegie-mellon-university --ngpu 1 --multiprocessing_distributed True
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
  torchaudio.set_audio_backend("sox_io")
[gpue08] 2025-06-04 20:25:25,391 (abs_task:1420) INFO: pytorch.version=2.4.0+cu118, cuda.available=True, cudnn.version=90100, cudnn.benchmark=True, cudnn.deterministic=False
[gpue08] 2025-06-04 20:25:25,398 (abs_task:1421) INFO: Model structure:
ESPnetLIDUpstreamConditionModel(
  (frontend): S3prlFrontendCondition(
    (upstream): S3PRLUpstreamCondition(
      (upstream): UpstreamExpertCondition(
        (model): Wav2Vec2ModelCondition(
          (feature_extractor): Wav2Vec2FeatureEncoder(
            (conv_layers): ModuleList(
              (0): Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (1-4): 4 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (5-6): 2 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
            )
          )
          (feature_projection): Wav2Vec2FeatureProjection(
            (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
            (projection): Linear(in_features=512, out_features=1280, bias=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (encoder): Wav2Vec2EncoderCondition(
            (pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
              (conv): ParametrizedConv1d(
                1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
                (parametrizations): ModuleDict(
                  (weight): ParametrizationList(
                    (0): _WeightNorm()
                  )
                )
              )
              (padding): Wav2Vec2SamePadLayer()
              (activation): GELUActivation()
            )
            (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
            (layers): ModuleList(
              (0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
                (attention): Wav2Vec2SdpaAttention(
                  (k_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (v_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (q_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (out_proj): Linear(in_features=1280, out_features=1280, bias=True)
                )
                (dropout): Dropout(p=0.1, inplace=False)
                (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
                (feed_forward): Wav2Vec2FeedForward(
                  (intermediate_dropout): Dropout(p=0.0, inplace=False)
                  (intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
                  (intermediate_act_fn): GELUActivation()
                  (output_dense): Linear(in_features=5120, out_features=1280, bias=True)
                  (output_dropout): Dropout(p=0.1, inplace=False)
                )
                (final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
              )
            )
            (ecapa_encoder): ModuleDict(
              (32): IdentityEncoder()
              (36): IdentityEncoder()
              (40): IdentityEncoder()
              (44): IdentityEncoder()
            )
            (pooling): ModuleDict(
              (32): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (36): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (40): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (44): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
            )
            (projector): ModuleDict(
              (32): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (36): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (40): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (44): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
            )
(lang2vec_head): ModuleDict(
|
| 136 |
+
(32): Sequential(
|
| 137 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 138 |
+
)
|
| 139 |
+
(36): Sequential(
|
| 140 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 141 |
+
)
|
| 142 |
+
(40): Sequential(
|
| 143 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 144 |
+
)
|
| 145 |
+
(44): Sequential(
|
| 146 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 147 |
+
)
|
| 148 |
+
)
|
| 149 |
+
(aamsoftmax_weight): ParameterDict()
|
| 150 |
+
(lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
|
| 151 |
+
(aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
|
| 152 |
+
(ce): CrossEntropyLoss()
|
| 153 |
+
(lang2vec_head): Sequential(
|
| 154 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 155 |
+
)
|
| 156 |
+
(lang2vec_loss): MSELoss()
|
| 157 |
+
)
|
| 158 |
+
)
|
| 159 |
+
)
|
| 160 |
+
)
|
| 161 |
+
)
|
| 162 |
+
(featurizer): Featurizer()
|
| 163 |
+
)
|
| 164 |
+
(normalize): UtteranceMVN(norm_means=True, norm_vars=False)
|
| 165 |
+
(encoder): EcapaTdnnEncoder(
|
| 166 |
+
(conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
|
| 167 |
+
(relu): ReLU()
|
| 168 |
+
(bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 169 |
+
(layer1): EcapaBlock(
|
| 170 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 171 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 172 |
+
(convs): ModuleList(
|
| 173 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
|
| 174 |
+
)
|
| 175 |
+
(bns): ModuleList(
|
| 176 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 177 |
+
)
|
| 178 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 179 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 180 |
+
(relu): ReLU()
|
| 181 |
+
(se): SEModule(
|
| 182 |
+
(se): Sequential(
|
| 183 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 184 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 185 |
+
(2): ReLU()
|
| 186 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 187 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 188 |
+
(5): Sigmoid()
|
| 189 |
+
)
|
| 190 |
+
)
|
| 191 |
+
)
|
| 192 |
+
(layer2): EcapaBlock(
|
| 193 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 194 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 195 |
+
(convs): ModuleList(
|
| 196 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
|
| 197 |
+
)
|
| 198 |
+
(bns): ModuleList(
|
| 199 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 200 |
+
)
|
| 201 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 202 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 203 |
+
(relu): ReLU()
|
| 204 |
+
(se): SEModule(
|
| 205 |
+
(se): Sequential(
|
| 206 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 207 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 208 |
+
(2): ReLU()
|
| 209 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 210 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 211 |
+
(5): Sigmoid()
|
| 212 |
+
)
|
| 213 |
+
)
|
| 214 |
+
)
|
| 215 |
+
(layer3): EcapaBlock(
|
| 216 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 217 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 218 |
+
(convs): ModuleList(
|
| 219 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
|
| 220 |
+
)
|
| 221 |
+
(bns): ModuleList(
|
| 222 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 223 |
+
)
|
| 224 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 225 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 226 |
+
(relu): ReLU()
|
| 227 |
+
(se): SEModule(
|
| 228 |
+
(se): Sequential(
|
| 229 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 230 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 231 |
+
(2): ReLU()
|
| 232 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 233 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 234 |
+
(5): Sigmoid()
|
| 235 |
+
)
|
| 236 |
+
)
|
| 237 |
+
)
|
| 238 |
+
(layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
|
| 239 |
+
(mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
|
| 240 |
+
)
|
| 241 |
+
(pooling): ChnAttnStatPooling(
|
| 242 |
+
(attention): Sequential(
|
| 243 |
+
(0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
|
| 244 |
+
(1): ReLU()
|
| 245 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 246 |
+
(3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
|
| 247 |
+
)
|
| 248 |
+
(softmax): Softmax(dim=2)
|
| 249 |
+
)
|
| 250 |
+
(projector): RawNet3Projector(
|
| 251 |
+
(bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 252 |
+
(fc): Linear(in_features=3072, out_features=192, bias=True)
|
| 253 |
+
)
|
| 254 |
+
(loss): AAMSoftmaxSCTopKLang2Vec(
|
| 255 |
+
(ce): CrossEntropyLoss()
|
| 256 |
+
(lang2vec_head): Sequential(
|
| 257 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 258 |
+
)
|
| 259 |
+
(lang2vec_loss): MSELoss()
|
| 260 |
+
)
|
| 261 |
+
)
|
| 262 |
+
|
| 263 |
+
Model summary:
|
| 264 |
+
Class Name: ESPnetLIDUpstreamConditionModel
|
| 265 |
+
Total Number of model parameters: 977.14 M
|
| 266 |
+
Number of trainable parameters: 977.14 M (100.0%)
|
| 267 |
+
Size: 3.91 GB
|
| 268 |
+
Type: torch.float32
|
| 269 |
+
[gpue08] 2025-06-04 20:25:25,398 (abs_task:1424) INFO: Optimizer:
|
| 270 |
+
Adam (
|
| 271 |
+
Parameter Group 0
|
| 272 |
+
amsgrad: False
|
| 273 |
+
betas: [0.9, 0.98]
|
| 274 |
+
capturable: False
|
| 275 |
+
differentiable: False
|
| 276 |
+
eps: 1e-08
|
| 277 |
+
foreach: None
|
| 278 |
+
fused: None
|
| 279 |
+
initial_lr: 1e-05
|
| 280 |
+
lr: 6.0032e-06
|
| 281 |
+
maximize: False
|
| 282 |
+
weight_decay: 0
|
| 283 |
+
)
|
| 284 |
+
[gpue08] 2025-06-04 20:25:25,398 (abs_task:1425) INFO: Scheduler: TristageLR(warmup_steps=1250)(hold_steps=5000)(decay_steps=6250)(init_lr_scale=0.6)(final_lr_scale=0.1)(decay_factor=0.00036841361487904725)
|
| 285 |
+
[gpue08] 2025-06-04 20:25:25,404 (abs_task:1434) INFO: Saving the configuration in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/config.yaml
|
| 286 |
+
[gpue08] 2025-06-04 20:25:25,693 (preprocessor:2245) INFO: Using lang2vec geo
|
| 287 |
+
# Accounting: time=218 threads=1
|
| 288 |
+
# Ended (code 0) at Wed Jun 4 20:25:32 CDT 2025, elapsed time 218 seconds
|
| 289 |
+
[gpue08] 2025-06-04 20:25:41,611 (abs_task:1899) WARNING: Reading dump/raw/train_all_no_filter_lang/category2utt
|
| 290 |
+
[gpue08] 2025-06-04 20:25:41,660 (abs_task:1946) WARNING: Reading dump/raw/train_all_no_filter_lang/dataset2utt
|
| 291 |
+
[gpue08] 2025-06-04 20:25:41,663 (abs_task:1962) WARNING: Reading dump/raw/train_all_no_filter_lang/utt2dataset
|
| 292 |
+
[gpue08] 2025-06-04 20:27:58,237 (abs_task:1997) INFO: [train] dataset:
|
| 293 |
+
ESPnetDataset(
|
| 294 |
+
speech: {"path": "dump/raw/train_all_no_filter_lang/wav.scp", "type": "sound"}
|
| 295 |
+
lid_labels: {"path": "dump/raw/train_all_no_filter_lang/utt2spk", "type": "text"}
|
| 296 |
+
preprocess: espnet2.train.preprocessor.LIDPreprocessor(train=True, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False))
|
| 297 |
+
[gpue08] 2025-06-04 20:27:58,256 (abs_task:1998) INFO: [train] process_fn: espnet2.train.preprocessor.LIDPreprocessor(train=True, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False)
|
| 298 |
+
[gpue08] 2025-06-04 20:27:58,256 (abs_task:1999) INFO: [train] collate_fn: <class 'espnet2.train.collate_fn.CommonCollateFn'>(float_pad_value=0.0, int_pad_value=0.0)
|
| 299 |
+
[gpue08] 2025-06-04 20:27:58,256 (abs_task:2000) INFO: [train] Batch sampler: CategoryPowerSamplerBalancedDataset(N-batch=727476, batch_bins=1440000, language_upsampling_factor=0.5, dataset_upsampling_factor=0.3)
|
| 300 |
+
[gpue08] 2025-06-04 20:27:58,323 (abs_task:2001) INFO: [train] mini-batch sizes summary: N-batch=727476, mean=6.0, min=1, max=6
|
| 301 |
+
[gpue08] 2025-06-04 20:27:58,742 (preprocessor:2245) INFO: Using lang2vec geo
|
| 302 |
+
[gpue08] 2025-06-04 20:28:11,299 (abs_task:1899) WARNING: Reading dump/raw/dev_ml_superb2_lang/category2utt
|
| 303 |
+
[gpue08] 2025-06-04 20:28:11,301 (abs_task:1946) WARNING: Reading dump/raw/dev_ml_superb2_lang/dataset2utt
|
| 304 |
+
[gpue08] 2025-06-04 20:28:11,302 (abs_task:1962) WARNING: Reading dump/raw/dev_ml_superb2_lang/utt2dataset
|
| 305 |
+
[gpue08] 2025-06-04 20:28:12,337 (abs_task:1997) INFO: [valid] dataset:
|
| 306 |
+
ESPnetDataset(
|
| 307 |
+
speech: {"path": "dump/raw/dev_ml_superb2_lang/wav.scp", "type": "sound"}
|
| 308 |
+
lid_labels: {"path": "dump/raw/dev_ml_superb2_lang/utt2spk", "type": "text"}
|
| 309 |
+
preprocess: espnet2.train.preprocessor.LIDPreprocessor(train=False, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False))
|
| 310 |
+
[gpue08] 2025-06-04 20:28:12,337 (abs_task:1998) INFO: [valid] process_fn: espnet2.train.preprocessor.LIDPreprocessor(train=False, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False)
|
| 311 |
+
[gpue08] 2025-06-04 20:28:12,338 (abs_task:1999) INFO: [valid] collate_fn: <class 'espnet2.train.collate_fn.CommonCollateFn'>(float_pad_value=0.0, int_pad_value=0.0)
|
| 312 |
+
[gpue08] 2025-06-04 20:28:12,338 (abs_task:2000) INFO: [valid] Batch sampler: CategoryPowerSamplerBalancedDataset(N-batch=4722, batch_bins=1440000, language_upsampling_factor=0.5, dataset_upsampling_factor=0.3)
|
| 313 |
+
[gpue08] 2025-06-04 20:28:12,338 (abs_task:2001) INFO: [valid] mini-batch sizes summary: N-batch=4722, mean=6.0, min=4, max=6
|
| 314 |
+
wandb: Currently logged in as: qingzhew (qingzhew-carnegie-mellon-university) to https://api.wandb.ai. Use `wandb login --relogin` to force relogin
|
| 315 |
+
wandb: Tracking run with wandb version 0.19.10
|
| 316 |
+
wandb: Run data is saved locally in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/wandb/run-20250604_202812-6dkg2ayp
|
| 317 |
+
wandb: Run `wandb offline` to turn off syncing.
|
| 318 |
+
wandb: Syncing run mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch
|
| 319 |
+
wandb: ⭐️ View project at https://wandb.ai/qingzhew-carnegie-mellon-university/lid
|
| 320 |
+
wandb: 🚀 View run at https://wandb.ai/qingzhew-carnegie-mellon-university/lid/runs/6dkg2ayp
|
| 321 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:218: FutureWarning: `torch.cuda.amp.GradScaler(args...)` is deprecated. Please use `torch.amp.GradScaler('cuda', args...)` instead.
|
| 322 |
+
scaler = GradScaler()
|
| 323 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:159: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
|
| 324 |
+
states = torch.load(
|
| 325 |
+
[gpue08] 2025-06-04 20:28:22,171 (trainer:176) INFO: The training was resumed using exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/checkpoint.pth
|
| 326 |
+
[gpue08] 2025-06-04 20:28:22,239 (trainer:251) INFO: Frontend featurizer weights for each layer:
|
| 327 |
+
Parameter containing:
|
| 328 |
+
tensor([-0.0056, -0.0141, -0.0168, -0.0187, -0.0203, -0.0225, -0.0231, -0.0246,
|
| 329 |
+
-0.0253, -0.0252, -0.0254, -0.0241, -0.0226, -0.0200, -0.0162, -0.0120,
|
| 330 |
+
-0.0095, -0.0059, -0.0017, 0.0058, 0.0097, 0.0142, 0.0175, 0.0196,
|
| 331 |
+
0.0211, 0.0224, 0.0228, 0.0230, 0.0226, 0.0224, 0.0215, 0.0210,
|
| 332 |
+
0.0196, 0.0176, 0.0157, 0.0126, 0.0095, 0.0070, 0.0051, 0.0037,
|
| 333 |
+
0.0020, -0.0003, -0.0030, -0.0056, -0.0076, -0.0090, -0.0096, -0.0102,
|
| 334 |
+
-0.0102], device='cuda:0', requires_grad=True)
|
| 335 |
+
[gpue08] 2025-06-04 20:28:22,239 (trainer:267) INFO: Error: 'Linear' object is not subscriptable
|
| 336 |
+
[gpue08] 2025-06-04 20:28:22,240 (trainer:272) INFO: cos_mp: 1.0
|
| 337 |
+
[gpue08] 2025-06-04 20:28:22,240 (trainer:273) INFO: easy_margin: False
|
| 338 |
+
[gpue08] 2025-06-04 20:28:22,240 (trainer:281) WARNING: The training has already reached at max_epoch: 34
|
| 339 |
+
[gpue08] 2025-06-04 20:28:22,253 (trainer:541) INFO: The training was finished at 33 epochs
|
| 340 |
+
[gpue08] 2025-06-04 20:28:22,253 (average_nbest_models:69) INFO: Averaging 2best models: criterion="valid.accuracy": exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/valid.accuracy.ave_2best.pth
|
| 341 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/main_funcs/average_nbest_models.py:77: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
|
| 342 |
+
_loaded[e] = torch.load(
|
| 343 |
+
[gpue08] 2025-06-04 20:28:27,695 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.32.attention.2.num_batches_tracked instead of averaging
|
| 344 |
+
[gpue08] 2025-06-04 20:28:27,695 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.36.attention.2.num_batches_tracked instead of averaging
|
| 345 |
+
[gpue08] 2025-06-04 20:28:27,696 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.40.attention.2.num_batches_tracked instead of averaging
|
| 346 |
+
[gpue08] 2025-06-04 20:28:27,697 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.44.attention.2.num_batches_tracked instead of averaging
|
| 347 |
+
[gpue08] 2025-06-04 20:28:27,697 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.32.bn.num_batches_tracked instead of averaging
|
| 348 |
+
[gpue08] 2025-06-04 20:28:27,698 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.36.bn.num_batches_tracked instead of averaging
|
| 349 |
+
[gpue08] 2025-06-04 20:28:27,698 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.40.bn.num_batches_tracked instead of averaging
|
| 350 |
+
[gpue08] 2025-06-04 20:28:27,699 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.44.bn.num_batches_tracked instead of averaging
|
| 351 |
+
[gpue08] 2025-06-04 20:28:27,701 (average_nbest_models:96) INFO: Accumulating encoder.bn.num_batches_tracked instead of averaging
|
| 352 |
+
[gpue08] 2025-06-04 20:28:27,701 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bn1.num_batches_tracked instead of averaging
|
| 353 |
+
[gpue08] 2025-06-04 20:28:27,701 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.0.num_batches_tracked instead of averaging
|
| 354 |
+
[gpue08] 2025-06-04 20:28:27,702 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.1.num_batches_tracked instead of averaging
|
| 355 |
+
[gpue08] 2025-06-04 20:28:27,702 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.2.num_batches_tracked instead of averaging
|
| 356 |
+
[gpue08] 2025-06-04 20:28:27,702 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.3.num_batches_tracked instead of averaging
|
| 357 |
+
[gpue08] 2025-06-04 20:28:27,702 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.4.num_batches_tracked instead of averaging
|
| 358 |
+
[gpue08] 2025-06-04 20:28:27,702 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.5.num_batches_tracked instead of averaging
|
| 359 |
+
[gpue08] 2025-06-04 20:28:27,702 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.6.num_batches_tracked instead of averaging
|
| 360 |
+
[gpue08] 2025-06-04 20:28:27,702 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bn3.num_batches_tracked instead of averaging
|
| 361 |
+
[gpue08] 2025-06-04 20:28:27,702 (average_nbest_models:96) INFO: Accumulating encoder.layer1.se.se.3.num_batches_tracked instead of averaging
|
| 362 |
+
[gpue08] 2025-06-04 20:28:27,703 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bn1.num_batches_tracked instead of averaging
|
| 363 |
+
[gpue08] 2025-06-04 20:28:27,703 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.0.num_batches_tracked instead of averaging
|
| 364 |
+
[gpue08] 2025-06-04 20:28:27,703 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.1.num_batches_tracked instead of averaging
|
| 365 |
+
[gpue08] 2025-06-04 20:28:27,703 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.2.num_batches_tracked instead of averaging
|
| 366 |
+
[gpue08] 2025-06-04 20:28:27,703 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.3.num_batches_tracked instead of averaging
|
| 367 |
+
[gpue08] 2025-06-04 20:28:27,703 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.4.num_batches_tracked instead of averaging
|
| 368 |
+
[gpue08] 2025-06-04 20:28:27,703 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.5.num_batches_tracked instead of averaging
|
| 369 |
+
[gpue08] 2025-06-04 20:28:27,704 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.6.num_batches_tracked instead of averaging
|
| 370 |
+
[gpue08] 2025-06-04 20:28:27,704 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bn3.num_batches_tracked instead of averaging
|
| 371 |
+
[gpue08] 2025-06-04 20:28:27,704 (average_nbest_models:96) INFO: Accumulating encoder.layer2.se.se.3.num_batches_tracked instead of averaging
|
| 372 |
+
[gpue08] 2025-06-04 20:28:27,704 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bn1.num_batches_tracked instead of averaging
|
| 373 |
+
[gpue08] 2025-06-04 20:28:27,704 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.0.num_batches_tracked instead of averaging
|
| 374 |
+
[gpue08] 2025-06-04 20:28:27,705 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.1.num_batches_tracked instead of averaging
|
| 375 |
+
[gpue08] 2025-06-04 20:28:27,705 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.2.num_batches_tracked instead of averaging
|
| 376 |
+
[gpue08] 2025-06-04 20:28:27,705 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.3.num_batches_tracked instead of averaging
|
| 377 |
+
[gpue08] 2025-06-04 20:28:27,705 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.4.num_batches_tracked instead of averaging
|
| 378 |
+
[gpue08] 2025-06-04 20:28:27,705 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.5.num_batches_tracked instead of averaging
|
| 379 |
+
[gpue08] 2025-06-04 20:28:27,705 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.6.num_batches_tracked instead of averaging
|
| 380 |
+
[gpue08] 2025-06-04 20:28:27,705 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bn3.num_batches_tracked instead of averaging
|
| 381 |
+
[gpue08] 2025-06-04 20:28:27,705 (average_nbest_models:96) INFO: Accumulating encoder.layer3.se.se.3.num_batches_tracked instead of averaging
|
| 382 |
+
[gpue08] 2025-06-04 20:28:27,707 (average_nbest_models:96) INFO: Accumulating pooling.attention.2.num_batches_tracked instead of averaging
|
| 383 |
+
[gpue08] 2025-06-04 20:28:27,707 (average_nbest_models:96) INFO: Accumulating projector.bn.num_batches_tracked instead of averaging
|
| 384 |
+
wandb:
|
| 385 |
+
wandb: 🚀 View run mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch at: https://wandb.ai/qingzhew-carnegie-mellon-university/lid/runs/6dkg2ayp
|
| 386 |
+
wandb: ⭐️ View project at: https://wandb.ai/qingzhew-carnegie-mellon-university/lid
|
| 387 |
+
wandb: Synced 5 W&B file(s), 0 media file(s), 0 artifact file(s) and 0 other file(s)
|
| 388 |
+
wandb: Find logs at: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/wandb/run-20250604_202812-6dkg2ayp/logs
|
| 389 |
+
# Accounting: time=221 threads=1
|
| 390 |
+
# Ended (code 0) at Wed Jun 4 20:28:33 CDT 2025, elapsed time 221 seconds
|
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.2.log
ADDED
|
@@ -0,0 +1,441 @@
| 1 |
+
# python3 -m espnet2.bin.lid_train --use_preprocessor true --resume true --ignore_init_mismatch false --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/wav.scp,speech,sound --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/utt2spk,lid_labels,text --train_shape_file exp_all_no_filter_raw/spk_stats_16k/train/speech_shape --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/wav.scp,speech,sound --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/utt2spk,lid_labels,text --spk2utt dump/raw/train_all_no_filter_lang/spk2utt --spk_num 157 --fold_length 120000 --valid_shape_file exp_all_no_filter_raw/spk_stats_16k/valid/speech_shape --config /work/nvme/bbjs/qwang20/espnet/egs2/lid_delta/lid1/conf/mms_1b_all_no_filter_balanced_dataset/mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch.yaml --use_wandb true --wandb_project lid --wandb_entity qingzhew-carnegie-mellon-university --ngpu 1 --multiprocessing_distributed True
|
| 2 |
+
# Started at Wed Jun 4 20:21:54 CDT 2025
|
| 3 |
+
#
|
| 4 |
+
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_train.py --use_preprocessor true --resume true --ignore_init_mismatch false --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/wav.scp,speech,sound --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/utt2spk,lid_labels,text --train_shape_file exp_all_no_filter_raw/spk_stats_16k/train/speech_shape --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/wav.scp,speech,sound --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/utt2spk,lid_labels,text --spk2utt dump/raw/train_all_no_filter_lang/spk2utt --spk_num 157 --fold_length 120000 --valid_shape_file exp_all_no_filter_raw/spk_stats_16k/valid/speech_shape --config /work/nvme/bbjs/qwang20/espnet/egs2/lid_delta/lid1/conf/mms_1b_all_no_filter_balanced_dataset/mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch.yaml --use_wandb true --wandb_project lid --wandb_entity qingzhew-carnegie-mellon-university --ngpu 1 --multiprocessing_distributed True
|
| 5 |
+
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
|
| 6 |
+
torchaudio.set_audio_backend("sox_io")
|
| 7 |
+
[gpue06] 2025-06-04 20:22:27,532 (abs_task:1420) INFO: pytorch.version=2.4.0+cu118, cuda.available=True, cudnn.version=90100, cudnn.benchmark=True, cudnn.deterministic=False
[gpue06] 2025-06-04 20:22:27,538 (abs_task:1421) INFO: Model structure:
ESPnetLIDUpstreamConditionModel(
  (frontend): S3prlFrontendCondition(
    (upstream): S3PRLUpstreamCondition(
      (upstream): UpstreamExpertCondition(
        (model): Wav2Vec2ModelCondition(
          (feature_extractor): Wav2Vec2FeatureEncoder(
            (conv_layers): ModuleList(
              (0): Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (1-4): 4 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (5-6): 2 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
            )
          )
          (feature_projection): Wav2Vec2FeatureProjection(
            (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
            (projection): Linear(in_features=512, out_features=1280, bias=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (encoder): Wav2Vec2EncoderCondition(
            (pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
              (conv): ParametrizedConv1d(
                1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
                (parametrizations): ModuleDict(
                  (weight): ParametrizationList(
                    (0): _WeightNorm()
                  )
                )
              )
              (padding): Wav2Vec2SamePadLayer()
              (activation): GELUActivation()
            )
            (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
            (layers): ModuleList(
              (0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
                (attention): Wav2Vec2SdpaAttention(
                  (k_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (v_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (q_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (out_proj): Linear(in_features=1280, out_features=1280, bias=True)
                )
                (dropout): Dropout(p=0.1, inplace=False)
                (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
                (feed_forward): Wav2Vec2FeedForward(
                  (intermediate_dropout): Dropout(p=0.0, inplace=False)
                  (intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
                  (intermediate_act_fn): GELUActivation()
                  (output_dense): Linear(in_features=5120, out_features=1280, bias=True)
                  (output_dropout): Dropout(p=0.1, inplace=False)
                )
                (final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
              )
            )
            (ecapa_encoder): ModuleDict(
              (32): IdentityEncoder()
              (36): IdentityEncoder()
              (40): IdentityEncoder()
              (44): IdentityEncoder()
            )
            (pooling): ModuleDict(
              (32): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (36): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (40): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (44): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
            )
            (projector): ModuleDict(
              (32): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (36): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (40): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (44): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
            )
            (lang2vec_head): ModuleDict(
              (32): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (36): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (40): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (44): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
            )
            (aamsoftmax_weight): ParameterDict()
            (lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
            (aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
              (ce): CrossEntropyLoss()
              (lang2vec_head): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (lang2vec_loss): MSELoss()
            )
          )
        )
      )
    )
    (featurizer): Featurizer()
  )
  (normalize): UtteranceMVN(norm_means=True, norm_vars=False)
  (encoder): EcapaTdnnEncoder(
    (conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
    (relu): ReLU()
    (bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (layer1): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer2): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer3): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
    (mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
  )
  (pooling): ChnAttnStatPooling(
    (attention): Sequential(
      (0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
      (1): ReLU()
      (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
    )
    (softmax): Softmax(dim=2)
  )
  (projector): RawNet3Projector(
    (bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (fc): Linear(in_features=3072, out_features=192, bias=True)
  )
  (loss): AAMSoftmaxSCTopKLang2Vec(
    (ce): CrossEntropyLoss()
    (lang2vec_head): Sequential(
      (0): Linear(in_features=192, out_features=299, bias=True)
    )
    (lang2vec_loss): MSELoss()
  )
)

Model summary:
    Class Name: ESPnetLIDUpstreamConditionModel
    Total Number of model parameters: 977.14 M
    Number of trainable parameters: 977.14 M (100.0%)
    Size: 3.91 GB
    Type: torch.float32
[gpue06] 2025-06-04 20:22:27,538 (abs_task:1424) INFO: Optimizer:
Adam (
Parameter Group 0
    amsgrad: False
    betas: [0.9, 0.98]
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    initial_lr: 1e-05
    lr: 6.0032e-06
    maximize: False
    weight_decay: 0
)
[gpue06] 2025-06-04 20:22:27,538 (abs_task:1425) INFO: Scheduler: TristageLR(warmup_steps=1250)(hold_steps=5000)(decay_steps=6250)(init_lr_scale=0.6)(final_lr_scale=0.1)(decay_factor=0.00036841361487904725)
[gpue06] 2025-06-04 20:22:27,544 (abs_task:1434) INFO: Saving the configuration in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/config.yaml
[gpue06] 2025-06-04 20:22:27,823 (preprocessor:2245) INFO: Using lang2vec geo
[gpue06] 2025-06-04 20:22:43,726 (abs_task:1899) WARNING: Reading dump/raw/train_all_no_filter_lang/category2utt
[gpue06] 2025-06-04 20:22:43,727 (abs_task:1946) WARNING: Reading dump/raw/train_all_no_filter_lang/dataset2utt
[gpue06] 2025-06-04 20:22:43,729 (abs_task:1962) WARNING: Reading dump/raw/train_all_no_filter_lang/utt2dataset
[gpue06] 2025-06-04 20:24:57,630 (abs_task:1997) INFO: [train] dataset:
ESPnetDataset(
  speech: {"path": "dump/raw/train_all_no_filter_lang/wav.scp", "type": "sound"}
  lid_labels: {"path": "dump/raw/train_all_no_filter_lang/utt2spk", "type": "text"}
  preprocess: espnet2.train.preprocessor.LIDPreprocessor(train=True, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False))
[gpue06] 2025-06-04 20:24:57,648 (abs_task:1998) INFO: [train] process_fn: espnet2.train.preprocessor.LIDPreprocessor(train=True, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False)
[gpue06] 2025-06-04 20:24:57,648 (abs_task:1999) INFO: [train] collate_fn: <class 'espnet2.train.collate_fn.CommonCollateFn'>(float_pad_value=0.0, int_pad_value=0.0)
[gpue06] 2025-06-04 20:24:57,649 (abs_task:2000) INFO: [train] Batch sampler: CategoryPowerSamplerBalancedDataset(N-batch=727472, batch_bins=1440000, language_upsampling_factor=0.5, dataset_upsampling_factor=0.3)
[gpue06] 2025-06-04 20:24:57,715 (abs_task:2001) INFO: [train] mini-batch sizes summary: N-batch=727472, mean=6.0, min=1, max=6
[gpue06] 2025-06-04 20:24:58,116 (preprocessor:2245) INFO: Using lang2vec geo
[gpue06] 2025-06-04 20:25:10,662 (abs_task:1899) WARNING: Reading dump/raw/dev_ml_superb2_lang/category2utt
[gpue06] 2025-06-04 20:25:10,664 (abs_task:1946) WARNING: Reading dump/raw/dev_ml_superb2_lang/dataset2utt
[gpue06] 2025-06-04 20:25:10,666 (abs_task:1962) WARNING: Reading dump/raw/dev_ml_superb2_lang/utt2dataset
[gpue06] 2025-06-04 20:25:11,695 (abs_task:1997) INFO: [valid] dataset:
ESPnetDataset(
  speech: {"path": "dump/raw/dev_ml_superb2_lang/wav.scp", "type": "sound"}
  lid_labels: {"path": "dump/raw/dev_ml_superb2_lang/utt2spk", "type": "text"}
  preprocess: espnet2.train.preprocessor.LIDPreprocessor(train=False, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False))
[gpue06] 2025-06-04 20:25:11,696 (abs_task:1998) INFO: [valid] process_fn: espnet2.train.preprocessor.LIDPreprocessor(train=False, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False)
[gpue06] 2025-06-04 20:25:11,696 (abs_task:1999) INFO: [valid] collate_fn: <class 'espnet2.train.collate_fn.CommonCollateFn'>(float_pad_value=0.0, int_pad_value=0.0)
[gpue06] 2025-06-04 20:25:11,696 (abs_task:2000) INFO: [valid] Batch sampler: CategoryPowerSamplerBalancedDataset(N-batch=4722, batch_bins=1440000, language_upsampling_factor=0.5, dataset_upsampling_factor=0.3)
[gpue06] 2025-06-04 20:25:11,696 (abs_task:2001) INFO: [valid] mini-batch sizes summary: N-batch=4722, mean=6.0, min=4, max=6
wandb: Currently logged in as: qingzhew (qingzhew-carnegie-mellon-university) to https://api.wandb.ai. Use `wandb login --relogin` to force relogin
wandb: Tracking run with wandb version 0.19.10
wandb: Run data is saved locally in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/wandb/run-20250604_202512-0zfdmaq1
wandb: Run `wandb offline` to turn off syncing.
wandb: Resuming run mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch
wandb: ⭐️ View project at https://wandb.ai/qingzhew-carnegie-mellon-university/lid
wandb: 🚀 View run at https://wandb.ai/qingzhew-carnegie-mellon-university/lid/runs/0zfdmaq1
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:218: FutureWarning: `torch.cuda.amp.GradScaler(args...)` is deprecated. Please use `torch.amp.GradScaler('cuda', args...)` instead.
  scaler = GradScaler()
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:159: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  states = torch.load(
[gpue06] 2025-06-04 20:25:22,045 (trainer:176) INFO: The training was resumed using exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/checkpoint.pth
[gpue06] 2025-06-04 20:25:22,123 (trainer:251) INFO: Frontend featurizer weights for each layer:
Parameter containing:
tensor([-0.0056, -0.0141, -0.0168, -0.0187, -0.0203, -0.0225, -0.0231, -0.0246,
        -0.0253, -0.0252, -0.0254, -0.0241, -0.0226, -0.0200, -0.0162, -0.0120,
        -0.0095, -0.0059, -0.0017, 0.0058, 0.0097, 0.0142, 0.0175, 0.0196,
        0.0211, 0.0224, 0.0228, 0.0230, 0.0226, 0.0224, 0.0215, 0.0210,
        0.0196, 0.0176, 0.0157, 0.0126, 0.0095, 0.0070, 0.0051, 0.0037,
        0.0020, -0.0003, -0.0030, -0.0056, -0.0076, -0.0090, -0.0096, -0.0102,
        -0.0102], device='cuda:0', requires_grad=True)
[gpue06] 2025-06-04 20:25:22,124 (trainer:267) INFO: Error: 'Linear' object is not subscriptable
[gpue06] 2025-06-04 20:25:22,124 (trainer:272) INFO: cos_mp: 1.0
[gpue06] 2025-06-04 20:25:22,124 (trainer:273) INFO: easy_margin: False
[gpue06] 2025-06-04 20:25:22,124 (trainer:281) WARNING: The training has already reached at max_epoch: 34
[gpue06] 2025-06-04 20:25:22,135 (trainer:541) INFO: The training was finished at 33 epochs
[gpue06] 2025-06-04 20:25:22,136 (average_nbest_models:69) INFO: Averaging 2best models: criterion="valid.accuracy": exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/valid.accuracy.ave_2best.pth
/work/nvme/bbjs/qwang20/espnet/espnet2/main_funcs/average_nbest_models.py:77: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  _loaded[e] = torch.load(
[gpue06] 2025-06-04 20:25:27,720 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.32.attention.2.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,721 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.36.attention.2.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,722 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.40.attention.2.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,723 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.44.attention.2.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,723 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.32.bn.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,723 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.36.bn.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,724 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.40.bn.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,724 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.44.bn.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,727 (average_nbest_models:96) INFO: Accumulating encoder.bn.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,727 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bn1.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,727 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.0.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,728 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.1.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,728 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.2.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,728 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,728 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.4.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,728 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.5.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,728 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.6.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,728 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bn3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,728 (average_nbest_models:96) INFO: Accumulating encoder.layer1.se.se.3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,729 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bn1.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,729 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.0.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,729 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.1.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,729 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.2.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,729 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,729 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.4.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,730 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.5.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,730 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.6.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,730 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bn3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,730 (average_nbest_models:96) INFO: Accumulating encoder.layer2.se.se.3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,730 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bn1.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,731 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.0.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,731 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.1.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,731 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.2.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,731 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,731 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.4.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,731 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.5.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,731 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.6.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,731 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bn3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,732 (average_nbest_models:96) INFO: Accumulating encoder.layer3.se.se.3.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,733 (average_nbest_models:96) INFO: Accumulating pooling.attention.2.num_batches_tracked instead of averaging
[gpue06] 2025-06-04 20:25:27,733 (average_nbest_models:96) INFO: Accumulating projector.bn.num_batches_tracked instead of averaging
wandb:
wandb:
wandb: Run summary:
wandb: epoch 50
wandb: iteration 100000
wandb: metrics/accuracy 0.95477
wandb: metrics/backward_time 0.96399
wandb: metrics/class_loss 1.09721
wandb: metrics/clip 0
wandb: metrics/forward_time 0.28474
wandb: metrics/geo_loss_all 0.10049
wandb: metrics/geo_loss_downstream 0.15867
wandb: metrics/grad_norm 59.37694
wandb: metrics/inter_geo_loss_layer32 0.01366
wandb: metrics/inter_geo_loss_layer36 0.01344
wandb: metrics/inter_geo_loss_layer40 0.01298
wandb: metrics/inter_geo_loss_layer44 0.01279
wandb: metrics/inter_geo_loss_mean 0.01322
wandb: metrics/iter_time 0.00022
wandb: metrics/loss 0.22447
wandb: metrics/loss_scale 268435456
wandb: metrics/optim0_lr0 0.0
wandb: metrics/optim_step_time 0.03651
wandb: train/train_accuracy_epoch 0.95477
wandb: train/train_backward_time_epoch 0.96399
wandb: train/train_class_loss_epoch 1.09721
wandb: train/train_clip_epoch 0
wandb: train/train_forward_time_epoch 0.28474
wandb: train/train_geo_loss_all_epoch 0.10049
wandb: train/train_geo_loss_downstream_epoch 0.15867
wandb: train/train_gpu_max_cached_mem_GB_epoch 130.68359
wandb: train/train_grad_norm_epoch 59.37694
wandb: train/train_inter_geo_loss_layer32_epoch 0.01366
wandb: train/train_inter_geo_loss_layer36_epoch 0.01344
wandb: train/train_inter_geo_loss_layer40_epoch 0.01298
wandb: train/train_inter_geo_loss_layer44_epoch 0.01279
wandb: train/train_inter_geo_loss_mean_epoch 0.01322
wandb: train/train_iter_time_epoch 0.00022
wandb: train/train_loss_epoch 0.22447
wandb: train/train_loss_scale_epoch 268435456
wandb: train/train_optim0_lr0_epoch 0.0
wandb: train/train_optim_step_time_epoch 0.03651
wandb: train/train_time 5.07792
wandb: train/train_train_time_epoch 5.07792
wandb: valid/valid_accuracy_epoch 0.89594
wandb: valid/valid_class_loss_epoch 2.57223
wandb: valid/valid_geo_loss_all_epoch 0.13273
wandb: valid/valid_geo_loss_downstream_epoch 0.20731
wandb: valid/valid_gpu_max_cached_mem_GB_epoch 130.68359
wandb: valid/valid_inter_geo_loss_layer32_epoch 0.01976
wandb: valid/valid_inter_geo_loss_layer36_epoch 0.02211
wandb: valid/valid_inter_geo_loss_layer40_epoch 0.02097
wandb: valid/valid_inter_geo_loss_layer44_epoch 0.02058
wandb: valid/valid_inter_geo_loss_mean_epoch 0.02086
wandb: valid/valid_loss_epoch 2.08433
wandb:
wandb: 🚀 View run mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch at: https://wandb.ai/qingzhew-carnegie-mellon-university/lid/runs/0zfdmaq1
wandb: ⭐️ View project at: https://wandb.ai/qingzhew-carnegie-mellon-university/lid
wandb: Synced 5 W&B file(s), 0 media file(s), 0 artifact file(s) and 0 other file(s)
wandb: Find logs at: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/wandb/run-20250604_202512-0zfdmaq1/logs
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.3.log
ADDED
@@ -0,0 +1,460 @@
| 1 |
+
# python3 -m espnet2.bin.lid_train --use_preprocessor true --resume true --ignore_init_mismatch false --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/wav.scp,speech,sound --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/utt2spk,lid_labels,text --train_shape_file exp_all_no_filter_raw/spk_stats_16k/train/speech_shape --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/wav.scp,speech,sound --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/utt2spk,lid_labels,text --spk2utt dump/raw/train_all_no_filter_lang/spk2utt --spk_num 157 --fold_length 120000 --valid_shape_file exp_all_no_filter_raw/spk_stats_16k/valid/speech_shape --config /work/nvme/bbjs/qwang20/espnet/egs2/lid_delta/lid1/conf/mms_1b_all_no_filter_balanced_dataset/mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3.yaml --use_wandb true --wandb_project lid --wandb_entity qingzhew-carnegie-mellon-university --ngpu 1 --multiprocessing_distributed True
|
| 2 |
+
# Started at Mon Jun 2 08:00:04 CDT 2025
|
| 3 |
+
#
|
| 4 |
+
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_train.py --use_preprocessor true --resume true --ignore_init_mismatch false --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/wav.scp,speech,sound --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/utt2spk,lid_labels,text --train_shape_file exp_all_no_filter_raw/spk_stats_16k/train/speech_shape --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/wav.scp,speech,sound --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/utt2spk,lid_labels,text --spk2utt dump/raw/train_all_no_filter_lang/spk2utt --spk_num 157 --fold_length 120000 --valid_shape_file exp_all_no_filter_raw/spk_stats_16k/valid/speech_shape --config /work/nvme/bbjs/qwang20/espnet/egs2/lid_delta/lid1/conf/mms_1b_all_no_filter_balanced_dataset/mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3.yaml --use_wandb true --wandb_project lid --wandb_entity qingzhew-carnegie-mellon-university --ngpu 1 --multiprocessing_distributed True
|
| 5 |
+
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
|
| 6 |
+
torchaudio.set_audio_backend("sox_io")
|
| 7 |
+
[gpue01] 2025-06-02 08:00:37,184 (abs_task:1420) INFO: pytorch.version=2.4.0+cu118, cuda.available=True, cudnn.version=90100, cudnn.benchmark=True, cudnn.deterministic=False
|
| 8 |
+
[gpue01] 2025-06-02 08:00:37,190 (abs_task:1421) INFO: Model structure:
|
| 9 |
+
ESPnetLIDUpstreamConditionModel(
|
| 10 |
+
(frontend): S3prlFrontendCondition(
|
| 11 |
+
(upstream): S3PRLUpstreamCondition(
|
| 12 |
+
(upstream): UpstreamExpertCondition(
|
| 13 |
+
(model): Wav2Vec2ModelCondition(
|
| 14 |
+
(feature_extractor): Wav2Vec2FeatureEncoder(
|
| 15 |
+
(conv_layers): ModuleList(
|
| 16 |
+
(0): Wav2Vec2LayerNormConvLayer(
|
| 17 |
+
(conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
|
| 18 |
+
(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
|
| 19 |
+
(activation): GELUActivation()
|
| 20 |
+
)
|
| 21 |
+
(1-4): 4 x Wav2Vec2LayerNormConvLayer(
|
| 22 |
+
(conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
|
| 23 |
+
(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
|
| 24 |
+
(activation): GELUActivation()
|
| 25 |
+
)
|
| 26 |
+
(5-6): 2 x Wav2Vec2LayerNormConvLayer(
|
| 27 |
+
(conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
|
| 28 |
+
(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
|
| 29 |
+
(activation): GELUActivation()
|
| 30 |
+
)
|
| 31 |
+
)
|
| 32 |
+
)
|
| 33 |
+
(feature_projection): Wav2Vec2FeatureProjection(
|
| 34 |
+
(layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
|
| 35 |
+
(projection): Linear(in_features=512, out_features=1280, bias=True)
|
| 36 |
+
(dropout): Dropout(p=0.1, inplace=False)
|
| 37 |
+
)
|
| 38 |
+
(encoder): Wav2Vec2EncoderCondition(
|
| 39 |
+
(pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
|
| 40 |
+
(conv): ParametrizedConv1d(
|
| 41 |
+
1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
|
| 42 |
+
(parametrizations): ModuleDict(
|
| 43 |
+
(weight): ParametrizationList(
|
| 44 |
+
(0): _WeightNorm()
|
| 45 |
+
)
|
| 46 |
+
)
|
| 47 |
+
)
|
| 48 |
+
(padding): Wav2Vec2SamePadLayer()
|
| 49 |
+
(activation): GELUActivation()
|
| 50 |
+
)
|
| 51 |
+
(layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
|
| 52 |
+
(dropout): Dropout(p=0.1, inplace=False)
|
| 53 |
+
(layers): ModuleList(
|
| 54 |
+
(0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
|
| 55 |
+
(attention): Wav2Vec2SdpaAttention(
|
| 56 |
+
(k_proj): Linear(in_features=1280, out_features=1280, bias=True)
|
| 57 |
+
(v_proj): Linear(in_features=1280, out_features=1280, bias=True)
|
| 58 |
+
(q_proj): Linear(in_features=1280, out_features=1280, bias=True)
|
| 59 |
+
(out_proj): Linear(in_features=1280, out_features=1280, bias=True)
|
| 60 |
+
)
|
| 61 |
+
(dropout): Dropout(p=0.1, inplace=False)
|
| 62 |
+
(layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
|
| 63 |
+
(feed_forward): Wav2Vec2FeedForward(
|
| 64 |
+
(intermediate_dropout): Dropout(p=0.0, inplace=False)
|
| 65 |
+
(intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
|
| 66 |
+
(intermediate_act_fn): GELUActivation()
|
| 67 |
+
(output_dense): Linear(in_features=5120, out_features=1280, bias=True)
|
| 68 |
+
(output_dropout): Dropout(p=0.1, inplace=False)
|
| 69 |
+
)
|
| 70 |
+
(final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
|
| 71 |
+
)
|
| 72 |
+
)
|
| 73 |
+
(ecapa_encoder): ModuleDict(
|
| 74 |
+
(32): IdentityEncoder()
|
| 75 |
+
(36): IdentityEncoder()
|
| 76 |
+
(40): IdentityEncoder()
|
| 77 |
+
(44): IdentityEncoder()
|
| 78 |
+
)
|
| 79 |
+
(pooling): ModuleDict(
|
| 80 |
+
(32): ChnAttnStatPooling(
|
| 81 |
+
(attention): Sequential(
|
| 82 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 83 |
+
(1): ReLU()
|
| 84 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 85 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 86 |
+
)
|
| 87 |
+
(softmax): Softmax(dim=2)
|
| 88 |
+
)
|
| 89 |
+
(36): ChnAttnStatPooling(
|
| 90 |
+
(attention): Sequential(
|
| 91 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 92 |
+
(1): ReLU()
|
| 93 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 94 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 95 |
+
)
|
| 96 |
+
(softmax): Softmax(dim=2)
|
| 97 |
+
)
|
| 98 |
+
(40): ChnAttnStatPooling(
|
| 99 |
+
(attention): Sequential(
|
| 100 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 101 |
+
(1): ReLU()
|
| 102 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 103 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 104 |
+
)
|
| 105 |
+
(softmax): Softmax(dim=2)
|
| 106 |
+
)
|
| 107 |
+
(44): ChnAttnStatPooling(
|
| 108 |
+
(attention): Sequential(
|
| 109 |
+
(0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
|
| 110 |
+
(1): ReLU()
|
| 111 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 112 |
+
(3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
|
| 113 |
+
)
|
| 114 |
+
(softmax): Softmax(dim=2)
|
| 115 |
+
)
|
| 116 |
+
)
|
| 117 |
+
(projector): ModuleDict(
|
| 118 |
+
(32): RawNet3Projector(
|
| 119 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 120 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 121 |
+
)
|
| 122 |
+
(36): RawNet3Projector(
|
| 123 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 124 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 125 |
+
)
|
| 126 |
+
(40): RawNet3Projector(
|
| 127 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 128 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 129 |
+
)
|
| 130 |
+
(44): RawNet3Projector(
|
| 131 |
+
(bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 132 |
+
(fc): Linear(in_features=2560, out_features=192, bias=True)
|
| 133 |
+
)
|
| 134 |
+
)
|
| 135 |
+
(lang2vec_head): ModuleDict(
|
| 136 |
+
(32): Sequential(
|
| 137 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 138 |
+
)
|
| 139 |
+
(36): Sequential(
|
| 140 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 141 |
+
)
|
| 142 |
+
(40): Sequential(
|
| 143 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 144 |
+
)
|
| 145 |
+
(44): Sequential(
|
| 146 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 147 |
+
)
|
| 148 |
+
)
|
| 149 |
+
(aamsoftmax_weight): ParameterDict()
|
| 150 |
+
(lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
|
| 151 |
+
(aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
|
| 152 |
+
(ce): CrossEntropyLoss()
|
| 153 |
+
(lang2vec_head): Sequential(
|
| 154 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 155 |
+
)
|
| 156 |
+
(lang2vec_loss): MSELoss()
|
| 157 |
+
)
|
| 158 |
+
)
|
| 159 |
+
)
|
| 160 |
+
)
|
| 161 |
+
)
|
| 162 |
+
(featurizer): Featurizer()
|
| 163 |
+
)
|
| 164 |
+
(normalize): UtteranceMVN(norm_means=True, norm_vars=False)
|
| 165 |
+
(encoder): EcapaTdnnEncoder(
|
| 166 |
+
(conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
|
| 167 |
+
(relu): ReLU()
|
| 168 |
+
(bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 169 |
+
(layer1): EcapaBlock(
|
| 170 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 171 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 172 |
+
(convs): ModuleList(
|
| 173 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
|
| 174 |
+
)
|
| 175 |
+
(bns): ModuleList(
|
| 176 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 177 |
+
)
|
| 178 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 179 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 180 |
+
(relu): ReLU()
|
| 181 |
+
(se): SEModule(
|
| 182 |
+
(se): Sequential(
|
| 183 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 184 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 185 |
+
(2): ReLU()
|
| 186 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 187 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 188 |
+
(5): Sigmoid()
|
| 189 |
+
)
|
| 190 |
+
)
|
| 191 |
+
)
|
| 192 |
+
(layer2): EcapaBlock(
|
| 193 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 194 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 195 |
+
(convs): ModuleList(
|
| 196 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
|
| 197 |
+
)
|
| 198 |
+
(bns): ModuleList(
|
| 199 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 200 |
+
)
|
| 201 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 202 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 203 |
+
(relu): ReLU()
|
| 204 |
+
(se): SEModule(
|
| 205 |
+
(se): Sequential(
|
| 206 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 207 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 208 |
+
(2): ReLU()
|
| 209 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 210 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 211 |
+
(5): Sigmoid()
|
| 212 |
+
)
|
| 213 |
+
)
|
| 214 |
+
)
|
| 215 |
+
(layer3): EcapaBlock(
|
| 216 |
+
(conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 217 |
+
(bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 218 |
+
(convs): ModuleList(
|
| 219 |
+
(0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
|
| 220 |
+
)
|
| 221 |
+
(bns): ModuleList(
|
| 222 |
+
(0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 223 |
+
)
|
| 224 |
+
(conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
|
| 225 |
+
(bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 226 |
+
(relu): ReLU()
|
| 227 |
+
(se): SEModule(
|
| 228 |
+
(se): Sequential(
|
| 229 |
+
(0): AdaptiveAvgPool1d(output_size=1)
|
| 230 |
+
(1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
|
| 231 |
+
(2): ReLU()
|
| 232 |
+
(3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 233 |
+
(4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
|
| 234 |
+
(5): Sigmoid()
|
| 235 |
+
)
|
| 236 |
+
)
|
| 237 |
+
)
|
| 238 |
+
(layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
|
| 239 |
+
(mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
|
| 240 |
+
)
|
| 241 |
+
(pooling): ChnAttnStatPooling(
|
| 242 |
+
(attention): Sequential(
|
| 243 |
+
(0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
|
| 244 |
+
(1): ReLU()
|
| 245 |
+
(2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 246 |
+
(3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
|
| 247 |
+
)
|
| 248 |
+
(softmax): Softmax(dim=2)
|
| 249 |
+
)
|
| 250 |
+
(projector): RawNet3Projector(
|
| 251 |
+
(bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
|
| 252 |
+
(fc): Linear(in_features=3072, out_features=192, bias=True)
|
| 253 |
+
)
|
| 254 |
+
(loss): AAMSoftmaxSCTopKLang2Vec(
|
| 255 |
+
(ce): CrossEntropyLoss()
|
| 256 |
+
(lang2vec_head): Sequential(
|
| 257 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 258 |
+
)
|
| 259 |
+
(lang2vec_loss): MSELoss()
|
| 260 |
+
)
|
| 261 |
+
)
|
| 262 |
+
|
| 263 |
+
Model summary:
|
| 264 |
+
Class Name: ESPnetLIDUpstreamConditionModel
|
| 265 |
+
Total Number of model parameters: 977.14 M
|
| 266 |
+
Number of trainable parameters: 977.14 M (100.0%)
|
| 267 |
+
Size: 3.91 GB
|
| 268 |
+
Type: torch.float32
|
| 269 |
+
[gpue01] 2025-06-02 08:00:37,190 (abs_task:1424) INFO: Optimizer:
|
| 270 |
+
Adam (
|
| 271 |
+
Parameter Group 0
|
| 272 |
+
amsgrad: False
|
| 273 |
+
betas: [0.9, 0.98]
|
| 274 |
+
capturable: False
|
| 275 |
+
differentiable: False
|
| 276 |
+
eps: 1e-08
|
| 277 |
+
foreach: None
|
| 278 |
+
fused: None
|
| 279 |
+
initial_lr: 1e-05
|
| 280 |
+
lr: 6.0032e-06
|
| 281 |
+
maximize: False
|
| 282 |
+
weight_decay: 0
|
| 283 |
+
)
|
| 284 |
+
[gpue01] 2025-06-02 08:00:37,190 (abs_task:1425) INFO: Scheduler: TristageLR(warmup_steps=1250)(hold_steps=5000)(decay_steps=6250)(init_lr_scale=0.6)(final_lr_scale=0.1)(decay_factor=0.00036841361487904725)
|
| 285 |
+
[gpue01] 2025-06-02 08:00:37,195 (abs_task:1434) INFO: Saving the configuration in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/config.yaml
|
| 286 |
+
[gpue01] 2025-06-02 08:00:37,476 (preprocessor:2245) INFO: Using lang2vec geo
|
| 287 |
+
[gpue01] 2025-06-02 08:00:53,379 (abs_task:1899) WARNING: Reading dump/raw/train_all_no_filter_lang/category2utt
|
| 288 |
+
[gpue01] 2025-06-02 08:00:53,380 (abs_task:1946) WARNING: Reading dump/raw/train_all_no_filter_lang/dataset2utt
|
| 289 |
+
[gpue01] 2025-06-02 08:00:53,382 (abs_task:1962) WARNING: Reading dump/raw/train_all_no_filter_lang/utt2dataset
|
| 290 |
+
[gpue01] 2025-06-02 08:03:08,576 (abs_task:1997) INFO: [train] dataset:
|
| 291 |
+
ESPnetDataset(
|
| 292 |
+
speech: {"path": "dump/raw/train_all_no_filter_lang/wav.scp", "type": "sound"}
|
| 293 |
+
lid_labels: {"path": "dump/raw/train_all_no_filter_lang/utt2spk", "type": "text"}
|
| 294 |
+
preprocess: espnet2.train.preprocessor.LIDPreprocessor(train=True, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False))
|
| 295 |
+
[gpue01] 2025-06-02 08:03:08,577 (abs_task:1998) INFO: [train] process_fn: espnet2.train.preprocessor.LIDPreprocessor(train=True, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False)
|
| 296 |
+
[gpue01] 2025-06-02 08:03:08,577 (abs_task:1999) INFO: [train] collate_fn: <class 'espnet2.train.collate_fn.CommonCollateFn'>(float_pad_value=0.0, int_pad_value=0.0)
|
| 297 |
+
[gpue01] 2025-06-02 08:03:08,577 (abs_task:2000) INFO: [train] Batch sampler: CategoryPowerSamplerBalancedDataset(N-batch=727460, batch_bins=1440000, language_upsampling_factor=0.5, dataset_upsampling_factor=0.3)
|
| 298 |
+
[gpue01] 2025-06-02 08:03:08,642 (abs_task:2001) INFO: [train] mini-batch sizes summary: N-batch=727460, mean=6.0, min=1, max=6
|
| 299 |
+
[gpue01] 2025-06-02 08:03:09,071 (preprocessor:2245) INFO: Using lang2vec geo
|
| 300 |
+
[gpue01] 2025-06-02 08:03:21,631 (abs_task:1899) WARNING: Reading dump/raw/dev_ml_superb2_lang/category2utt
|
| 301 |
+
[gpue01] 2025-06-02 08:03:21,632 (abs_task:1946) WARNING: Reading dump/raw/dev_ml_superb2_lang/dataset2utt
|
| 302 |
+
[gpue01] 2025-06-02 08:03:21,633 (abs_task:1962) WARNING: Reading dump/raw/dev_ml_superb2_lang/utt2dataset
|
| 303 |
+
[gpue01] 2025-06-02 08:03:22,657 (abs_task:1997) INFO: [valid] dataset:
|
| 304 |
+
ESPnetDataset(
|
| 305 |
+
speech: {"path": "dump/raw/dev_ml_superb2_lang/wav.scp", "type": "sound"}
|
| 306 |
+
lid_labels: {"path": "dump/raw/dev_ml_superb2_lang/utt2spk", "type": "text"}
|
| 307 |
+
preprocess: espnet2.train.preprocessor.LIDPreprocessor(train=False, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False))
|
| 308 |
+
[gpue01] 2025-06-02 08:03:22,657 (abs_task:1998) INFO: [valid] process_fn: espnet2.train.preprocessor.LIDPreprocessor(train=False, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False)
|
| 309 |
+
[gpue01] 2025-06-02 08:03:22,657 (abs_task:1999) INFO: [valid] collate_fn: <class 'espnet2.train.collate_fn.CommonCollateFn'>(float_pad_value=0.0, int_pad_value=0.0)
|
| 310 |
+
[gpue01] 2025-06-02 08:03:22,657 (abs_task:2000) INFO: [valid] Batch sampler: CategoryPowerSamplerBalancedDataset(N-batch=4722, batch_bins=1440000, language_upsampling_factor=0.5, dataset_upsampling_factor=0.3)
|
| 311 |
+
[gpue01] 2025-06-02 08:03:22,658 (abs_task:2001) INFO: [valid] mini-batch sizes summary: N-batch=4722, mean=6.0, min=4, max=6
|
| 312 |
+
wandb: Currently logged in as: qingzhew (qingzhew-carnegie-mellon-university) to https://api.wandb.ai. Use `wandb login --relogin` to force relogin
|
| 313 |
+
wandb: Tracking run with wandb version 0.19.10
|
| 314 |
+
wandb: Run data is saved locally in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/wandb/run-20250602_080323-0zfdmaq1
|
| 315 |
+
wandb: Run `wandb offline` to turn off syncing.
|
| 316 |
+
wandb: Resuming run mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3
|
| 317 |
+
wandb: ⭐️ View project at https://wandb.ai/qingzhew-carnegie-mellon-university/lid
|
| 318 |
+
wandb: 🚀 View run at https://wandb.ai/qingzhew-carnegie-mellon-university/lid/runs/0zfdmaq1
|
| 319 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:218: FutureWarning: `torch.cuda.amp.GradScaler(args...)` is deprecated. Please use `torch.amp.GradScaler('cuda', args...)` instead.
|
| 320 |
+
scaler = GradScaler()
|
| 321 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:159: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
|
| 322 |
+
states = torch.load(
|
| 323 |
+
[gpue01] 2025-06-02 08:03:32,100 (trainer:176) INFO: The training was resumed using exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/checkpoint.pth
|
| 324 |
+
[gpue01] 2025-06-02 08:03:32,199 (trainer:251) INFO: Frontend featurizer weights for each layer:
|
| 325 |
+
Parameter containing:
|
| 326 |
+
tensor([-0.0056, -0.0140, -0.0167, -0.0186, -0.0202, -0.0224, -0.0230, -0.0245,
|
| 327 |
+
-0.0252, -0.0250, -0.0253, -0.0240, -0.0225, -0.0199, -0.0161, -0.0120,
|
| 328 |
+
-0.0094, -0.0058, -0.0017, 0.0059, 0.0098, 0.0142, 0.0175, 0.0197,
|
| 329 |
+
0.0211, 0.0224, 0.0228, 0.0230, 0.0225, 0.0223, 0.0215, 0.0209,
|
| 330 |
+
0.0195, 0.0176, 0.0156, 0.0126, 0.0094, 0.0070, 0.0050, 0.0036,
|
| 331 |
+
0.0019, -0.0004, -0.0031, -0.0057, -0.0077, -0.0090, -0.0097, -0.0103,
|
| 332 |
+
-0.0103], device='cuda:0', requires_grad=True)
|
| 333 |
+
[gpue01] 2025-06-02 08:03:32,200 (trainer:267) INFO: Error: 'Linear' object is not subscriptable
|
| 334 |
+
[gpue01] 2025-06-02 08:03:32,200 (trainer:272) INFO: cos_mp: 1.0
|
| 335 |
+
[gpue01] 2025-06-02 08:03:32,200 (trainer:273) INFO: easy_margin: False
|
| 336 |
+
[gpue01] 2025-06-02 08:03:32,211 (trainer:347) INFO: 29/50epoch started
|
| 337 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:645: FutureWarning: `torch.cuda.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast('cuda', args...)` instead.
|
| 338 |
+
with autocast(
|
| 339 |
+
[gpue01] 2025-06-02 08:08:32,675 (trainer:816) INFO: 29epoch:train:1-100batch: iter_time=0.003, forward_time=0.394, class_loss=1.038, geo_loss_downstream=0.166, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.213, accuracy=0.960, backward_time=1.214, grad_norm=40.433, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=5.725e-07, train_time=6.534
|
| 340 |
+
[gpue01] 2025-06-02 08:10:53,126 (trainer:816) INFO: 29epoch:train:101-200batch: iter_time=8.669e-05, forward_time=0.352, class_loss=1.234, geo_loss_downstream=0.167, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.106, loss=0.252, accuracy=0.947, backward_time=1.033, grad_norm=67.550, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=5.672e-07, train_time=5.618
|
| 341 |
+
[gpue01] 2025-06-02 08:13:06,132 (trainer:816) INFO: 29epoch:train:201-300batch: iter_time=8.747e-05, forward_time=0.336, class_loss=1.151, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.012, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.235, accuracy=0.948, backward_time=0.973, grad_norm=55.134, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.620e-07, train_time=5.320
|
| 342 |
+
[gpue01] 2025-06-02 08:15:18,146 (trainer:816) INFO: 29epoch:train:301-400batch: iter_time=9.459e-05, forward_time=0.338, class_loss=1.302, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.266, accuracy=0.950, backward_time=0.962, grad_norm=56.978, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=5.569e-07, train_time=5.280
|
| 343 |
+
[gpue01] 2025-06-02 08:17:18,134 (trainer:816) INFO: 29epoch:train:401-500batch: iter_time=9.884e-05, forward_time=0.300, class_loss=1.251, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.255, accuracy=0.945, backward_time=0.879, grad_norm=50.561, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.518e-07, train_time=4.799
|
| 344 |
+
[gpue01] 2025-06-02 08:19:20,112 (trainer:816) INFO: 29epoch:train:501-600batch: iter_time=9.952e-05, forward_time=0.288, class_loss=0.865, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.178, accuracy=0.968, backward_time=0.911, grad_norm=51.561, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=5.467e-07, train_time=4.879
|
| 345 |
+
[gpue01] 2025-06-02 08:21:24,824 (trainer:816) INFO: 29epoch:train:601-700batch: iter_time=8.929e-05, forward_time=0.286, class_loss=0.922, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.190, accuracy=0.962, backward_time=0.941, grad_norm=48.480, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.417e-07, train_time=4.988
|
| 346 |
+
[gpue01] 2025-06-02 08:23:15,407 (trainer:816) INFO: 29epoch:train:701-800batch: iter_time=1.002e-04, forward_time=0.255, class_loss=1.501, geo_loss_downstream=0.166, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.305, accuracy=0.943, backward_time=0.830, grad_norm=74.153, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.367e-07, train_time=4.423
|
| 347 |
+
[gpue01] 2025-06-02 08:25:12,844 (trainer:816) INFO: 29epoch:train:801-900batch: iter_time=9.754e-05, forward_time=0.252, class_loss=1.060, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.217, accuracy=0.957, backward_time=0.902, grad_norm=54.887, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=5.318e-07, train_time=4.697
|
| 348 |
+
[gpue01] 2025-06-02 08:27:12,890 (trainer:816) INFO: 29epoch:train:901-1000batch: iter_time=9.747e-05, forward_time=0.244, class_loss=1.193, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.244, accuracy=0.952, backward_time=0.937, grad_norm=42.200, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.269e-07, train_time=4.801
|
| 349 |
+
[gpue01] 2025-06-02 08:29:03,462 (trainer:816) INFO: 29epoch:train:1001-1100batch: iter_time=9.858e-05, forward_time=0.249, class_loss=1.149, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.235, accuracy=0.953, backward_time=0.835, grad_norm=58.891, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.221e-07, train_time=4.422
|
| 350 |
+
[gpue01] 2025-06-02 08:31:01,649 (trainer:816) INFO: 29epoch:train:1101-1200batch: iter_time=1.007e-04, forward_time=0.238, class_loss=1.181, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.241, accuracy=0.949, backward_time=0.922, grad_norm=100.141, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.173e-07, train_time=4.727
|
| 351 |
+
[gpue01] 2025-06-02 08:32:49,871 (trainer:816) INFO: 29epoch:train:1201-1300batch: iter_time=9.155e-05, forward_time=0.239, class_loss=1.129, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.231, accuracy=0.948, backward_time=0.822, grad_norm=46.727, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.126e-07, train_time=4.328
|
| 352 |
+
[gpue01] 2025-06-02 08:34:45,411 (trainer:816) INFO: 29epoch:train:1301-1400batch: iter_time=9.736e-05, forward_time=0.264, class_loss=1.095, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.224, accuracy=0.952, backward_time=0.870, grad_norm=40.216, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=5.079e-07, train_time=4.621
|
| 353 |
+
[gpue01] 2025-06-02 08:36:35,310 (trainer:816) INFO: 29epoch:train:1401-1500batch: iter_time=9.289e-05, forward_time=0.240, class_loss=1.256, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.015, inter_geo_loss_layer36=0.015, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.256, accuracy=0.948, backward_time=0.837, grad_norm=65.161, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.032e-07, train_time=4.395
|
| 354 |
+
[gpue01] 2025-06-02 08:38:33,818 (trainer:816) INFO: 29epoch:train:1501-1600batch: iter_time=9.491e-05, forward_time=0.265, class_loss=1.038, geo_loss_downstream=0.166, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.213, accuracy=0.965, backward_time=0.900, grad_norm=51.130, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.986e-07, train_time=4.740
|
| 355 |
+
[gpue01] 2025-06-02 08:40:35,271 (trainer:816) INFO: 29epoch:train:1601-1700batch: iter_time=9.165e-05, forward_time=0.272, class_loss=0.837, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.173, accuracy=0.965, backward_time=0.921, grad_norm=52.871, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.940e-07, train_time=4.857
|
| 356 |
+
[gpue01] 2025-06-02 08:42:35,412 (trainer:816) INFO: 29epoch:train:1701-1800batch: iter_time=8.960e-05, forward_time=0.266, class_loss=1.142, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.234, accuracy=0.953, backward_time=0.915, grad_norm=50.981, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=4.895e-07, train_time=4.805
|
| 357 |
+
[gpue01] 2025-06-02 08:44:31,106 (trainer:816) INFO: 29epoch:train:1801-1900batch: iter_time=1.021e-04, forward_time=0.230, class_loss=1.361, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.277, accuracy=0.945, backward_time=0.906, grad_norm=76.043, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.038, optim0_lr0=4.850e-07, train_time=4.627
[gpue01] 2025-06-02 08:46:36,834 (trainer:816) INFO: 29epoch:train:1901-2000batch: iter_time=8.900e-05, forward_time=0.264, class_loss=1.395, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.284, accuracy=0.945, backward_time=0.973, grad_norm=59.458, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.806e-07, train_time=5.028
[gpue01] 2025-06-02 09:09:51,218 (trainer:401) INFO: 29epoch results: [train] iter_time=2.479e-04, forward_time=0.279, class_loss=1.155, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.236, accuracy=0.953, backward_time=0.924, grad_norm=57.178, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=5.253e-07, train_time=4.894, time=43 minutes and 4.77 seconds, total_count=58000, gpu_max_cached_mem_GB=79.320, [valid] class_loss=2.591, geo_loss_downstream=0.207, inter_geo_loss_layer32=0.020, inter_geo_loss_layer36=0.023, inter_geo_loss_layer40=0.022, inter_geo_loss_layer44=0.022, inter_geo_loss_mean=0.022, geo_loss_all=0.133, loss=2.099, accuracy=0.894, time=23 minutes and 14.24 seconds, total_count=136938, gpu_max_cached_mem_GB=79.320
[gpue01] 2025-06-02 09:10:04,925 (trainer:467) INFO: There are no improvements in this epoch
[gpue01] 2025-06-02 09:10:04,945 (trainer:523) INFO: The model files were removed: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/27epoch.pth, exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/28epoch.pth
[gpue01] 2025-06-02 09:10:04,945 (trainer:335) INFO: 30/50epoch started. Estimated time to finish: 23 hours, 17 minutes and 27.41 seconds
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:645: FutureWarning: `torch.cuda.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast('cuda', args...)` instead.
with autocast(
[gpue01] 2025-06-02 09:14:40,846 (trainer:816) INFO: 30epoch:train:1-100batch: iter_time=0.008, forward_time=0.415, class_loss=1.226, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.250, accuracy=0.952, backward_time=0.915, grad_norm=53.097, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=4.762e-07, train_time=5.456
[gpue01] 2025-06-02 09:16:39,711 (trainer:816) INFO: 30epoch:train:101-200batch: iter_time=9.637e-05, forward_time=0.353, class_loss=1.158, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.237, accuracy=0.953, backward_time=0.815, grad_norm=71.507, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.718e-07, train_time=4.754
[gpue01] 2025-06-02 09:19:02,535 (trainer:816) INFO: 30epoch:train:201-300batch: iter_time=9.507e-05, forward_time=0.368, class_loss=1.156, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.236, accuracy=0.952, backward_time=1.041, grad_norm=46.785, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.675e-07, train_time=5.712
[gpue01] 2025-06-02 09:21:05,464 (trainer:816) INFO: 30epoch:train:301-400batch: iter_time=1.003e-04, forward_time=0.318, class_loss=1.121, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.229, accuracy=0.951, backward_time=0.891, grad_norm=48.383, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.632e-07, train_time=4.916
[gpue01] 2025-06-02 09:22:55,038 (trainer:816) INFO: 30epoch:train:401-500batch: iter_time=9.499e-05, forward_time=0.284, class_loss=1.061, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.217, accuracy=0.962, backward_time=0.792, grad_norm=77.788, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.589e-07, train_time=4.382
[gpue01] 2025-06-02 09:24:57,833 (trainer:816) INFO: 30epoch:train:501-600batch: iter_time=1.014e-04, forward_time=0.290, class_loss=1.085, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.222, accuracy=0.963, backward_time=0.918, grad_norm=59.019, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.547e-07, train_time=4.911
[gpue01] 2025-06-02 09:26:53,859 (trainer:816) INFO: 30epoch:train:601-700batch: iter_time=9.895e-05, forward_time=0.280, class_loss=1.015, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.208, accuracy=0.965, backward_time=0.858, grad_norm=55.588, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=4.506e-07, train_time=4.640
[gpue01] 2025-06-02 09:29:00,707 (trainer:816) INFO: 30epoch:train:701-800batch: iter_time=1.006e-04, forward_time=0.274, class_loss=1.328, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.015, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.271, accuracy=0.943, backward_time=0.974, grad_norm=65.465, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.464e-07, train_time=5.073
[gpue01] 2025-06-02 09:31:13,341 (trainer:816) INFO: 30epoch:train:801-900batch: iter_time=1.231e-04, forward_time=0.272, class_loss=1.336, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.272, accuracy=0.947, backward_time=1.035, grad_norm=50.503, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.423e-07, train_time=5.305
[gpue01] 2025-06-02 09:33:12,121 (trainer:816) INFO: 30epoch:train:901-1000batch: iter_time=1.060e-04, forward_time=0.241, class_loss=1.184, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.242, accuracy=0.955, backward_time=0.926, grad_norm=48.061, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.383e-07, train_time=4.750
[gpue01] 2025-06-02 09:35:05,485 (trainer:816) INFO: 30epoch:train:1001-1100batch: iter_time=1.136e-04, forward_time=0.232, class_loss=1.135, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.232, accuracy=0.948, backward_time=0.881, grad_norm=41.919, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=4.343e-07, train_time=4.534
[gpue01] 2025-06-02 09:37:02,151 (trainer:816) INFO: 30epoch:train:1101-1200batch: iter_time=1.137e-04, forward_time=0.245, class_loss=1.438, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.293, accuracy=0.942, backward_time=0.902, grad_norm=75.206, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=4.303e-07, train_time=4.666
[gpue01] 2025-06-02 09:38:59,107 (trainer:816) INFO: 30epoch:train:1201-1300batch: iter_time=1.021e-04, forward_time=0.252, class_loss=1.081, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.221, accuracy=0.962, backward_time=0.897, grad_norm=52.787, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.263e-07, train_time=4.677
[gpue01] 2025-06-02 09:40:58,086 (trainer:816) INFO: 30epoch:train:1301-1400batch: iter_time=1.154e-04, forward_time=0.263, class_loss=1.120, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.229, accuracy=0.955, backward_time=0.906, grad_norm=47.309, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.224e-07, train_time=4.758
[gpue01] 2025-06-02 09:42:58,353 (trainer:816) INFO: 30epoch:train:1401-1500batch: iter_time=1.042e-04, forward_time=0.265, class_loss=1.275, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.015, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.260, accuracy=0.942, backward_time=0.918, grad_norm=74.393, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=4.186e-07, train_time=4.810
[gpue01] 2025-06-02 09:44:43,549 (trainer:816) INFO: 30epoch:train:1501-1600batch: iter_time=1.006e-04, forward_time=0.231, class_loss=1.188, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.243, accuracy=0.950, backward_time=0.799, grad_norm=68.925, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=4.147e-07, train_time=4.207
[gpue01] 2025-06-02 09:46:43,647 (trainer:816) INFO: 30epoch:train:1601-1700batch: iter_time=9.242e-05, forward_time=0.278, class_loss=1.219, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.249, accuracy=0.947, backward_time=0.903, grad_norm=38.583, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.109e-07, train_time=4.803
[gpue01] 2025-06-02 09:48:51,075 (trainer:816) INFO: 30epoch:train:1701-1800batch: iter_time=1.070e-04, forward_time=0.275, class_loss=1.158, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.102, loss=0.237, accuracy=0.957, backward_time=0.979, grad_norm=72.308, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.072e-07, train_time=5.096
[gpue01] 2025-06-02 09:50:42,998 (trainer:816) INFO: 30epoch:train:1801-1900batch: iter_time=9.723e-05, forward_time=0.249, class_loss=1.209, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.015, inter_geo_loss_layer36=0.015, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.105, loss=0.247, accuracy=0.948, backward_time=0.849, grad_norm=54.531, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.034e-07, train_time=4.476
[gpue01] 2025-06-02 09:52:36,303 (trainer:816) INFO: 30epoch:train:1901-2000batch: iter_time=9.175e-05, forward_time=0.226, class_loss=1.016, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.208, accuracy=0.953, backward_time=0.887, grad_norm=55.463, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.997e-07, train_time=4.531
[gpue01] 2025-06-02 10:15:46,744 (trainer:401) INFO: 30epoch results: [train] iter_time=4.751e-04, forward_time=0.281, class_loss=1.175, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.240, accuracy=0.952, backward_time=0.904, grad_norm=57.881, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=4.369e-07, train_time=4.823, time=42 minutes and 31.56 seconds, total_count=60000, gpu_max_cached_mem_GB=81.803, [valid] class_loss=2.597, geo_loss_downstream=0.220, inter_geo_loss_layer32=0.020, inter_geo_loss_layer36=0.023, inter_geo_loss_layer40=0.023, inter_geo_loss_layer44=0.022, inter_geo_loss_mean=0.022, geo_loss_all=0.141, loss=2.106, accuracy=0.895, time=23 minutes and 10.24 seconds, total_count=141660, gpu_max_cached_mem_GB=81.803
[gpue01] 2025-06-02 10:15:59,998 (trainer:467) INFO: There are no improvements in this epoch
[gpue01] 2025-06-02 10:16:00,022 (trainer:523) INFO: The model files were removed: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/29epoch.pth
[gpue01] 2025-06-02 10:16:00,023 (trainer:335) INFO: 31/50epoch started. Estimated time to finish: 22 hours, 4 minutes and 38.11 seconds
[gpue01] 2025-06-02 10:20:30,892 (trainer:816) INFO: 31epoch:train:1-100batch: iter_time=0.002, forward_time=0.401, class_loss=1.160, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.015, inter_geo_loss_layer36=0.015, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.237, accuracy=0.952, backward_time=0.896, grad_norm=73.411, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=3.961e-07, train_time=5.300
[gpue01] 2025-06-02 10:22:38,629 (trainer:816) INFO: 31epoch:train:101-200batch: iter_time=9.710e-05, forward_time=0.364, class_loss=1.277, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.261, accuracy=0.948, backward_time=0.893, grad_norm=65.669, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=3.924e-07, train_time=5.109
[gpue01] 2025-06-02 10:24:35,032 (trainer:816) INFO: 31epoch:train:201-300batch: iter_time=9.069e-05, forward_time=0.330, class_loss=1.280, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.261, accuracy=0.948, backward_time=0.813, grad_norm=67.795, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.888e-07, train_time=4.655
[gpue01] 2025-06-02 10:26:50,531 (trainer:816) INFO: 31epoch:train:301-400batch: iter_time=9.965e-05, forward_time=0.337, class_loss=1.468, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.299, accuracy=0.942, backward_time=0.999, grad_norm=90.267, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=3.853e-07, train_time=5.419
[gpue01] 2025-06-02 10:28:43,674 (trainer:816) INFO: 31epoch:train:401-500batch: iter_time=9.749e-05, forward_time=0.288, class_loss=1.018, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.102, loss=0.209, accuracy=0.960, backward_time=0.823, grad_norm=47.089, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=3.817e-07, train_time=4.525
[gpue01] 2025-06-02 10:30:42,003 (trainer:816) INFO: 31epoch:train:501-600batch: iter_time=1.059e-04, forward_time=0.283, class_loss=1.170, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.239, accuracy=0.953, backward_time=0.879, grad_norm=56.840, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.782e-07, train_time=4.732
[gpue01] 2025-06-02 10:32:49,253 (trainer:816) INFO: 31epoch:train:601-700batch: iter_time=9.643e-05, forward_time=0.288, class_loss=1.159, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.237, accuracy=0.955, backward_time=0.965, grad_norm=71.334, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.748e-07, train_time=5.089
[gpue01] 2025-06-02 10:34:41,795 (trainer:816) INFO: 31epoch:train:701-800batch: iter_time=9.690e-05, forward_time=0.263, class_loss=1.211, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.247, accuracy=0.952, backward_time=0.841, grad_norm=58.619, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=3.713e-07, train_time=4.501
[gpue01] 2025-06-02 10:36:43,614 (trainer:816) INFO: 31epoch:train:801-900batch: iter_time=9.865e-05, forward_time=0.262, class_loss=1.068, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.219, accuracy=0.958, backward_time=0.938, grad_norm=79.938, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.679e-07, train_time=4.872
[gpue01] 2025-06-02 10:38:47,447 (trainer:816) INFO: 31epoch:train:901-1000batch: iter_time=1.059e-04, forward_time=0.263, class_loss=1.064, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.104, loss=0.218, accuracy=0.955, backward_time=0.955, grad_norm=56.857, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.646e-07, train_time=4.953
[gpue01] 2025-06-02 10:40:54,941 (trainer:816) INFO: 31epoch:train:1001-1100batch: iter_time=1.020e-04, forward_time=0.273, class_loss=1.107, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.227, accuracy=0.957, backward_time=0.981, grad_norm=74.898, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.612e-07, train_time=5.099
[gpue01] 2025-06-02 10:42:58,685 (trainer:816) INFO: 31epoch:train:1101-1200batch: iter_time=9.935e-05, forward_time=0.250, class_loss=1.028, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.211, accuracy=0.960, backward_time=0.967, grad_norm=52.780, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.579e-07, train_time=4.949
[gpue01] 2025-06-02 10:45:01,783 (trainer:816) INFO: 31epoch:train:1201-1300batch: iter_time=9.316e-05, forward_time=0.254, class_loss=1.400, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.285, accuracy=0.943, backward_time=0.958, grad_norm=58.518, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.546e-07, train_time=4.923
[gpue01] 2025-06-02 10:47:06,260 (trainer:816) INFO: 31epoch:train:1301-1400batch: iter_time=1.076e-04, forward_time=0.263, class_loss=0.926, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.190, accuracy=0.961, backward_time=0.962, grad_norm=50.230, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.514e-07, train_time=4.978
[gpue01] 2025-06-02 10:48:58,412 (trainer:816) INFO: 31epoch:train:1401-1500batch: iter_time=9.517e-05, forward_time=0.235, class_loss=1.449, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.295, accuracy=0.940, backward_time=0.867, grad_norm=53.965, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.481e-07, train_time=4.485
[gpue01] 2025-06-02 10:50:59,774 (trainer:816) INFO: 31epoch:train:1501-1600batch: iter_time=1.044e-04, forward_time=0.242, class_loss=0.961, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.197, accuracy=0.960, backward_time=0.951, grad_norm=72.314, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=3.450e-07, train_time=4.854
[gpue01] 2025-06-02 10:53:04,143 (trainer:816) INFO: 31epoch:train:1601-1700batch: iter_time=9.928e-05, forward_time=0.267, class_loss=1.200, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.245, accuracy=0.953, backward_time=0.957, grad_norm=59.742, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.418e-07, train_time=4.974
[gpue01] 2025-06-02 10:55:04,940 (trainer:816) INFO: 31epoch:train:1701-1800batch: iter_time=9.423e-05, forward_time=0.243, class_loss=0.887, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.183, accuracy=0.965, backward_time=0.944, grad_norm=30.131, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=3.387e-07, train_time=4.831
[gpue01] 2025-06-02 10:57:19,283 (trainer:816) INFO: 31epoch:train:1801-1900batch: iter_time=1.037e-04, forward_time=0.271, class_loss=1.390, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.102, loss=0.283, accuracy=0.942, backward_time=1.053, grad_norm=51.783, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.356e-07, train_time=5.373
[gpue01] 2025-06-02 10:59:05,692 (trainer:816) INFO: 31epoch:train:1901-2000batch: iter_time=9.012e-05, forward_time=0.236, class_loss=1.287, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.263, accuracy=0.950, backward_time=0.808, grad_norm=56.740, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=3.325e-07, train_time=4.256
[gpue01] 2025-06-02 11:22:28,488 (trainer:401) INFO: 31epoch results: [train] iter_time=1.963e-04, forward_time=0.281, class_loss=1.176, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.240, accuracy=0.953, backward_time=0.922, grad_norm=61.446, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.634e-07, train_time=4.894, time=43 minutes and 5.9 seconds, total_count=62000, gpu_max_cached_mem_GB=81.803, [valid] class_loss=2.593, geo_loss_downstream=0.212, inter_geo_loss_layer32=0.019, inter_geo_loss_layer36=0.022, inter_geo_loss_layer40=0.021, inter_geo_loss_layer44=0.020, inter_geo_loss_mean=0.021, geo_loss_all=0.135, loss=2.102, accuracy=0.895, time=23 minutes and 22.56 seconds, total_count=146382, gpu_max_cached_mem_GB=81.803
[gpue01] 2025-06-02 11:22:41,952 (trainer:469) INFO: The best model has been updated: valid.accuracy
[gpue01] 2025-06-02 11:22:41,971 (trainer:523) INFO: The model files were removed: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/30epoch.pth
[gpue01] 2025-06-02 11:22:41,972 (trainer:335) INFO: 32/50epoch started. Estimated time to finish: 21 hours, 1 minute and 21.81 seconds
[gpue01] 2025-06-02 11:27:07,026 (trainer:816) INFO: 32epoch:train:1-100batch: iter_time=0.003, forward_time=0.400, class_loss=1.536, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.312, accuracy=0.938, backward_time=0.832, grad_norm=74.839, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=3.294e-07, train_time=5.039
[gpue01] 2025-06-02 11:29:21,087 (trainer:816) INFO: 32epoch:train:101-200batch: iter_time=9.411e-05, forward_time=0.386, class_loss=1.352, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.276, accuracy=0.945, backward_time=0.935, grad_norm=43.988, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.264e-07, train_time=5.362
[gpue01] 2025-06-02 11:31:30,188 (trainer:816) INFO: 32epoch:train:201-300batch: iter_time=9.077e-05, forward_time=0.347, class_loss=1.172, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.240, accuracy=0.957, backward_time=0.924, grad_norm=62.186, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=3.234e-07, train_time=5.163
[gpue01] 2025-06-02 11:33:47,390 (trainer:816) INFO: 32epoch:train:301-400batch: iter_time=1.018e-04, forward_time=0.344, class_loss=1.022, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.210, accuracy=0.957, backward_time=1.009, grad_norm=44.390, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.205e-07, train_time=5.487
[gpue01] 2025-06-02 11:35:49,789 (trainer:816) INFO: 32epoch:train:401-500batch: iter_time=1.054e-04, forward_time=0.297, class_loss=1.172, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.240, accuracy=0.952, backward_time=0.908, grad_norm=75.410, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.175e-07, train_time=4.895
[gpue01] 2025-06-02 11:37:41,933 (trainer:816) INFO: 32epoch:train:501-600batch: iter_time=9.698e-05, forward_time=0.280, class_loss=1.211, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.247, accuracy=0.953, backward_time=0.819, grad_norm=79.169, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.146e-07, train_time=4.485
[gpue01] 2025-06-02 11:39:46,309 (trainer:816) INFO: 32epoch:train:601-700batch: iter_time=9.339e-05, forward_time=0.293, class_loss=1.218, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.249, accuracy=0.953, backward_time=0.930, grad_norm=58.856, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.117e-07, train_time=4.974
[gpue01] 2025-06-02 11:41:54,707 (trainer:816) INFO: 32epoch:train:701-800batch: iter_time=1.009e-04, forward_time=0.292, class_loss=0.833, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.172, accuracy=0.968, backward_time=0.972, grad_norm=58.946, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.036, optim0_lr0=3.089e-07, train_time=5.135
[gpue01] 2025-06-02 11:44:02,832 (trainer:816) INFO: 32epoch:train:801-900batch: iter_time=9.829e-05, forward_time=0.264, class_loss=1.373, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.280, accuracy=0.942, backward_time=0.998, grad_norm=62.452, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.060e-07, train_time=5.124
[gpue01] 2025-06-02 11:45:51,409 (trainer:816) INFO: 32epoch:train:901-1000batch: iter_time=9.919e-05, forward_time=0.229, class_loss=0.924, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.012, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.190, accuracy=0.963, backward_time=0.834, grad_norm=42.125, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=3.032e-07, train_time=4.342
[gpue01] 2025-06-02 11:47:40,644 (trainer:816) INFO: 32epoch:train:1001-1100batch: iter_time=1.009e-04, forward_time=0.222, class_loss=0.990, geo_loss_downstream=0.165, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.203, accuracy=0.957, backward_time=0.848, grad_norm=42.052, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.004e-07, train_time=4.369
[gpue01] 2025-06-02 11:49:25,715 (trainer:816) INFO: 32epoch:train:1101-1200batch: iter_time=9.471e-05, forward_time=0.231, class_loss=1.161, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.237, accuracy=0.953, backward_time=0.797, grad_norm=49.309, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=2.977e-07, train_time=4.202
[gpue01] 2025-06-02 11:51:08,460 (trainer:816) INFO: 32epoch:train:1201-1300batch: iter_time=9.436e-05, forward_time=0.238, class_loss=0.849, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.012, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.175, accuracy=0.970, backward_time=0.768, grad_norm=53.095, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=2.950e-07, train_time=4.109
[gpue01] 2025-06-02 11:53:07,433 (trainer:816) INFO: 32epoch:train:1301-1400batch: iter_time=1.007e-04, forward_time=0.257, class_loss=1.168, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.239, accuracy=0.952, backward_time=0.912, grad_norm=45.273, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=2.923e-07, train_time=4.758
[gpue01] 2025-06-02 11:55:01,420 (trainer:816) INFO: 32epoch:train:1401-1500batch: iter_time=9.979e-05, forward_time=0.244, class_loss=1.106, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.226, accuracy=0.957, backward_time=0.875, grad_norm=36.384, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=2.896e-07, train_time=4.559
[gpue01] 2025-06-02 11:56:54,340 (trainer:816) INFO: 32epoch:train:1501-1600batch: iter_time=9.456e-05, forward_time=0.253, class_loss=0.688, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.012, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.012, geo_loss_all=0.102, loss=0.143, accuracy=0.973, backward_time=0.855, grad_norm=53.800, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=2.869e-07, train_time=4.516
[gpue01] 2025-06-02 11:58:55,292 (trainer:816) INFO: 32epoch:train:1601-1700batch: iter_time=1.007e-04, forward_time=0.249, class_loss=1.404, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.286, accuracy=0.938, backward_time=0.940, grad_norm=57.055, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=2.843e-07, train_time=4.837
[gpue01] 2025-06-02 12:00:48,312 (trainer:816) INFO: 32epoch:train:1701-1800batch: iter_time=1.005e-04, forward_time=0.230, class_loss=1.274, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.102, loss=0.260, accuracy=0.943, backward_time=0.879, grad_norm=44.005, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=2.817e-07, train_time=4.520
[gpue01] 2025-06-02 12:02:52,056 (trainer:816) INFO: 32epoch:train:1801-1900batch: iter_time=1.037e-04, forward_time=0.257, class_loss=0.952, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.195, accuracy=0.962, backward_time=0.960, grad_norm=41.989, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=2.791e-07, train_time=4.949
[gpue01] 2025-06-02 12:04:49,603 (trainer:816) INFO: 32epoch:train:1901-2000batch: iter_time=1.031e-04, forward_time=0.260, class_loss=0.932, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.102, loss=0.192, accuracy=0.965, backward_time=0.894, grad_norm=54.873, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.034, optim0_lr0=2.765e-07, train_time=4.701
[gpue01] 2025-06-02 12:28:09,414 (trainer:401) INFO: 32epoch results: [train] iter_time=2.404e-04, forward_time=0.279, class_loss=1.117, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.229, accuracy=0.955, backward_time=0.894, grad_norm=54.010, clip=0.000e+00, loss_scale=8.389e+06, optim_step_time=0.035, optim0_lr0=3.023e-07, train_time=4.776, time=42 minutes and 7.83 seconds, total_count=64000, gpu_max_cached_mem_GB=82.354, [valid] class_loss=2.619, geo_loss_downstream=0.221, inter_geo_loss_layer32=0.021, inter_geo_loss_layer36=0.022, inter_geo_loss_layer40=0.022, inter_geo_loss_layer44=0.022, inter_geo_loss_mean=0.022, geo_loss_all=0.141, loss=2.123, accuracy=0.895, time=23 minutes and 19.61 seconds, total_count=151104, gpu_max_cached_mem_GB=82.354
[gpue01] 2025-06-02 12:28:22,860 (trainer:467) INFO: There are no improvements in this epoch
[gpue01] 2025-06-02 12:28:22,878 (trainer:335) INFO: 33/50epoch started. Estimated time to finish: 19 hours, 51 minutes and 48 seconds
[gpue01] 2025-06-02 12:32:47,324 (trainer:816) INFO: 33epoch:train:1-100batch: iter_time=0.002, forward_time=0.393, class_loss=1.030, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.211, accuracy=0.962, backward_time=0.823, grad_norm=47.793, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.740e-07, train_time=4.976
[gpue01] 2025-06-02 12:35:01,310 (trainer:816) INFO: 33epoch:train:101-200batch: iter_time=9.706e-05, forward_time=0.376, class_loss=1.497, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.305, accuracy=0.933, backward_time=0.945, grad_norm=58.238, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.715e-07, train_time=5.359
[gpue01] 2025-06-02 12:37:11,584 (trainer:816) INFO: 33epoch:train:201-300batch: iter_time=1.017e-04, forward_time=0.353, class_loss=0.605, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.126, accuracy=0.980, backward_time=0.929, grad_norm=49.261, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.690e-07, train_time=5.210
[gpue01] 2025-06-02 12:39:22,119 (trainer:816) INFO: 33epoch:train:301-400batch: iter_time=1.060e-04, forward_time=0.338, class_loss=1.064, geo_loss_downstream=0.160, inter_geo_loss_layer32=0.015, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.102, loss=0.218, accuracy=0.960, backward_time=0.948, grad_norm=72.623, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.036, optim0_lr0=2.665e-07, train_time=5.221
[gpue01] 2025-06-02 12:41:30,505 (trainer:816) INFO: 33epoch:train:401-500batch: iter_time=1.061e-04, forward_time=0.305, class_loss=1.219, geo_loss_downstream=0.160, inter_geo_loss_layer32=0.015, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.102, loss=0.249, accuracy=0.950, backward_time=0.960, grad_norm=49.901, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.036, optim0_lr0=2.641e-07, train_time=5.135
[gpue01] 2025-06-02 12:43:31,704 (trainer:816) INFO: 33epoch:train:501-600batch: iter_time=1.046e-04, forward_time=0.294, class_loss=1.601, geo_loss_downstream=0.164, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.104, loss=0.325, accuracy=0.937, backward_time=0.898, grad_norm=77.095, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.617e-07, train_time=4.847
[gpue01] 2025-06-02 12:45:34,686 (trainer:816) INFO: 33epoch:train:601-700batch: iter_time=9.555e-05, forward_time=0.290, class_loss=1.288, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.263, accuracy=0.950, backward_time=0.920, grad_norm=59.251, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.593e-07, train_time=4.918
[gpue01] 2025-06-02 12:47:38,091 (trainer:816) INFO: 33epoch:train:701-800batch: iter_time=9.700e-05, forward_time=0.266, class_loss=1.299, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.265, accuracy=0.953, backward_time=0.948, grad_norm=62.744, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.569e-07, train_time=4.935
[gpue01] 2025-06-02 12:49:30,107 (trainer:816) INFO: 33epoch:train:801-900batch: iter_time=1.092e-04, forward_time=0.250, class_loss=1.356, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.276, accuracy=0.945, backward_time=0.850, grad_norm=72.159, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.545e-07, train_time=4.480
[gpue01] 2025-06-02 12:51:37,122 (trainer:816) INFO: 33epoch:train:901-1000batch: iter_time=1.187e-04, forward_time=0.278, class_loss=1.025, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.210, accuracy=0.962, backward_time=0.972, grad_norm=63.217, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.522e-07, train_time=5.080
[gpue01] 2025-06-02 12:53:44,686 (trainer:816) INFO: 33epoch:train:1001-1100batch: iter_time=1.035e-04, forward_time=0.249, class_loss=1.190, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.243, accuracy=0.955, backward_time=1.007, grad_norm=72.056, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.034, optim0_lr0=2.499e-07, train_time=5.102
[gpue01] 2025-06-02 12:55:38,967 (trainer:816) INFO: 33epoch:train:1101-1200batch: iter_time=1.035e-04, forward_time=0.249, class_loss=0.918, geo_loss_downstream=0.161, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.189, accuracy=0.968, backward_time=0.873, grad_norm=51.495, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.034, optim0_lr0=2.476e-07, train_time=4.570
[gpue01] 2025-06-02 12:57:49,673 (trainer:816) INFO: 33epoch:train:1201-1300batch: iter_time=1.087e-04, forward_time=0.276, class_loss=0.972, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.014, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.200, accuracy=0.958, backward_time=1.010, grad_norm=58.794, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.453e-07, train_time=5.227
[gpue01] 2025-06-02 12:59:38,197 (trainer:816) INFO: 33epoch:train:1301-1400batch: iter_time=1.066e-04, forward_time=0.231, class_loss=1.716, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.012, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.348, accuracy=0.918, backward_time=0.833, grad_norm=63.048, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.034, optim0_lr0=2.431e-07, train_time=4.340
[gpue01] 2025-06-02 13:01:29,801 (trainer:816) INFO: 33epoch:train:1401-1500batch: iter_time=1.127e-04, forward_time=0.251, class_loss=1.037, geo_loss_downstream=0.160, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.101, loss=0.213, accuracy=0.957, backward_time=0.845, grad_norm=49.476, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.409e-07, train_time=4.463
[gpue01] 2025-06-02 13:03:25,989 (trainer:816) INFO: 33epoch:train:1501-1600batch: iter_time=1.010e-04, forward_time=0.253, class_loss=1.036, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.103, loss=0.212, accuracy=0.957, backward_time=0.887, grad_norm=52.597, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.034, optim0_lr0=2.387e-07, train_time=4.647
[gpue01] 2025-06-02 13:05:27,481 (trainer:816) INFO: 33epoch:train:1601-1700batch: iter_time=9.690e-05, forward_time=0.251, class_loss=1.181, geo_loss_downstream=0.163, inter_geo_loss_layer32=0.015, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.241, accuracy=0.950, backward_time=0.944, grad_norm=62.438, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.365e-07, train_time=4.859
[gpue01] 2025-06-02 13:07:19,919 (trainer:816) INFO: 33epoch:train:1701-1800batch: iter_time=9.964e-05, forward_time=0.227, class_loss=0.954, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.196, accuracy=0.963, backward_time=0.877, grad_norm=71.114, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.343e-07, train_time=4.497
[gpue01] 2025-06-02 13:09:14,103 (trainer:816) INFO: 33epoch:train:1801-1900batch: iter_time=9.797e-05, forward_time=0.250, class_loss=1.066, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.013, inter_geo_loss_layer36=0.013, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.218, accuracy=0.957, backward_time=0.871, grad_norm=49.760, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.321e-07, train_time=4.567
[gpue01] 2025-06-02 13:11:13,214 (trainer:816) INFO: 33epoch:train:1901-2000batch: iter_time=9.810e-05, forward_time=0.248, class_loss=0.987, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.102, loss=0.203, accuracy=0.958, backward_time=0.923, grad_norm=45.787, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.034, optim0_lr0=2.300e-07, train_time=4.764
[gpue01] 2025-06-02 13:34:34,616 (trainer:401) INFO: 33epoch results: [train] iter_time=2.171e-04, forward_time=0.281, class_loss=1.152, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.013, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.013, geo_loss_all=0.102, loss=0.236, accuracy=0.954, backward_time=0.913, grad_norm=59.442, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.035, optim0_lr0=2.514e-07, train_time=4.860, time=42 minutes and 50.54 seconds, total_count=66000, gpu_max_cached_mem_GB=82.354, [valid] class_loss=2.631, geo_loss_downstream=0.208, inter_geo_loss_layer32=0.019, inter_geo_loss_layer36=0.021, inter_geo_loss_layer40=0.021, inter_geo_loss_layer44=0.020, inter_geo_loss_mean=0.020, geo_loss_all=0.133, loss=2.131, accuracy=0.893, time=23 minutes and 21.19 seconds, total_count=155826, gpu_max_cached_mem_GB=82.354
[gpue01] 2025-06-02 13:34:48,118 (trainer:467) INFO: There are no improvements in this epoch
[gpue01] 2025-06-02 13:34:48,138 (trainer:523) INFO: The model files were removed: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_raw/32epoch.pth
[gpue01] 2025-06-02 13:34:48,138 (trainer:335) INFO: 34/50epoch started. Estimated time to finish: 18 hours, 46 minutes and 18.15 seconds
[gpue01] 2025-06-02 13:39:13,564 (trainer:816) INFO: 34epoch:train:1-100batch: iter_time=0.003, forward_time=0.403, class_loss=1.001, geo_loss_downstream=0.162, inter_geo_loss_layer32=0.014, inter_geo_loss_layer36=0.014, inter_geo_loss_layer40=0.014, inter_geo_loss_layer44=0.013, inter_geo_loss_mean=0.014, geo_loss_all=0.103, loss=0.205, accuracy=0.962, backward_time=0.814, grad_norm=34.245, clip=0.000e+00, loss_scale=1.678e+07, optim_step_time=0.036, optim0_lr0=2.279e-07, train_time=4.988
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.4.log
ADDED
The diff for this file is too large to render.
exp_combined/lid_mms_ecapa_upcon_32_44_it0.4_shared_trainable_raw/train.log
ADDED
@@ -0,0 +1,388 @@
# python3 -m espnet2.bin.lid_train --use_preprocessor true --resume true --ignore_init_mismatch false --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/wav.scp,speech,sound --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/utt2spk,lid_labels,text --train_shape_file exp_all_no_filter_raw/spk_stats_16k/train/speech_shape --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/wav.scp,speech,sound --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/utt2spk,lid_labels,text --spk2utt dump/raw/train_all_no_filter_lang/spk2utt --spk_num 157 --fold_length 120000 --valid_shape_file exp_all_no_filter_raw/spk_stats_16k/valid/speech_shape --config /work/nvme/bbjs/qwang20/espnet/egs2/lid_delta/lid1/conf/mms_1b_all_no_filter_balanced_dataset/mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch.yaml --use_wandb true --wandb_project lid --wandb_entity qingzhew-carnegie-mellon-university --ngpu 1 --multiprocessing_distributed True
# Started at Wed Jun 4 20:37:36 CDT 2025
#
/u/qwang20/miniconda3/envs/espnet2/bin/python3 /work/nvme/bbjs/qwang20/espnet/espnet2/bin/lid_train.py --use_preprocessor true --resume true --ignore_init_mismatch false --output_dir exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/wav.scp,speech,sound --train_data_path_and_name_and_type dump/raw/train_all_no_filter_lang/utt2spk,lid_labels,text --train_shape_file exp_all_no_filter_raw/spk_stats_16k/train/speech_shape --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/wav.scp,speech,sound --valid_data_path_and_name_and_type dump/raw/dev_ml_superb2_lang/utt2spk,lid_labels,text --spk2utt dump/raw/train_all_no_filter_lang/spk2utt --spk_num 157 --fold_length 120000 --valid_shape_file exp_all_no_filter_raw/spk_stats_16k/valid/speech_shape --config /work/nvme/bbjs/qwang20/espnet/egs2/lid_delta/lid1/conf/mms_1b_all_no_filter_balanced_dataset/mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch.yaml --use_wandb true --wandb_project lid --wandb_entity qingzhew-carnegie-mellon-university --ngpu 1 --multiprocessing_distributed True
/work/nvme/bbjs/qwang20/s3prl/s3prl/upstream/byol_s/byol_a/common.py:20: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
  torchaudio.set_audio_backend("sox_io")
[gpue03] 2025-06-04 20:38:25,336 (abs_task:1420) INFO: pytorch.version=2.4.0+cu118, cuda.available=True, cudnn.version=90100, cudnn.benchmark=True, cudnn.deterministic=False
[gpue03] 2025-06-04 20:38:25,343 (abs_task:1421) INFO: Model structure:
ESPnetLIDUpstreamConditionModel(
  (frontend): S3prlFrontendCondition(
    (upstream): S3PRLUpstreamCondition(
      (upstream): UpstreamExpertCondition(
        (model): Wav2Vec2ModelCondition(
          (feature_extractor): Wav2Vec2FeatureEncoder(
            (conv_layers): ModuleList(
              (0): Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(1, 512, kernel_size=(10,), stride=(5,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (1-4): 4 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(3,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
              (5-6): 2 x Wav2Vec2LayerNormConvLayer(
                (conv): Conv1d(512, 512, kernel_size=(2,), stride=(2,))
                (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
                (activation): GELUActivation()
              )
            )
          )
          (feature_projection): Wav2Vec2FeatureProjection(
            (layer_norm): LayerNorm((512,), eps=1e-05, elementwise_affine=True)
            (projection): Linear(in_features=512, out_features=1280, bias=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (encoder): Wav2Vec2EncoderCondition(
            (pos_conv_embed): Wav2Vec2PositionalConvEmbedding(
              (conv): ParametrizedConv1d(
                1280, 1280, kernel_size=(128,), stride=(1,), padding=(64,), groups=16
                (parametrizations): ModuleDict(
                  (weight): ParametrizationList(
                    (0): _WeightNorm()
                  )
                )
              )
              (padding): Wav2Vec2SamePadLayer()
              (activation): GELUActivation()
            )
            (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
            (layers): ModuleList(
              (0-47): 48 x Wav2Vec2EncoderLayerStableLayerNorm(
                (attention): Wav2Vec2SdpaAttention(
                  (k_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (v_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (q_proj): Linear(in_features=1280, out_features=1280, bias=True)
                  (out_proj): Linear(in_features=1280, out_features=1280, bias=True)
                )
                (dropout): Dropout(p=0.1, inplace=False)
                (layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
                (feed_forward): Wav2Vec2FeedForward(
                  (intermediate_dropout): Dropout(p=0.0, inplace=False)
                  (intermediate_dense): Linear(in_features=1280, out_features=5120, bias=True)
                  (intermediate_act_fn): GELUActivation()
                  (output_dense): Linear(in_features=5120, out_features=1280, bias=True)
                  (output_dropout): Dropout(p=0.1, inplace=False)
                )
                (final_layer_norm): LayerNorm((1280,), eps=1e-05, elementwise_affine=True)
              )
            )
            (ecapa_encoder): ModuleDict(
              (32): IdentityEncoder()
              (36): IdentityEncoder()
              (40): IdentityEncoder()
              (44): IdentityEncoder()
            )
            (pooling): ModuleDict(
              (32): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (36): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (40): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
              (44): ChnAttnStatPooling(
                (attention): Sequential(
                  (0): Conv1d(3840, 128, kernel_size=(1,), stride=(1,))
                  (1): ReLU()
                  (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                  (3): Conv1d(128, 1280, kernel_size=(1,), stride=(1,))
                )
                (softmax): Softmax(dim=2)
              )
            )
            (projector): ModuleDict(
              (32): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (36): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (40): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
              (44): RawNet3Projector(
                (bn): BatchNorm1d(2560, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
                (fc): Linear(in_features=2560, out_features=192, bias=True)
              )
            )
            (lang2vec_head): ModuleDict(
              (32): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (36): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (40): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (44): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
            )
            (aamsoftmax_weight): ParameterDict()
            (lang2vec_conditioning_projs): Linear(in_features=299, out_features=1280, bias=True)
            (aamsoftmax_loss): AAMSoftmaxSCTopKLang2Vec(
              (ce): CrossEntropyLoss()
              (lang2vec_head): Sequential(
                (0): Linear(in_features=192, out_features=299, bias=True)
              )
              (lang2vec_loss): MSELoss()
            )
          )
        )
      )
    )
    (featurizer): Featurizer()
  )
  (normalize): UtteranceMVN(norm_means=True, norm_vars=False)
  (encoder): EcapaTdnnEncoder(
    (conv): Conv1d(1280, 512, kernel_size=(5,), stride=(1,), padding=(2,))
    (relu): ReLU()
    (bn): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (layer1): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(2,), dilation=(2,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer2): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(3,), dilation=(3,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer3): EcapaBlock(
      (conv1): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn1): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (convs): ModuleList(
        (0-6): 7 x Conv1d(64, 64, kernel_size=(3,), stride=(1,), padding=(4,), dilation=(4,))
      )
      (bns): ModuleList(
        (0-6): 7 x BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
      (conv3): Conv1d(512, 512, kernel_size=(1,), stride=(1,))
      (bn3): BatchNorm1d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU()
      (se): SEModule(
        (se): Sequential(
          (0): AdaptiveAvgPool1d(output_size=1)
          (1): Conv1d(512, 128, kernel_size=(1,), stride=(1,))
          (2): ReLU()
          (3): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (4): Conv1d(128, 512, kernel_size=(1,), stride=(1,))
          (5): Sigmoid()
        )
      )
    )
    (layer4): Conv1d(1536, 1536, kernel_size=(1,), stride=(1,))
    (mp3): MaxPool1d(kernel_size=3, stride=3, padding=0, dilation=1, ceil_mode=False)
  )
  (pooling): ChnAttnStatPooling(
    (attention): Sequential(
      (0): Conv1d(4608, 128, kernel_size=(1,), stride=(1,))
      (1): ReLU()
      (2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (3): Conv1d(128, 1536, kernel_size=(1,), stride=(1,))
    )
    (softmax): Softmax(dim=2)
  )
  (projector): RawNet3Projector(
    (bn): BatchNorm1d(3072, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (fc): Linear(in_features=3072, out_features=192, bias=True)
  )
  (loss): AAMSoftmaxSCTopKLang2Vec(
|
| 255 |
+
(ce): CrossEntropyLoss()
|
| 256 |
+
(lang2vec_head): Sequential(
|
| 257 |
+
(0): Linear(in_features=192, out_features=299, bias=True)
|
| 258 |
+
)
|
| 259 |
+
(lang2vec_loss): MSELoss()
|
| 260 |
+
)
|
| 261 |
+
)
|
| 262 |
+
|
| 263 |
+
Model summary:
|
| 264 |
+
Class Name: ESPnetLIDUpstreamConditionModel
|
| 265 |
+
Total Number of model parameters: 977.14 M
|
| 266 |
+
Number of trainable parameters: 977.14 M (100.0%)
|
| 267 |
+
Size: 3.91 GB
|
| 268 |
+
Type: torch.float32
|
| 269 |
+
[gpue03] 2025-06-04 20:38:25,343 (abs_task:1424) INFO: Optimizer:
|
| 270 |
+
Adam (
|
| 271 |
+
Parameter Group 0
|
| 272 |
+
amsgrad: False
|
| 273 |
+
betas: [0.9, 0.98]
|
| 274 |
+
capturable: False
|
| 275 |
+
differentiable: False
|
| 276 |
+
eps: 1e-08
|
| 277 |
+
foreach: None
|
| 278 |
+
fused: None
|
| 279 |
+
initial_lr: 1e-05
|
| 280 |
+
lr: 6.0032e-06
|
| 281 |
+
maximize: False
|
| 282 |
+
weight_decay: 0
|
| 283 |
+
)
|
| 284 |
+
[gpue03] 2025-06-04 20:38:25,343 (abs_task:1425) INFO: Scheduler: TristageLR(warmup_steps=1250)(hold_steps=5000)(decay_steps=6250)(init_lr_scale=0.6)(final_lr_scale=0.1)(decay_factor=0.00036841361487904725)
|
| 285 |
+
[gpue03] 2025-06-04 20:38:25,349 (abs_task:1434) INFO: Saving the configuration in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/config.yaml
|
| 286 |
+
[gpue03] 2025-06-04 20:38:25,625 (preprocessor:2245) INFO: Using lang2vec geo
|
| 287 |
+
[gpue03] 2025-06-04 20:38:41,537 (abs_task:1899) WARNING: Reading dump/raw/train_all_no_filter_lang/category2utt
|
| 288 |
+
[gpue03] 2025-06-04 20:38:41,539 (abs_task:1946) WARNING: Reading dump/raw/train_all_no_filter_lang/dataset2utt
|
| 289 |
+
[gpue03] 2025-06-04 20:38:41,540 (abs_task:1962) WARNING: Reading dump/raw/train_all_no_filter_lang/utt2dataset
|
| 290 |
+
[gpue03] 2025-06-04 20:40:59,199 (abs_task:1997) INFO: [train] dataset:
|
| 291 |
+
ESPnetDataset(
|
| 292 |
+
speech: {"path": "dump/raw/train_all_no_filter_lang/wav.scp", "type": "sound"}
|
| 293 |
+
lid_labels: {"path": "dump/raw/train_all_no_filter_lang/utt2spk", "type": "text"}
|
| 294 |
+
preprocess: espnet2.train.preprocessor.LIDPreprocessor(train=True, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False))
|
| 295 |
+
[gpue03] 2025-06-04 20:40:59,199 (abs_task:1998) INFO: [train] process_fn: espnet2.train.preprocessor.LIDPreprocessor(train=True, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False)
|
| 296 |
+
[gpue03] 2025-06-04 20:40:59,200 (abs_task:1999) INFO: [train] collate_fn: <class 'espnet2.train.collate_fn.CommonCollateFn'>(float_pad_value=0.0, int_pad_value=0.0)
|
| 297 |
+
[gpue03] 2025-06-04 20:40:59,200 (abs_task:2000) INFO: [train] Batch sampler: CategoryPowerSamplerBalancedDataset(N-batch=727460, batch_bins=1440000, language_upsampling_factor=0.5, dataset_upsampling_factor=0.3)
|
| 298 |
+
[gpue03] 2025-06-04 20:40:59,266 (abs_task:2001) INFO: [train] mini-batch sizes summary: N-batch=727460, mean=6.0, min=1, max=6
|
| 299 |
+
[gpue03] 2025-06-04 20:40:59,684 (preprocessor:2245) INFO: Using lang2vec geo
|
| 300 |
+
[gpue03] 2025-06-04 20:41:12,217 (abs_task:1899) WARNING: Reading dump/raw/dev_ml_superb2_lang/category2utt
|
| 301 |
+
[gpue03] 2025-06-04 20:41:12,219 (abs_task:1946) WARNING: Reading dump/raw/dev_ml_superb2_lang/dataset2utt
|
| 302 |
+
[gpue03] 2025-06-04 20:41:12,221 (abs_task:1962) WARNING: Reading dump/raw/dev_ml_superb2_lang/utt2dataset
|
| 303 |
+
[gpue03] 2025-06-04 20:41:13,249 (abs_task:1997) INFO: [valid] dataset:
|
| 304 |
+
ESPnetDataset(
|
| 305 |
+
speech: {"path": "dump/raw/dev_ml_superb2_lang/wav.scp", "type": "sound"}
|
| 306 |
+
lid_labels: {"path": "dump/raw/dev_ml_superb2_lang/utt2spk", "type": "text"}
|
| 307 |
+
preprocess: espnet2.train.preprocessor.LIDPreprocessor(train=False, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False))
|
| 308 |
+
[gpue03] 2025-06-04 20:41:13,249 (abs_task:1998) INFO: [valid] process_fn: espnet2.train.preprocessor.LIDPreprocessor(train=False, spk2utt=dump/raw/train_all_no_filter_lang/spk2utt, len(spk2label)=157, fix_duration=False)
|
| 309 |
+
[gpue03] 2025-06-04 20:41:13,249 (abs_task:1999) INFO: [valid] collate_fn: <class 'espnet2.train.collate_fn.CommonCollateFn'>(float_pad_value=0.0, int_pad_value=0.0)
|
| 310 |
+
[gpue03] 2025-06-04 20:41:13,249 (abs_task:2000) INFO: [valid] Batch sampler: CategoryPowerSamplerBalancedDataset(N-batch=4722, batch_bins=1440000, language_upsampling_factor=0.5, dataset_upsampling_factor=0.3)
|
| 311 |
+
[gpue03] 2025-06-04 20:41:13,250 (abs_task:2001) INFO: [valid] mini-batch sizes summary: N-batch=4722, mean=6.0, min=4, max=6
|
| 312 |
+
wandb: Currently logged in as: qingzhew (qingzhew-carnegie-mellon-university) to https://api.wandb.ai. Use `wandb login --relogin` to force relogin
|
| 313 |
+
wandb: Tracking run with wandb version 0.19.10
|
| 314 |
+
wandb: Run data is saved locally in exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/wandb/run-20250604_204114-htm68ys8
|
| 315 |
+
wandb: Run `wandb offline` to turn off syncing.
|
| 316 |
+
wandb: Syncing run mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch
|
| 317 |
+
wandb: ⭐️ View project at https://wandb.ai/qingzhew-carnegie-mellon-university/lid
|
| 318 |
+
wandb: 🚀 View run at https://wandb.ai/qingzhew-carnegie-mellon-university/lid/runs/htm68ys8
|
| 319 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:218: FutureWarning: `torch.cuda.amp.GradScaler(args...)` is deprecated. Please use `torch.amp.GradScaler('cuda', args...)` instead.
|
| 320 |
+
scaler = GradScaler()
|
| 321 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/train/trainer.py:159: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
|
| 322 |
+
states = torch.load(
|
| 323 |
+
[gpue03] 2025-06-04 20:41:24,303 (trainer:176) INFO: The training was resumed using exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/checkpoint.pth
|
| 324 |
+
[gpue03] 2025-06-04 20:41:24,647 (trainer:251) INFO: Frontend featurizer weights for each layer:
|
| 325 |
+
Parameter containing:
|
| 326 |
+
tensor([-0.0056, -0.0141, -0.0168, -0.0187, -0.0203, -0.0225, -0.0231, -0.0246,
|
| 327 |
+
-0.0253, -0.0252, -0.0254, -0.0241, -0.0226, -0.0200, -0.0162, -0.0120,
|
| 328 |
+
-0.0095, -0.0059, -0.0017, 0.0058, 0.0097, 0.0142, 0.0175, 0.0196,
|
| 329 |
+
0.0211, 0.0224, 0.0228, 0.0230, 0.0226, 0.0224, 0.0215, 0.0210,
|
| 330 |
+
0.0196, 0.0176, 0.0157, 0.0126, 0.0095, 0.0070, 0.0051, 0.0037,
|
| 331 |
+
0.0020, -0.0003, -0.0030, -0.0056, -0.0076, -0.0090, -0.0096, -0.0102,
|
| 332 |
+
-0.0102], device='cuda:0', requires_grad=True)
|
| 333 |
+
[gpue03] 2025-06-04 20:41:24,648 (trainer:267) INFO: Error: 'Linear' object is not subscriptable
|
| 334 |
+
[gpue03] 2025-06-04 20:41:24,648 (trainer:272) INFO: cos_mp: 1.0
|
| 335 |
+
[gpue03] 2025-06-04 20:41:24,648 (trainer:273) INFO: easy_margin: False
|
| 336 |
+
[gpue03] 2025-06-04 20:41:24,648 (trainer:281) WARNING: The training has already reached at max_epoch: 34
|
| 337 |
+
[gpue03] 2025-06-04 20:41:24,659 (trainer:541) INFO: The training was finished at 33 epochs
|
| 338 |
+
[gpue03] 2025-06-04 20:41:24,660 (average_nbest_models:69) INFO: Averaging 2best models: criterion="valid.accuracy": exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/valid.accuracy.ave_2best.pth
|
| 339 |
+
/work/nvme/bbjs/qwang20/espnet/espnet2/main_funcs/average_nbest_models.py:77: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
|
| 340 |
+
_loaded[e] = torch.load(
|
| 341 |
+
[gpue03] 2025-06-04 20:41:30,224 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.32.attention.2.num_batches_tracked instead of averaging
|
| 342 |
+
[gpue03] 2025-06-04 20:41:30,225 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.36.attention.2.num_batches_tracked instead of averaging
|
| 343 |
+
[gpue03] 2025-06-04 20:41:30,225 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.40.attention.2.num_batches_tracked instead of averaging
|
| 344 |
+
[gpue03] 2025-06-04 20:41:30,226 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.pooling.44.attention.2.num_batches_tracked instead of averaging
|
| 345 |
+
[gpue03] 2025-06-04 20:41:30,227 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.32.bn.num_batches_tracked instead of averaging
|
| 346 |
+
[gpue03] 2025-06-04 20:41:30,227 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.36.bn.num_batches_tracked instead of averaging
|
| 347 |
+
[gpue03] 2025-06-04 20:41:30,227 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.40.bn.num_batches_tracked instead of averaging
|
| 348 |
+
[gpue03] 2025-06-04 20:41:30,228 (average_nbest_models:96) INFO: Accumulating frontend.upstream.upstream.model.encoder.projector.44.bn.num_batches_tracked instead of averaging
|
| 349 |
+
[gpue03] 2025-06-04 20:41:30,230 (average_nbest_models:96) INFO: Accumulating encoder.bn.num_batches_tracked instead of averaging
|
| 350 |
+
[gpue03] 2025-06-04 20:41:30,231 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bn1.num_batches_tracked instead of averaging
|
| 351 |
+
[gpue03] 2025-06-04 20:41:30,231 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.0.num_batches_tracked instead of averaging
|
| 352 |
+
[gpue03] 2025-06-04 20:41:30,231 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.1.num_batches_tracked instead of averaging
|
| 353 |
+
[gpue03] 2025-06-04 20:41:30,231 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.2.num_batches_tracked instead of averaging
|
| 354 |
+
[gpue03] 2025-06-04 20:41:30,231 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.3.num_batches_tracked instead of averaging
|
| 355 |
+
[gpue03] 2025-06-04 20:41:30,231 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.4.num_batches_tracked instead of averaging
|
| 356 |
+
[gpue03] 2025-06-04 20:41:30,231 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.5.num_batches_tracked instead of averaging
|
| 357 |
+
[gpue03] 2025-06-04 20:41:30,232 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bns.6.num_batches_tracked instead of averaging
|
| 358 |
+
[gpue03] 2025-06-04 20:41:30,232 (average_nbest_models:96) INFO: Accumulating encoder.layer1.bn3.num_batches_tracked instead of averaging
|
| 359 |
+
[gpue03] 2025-06-04 20:41:30,232 (average_nbest_models:96) INFO: Accumulating encoder.layer1.se.se.3.num_batches_tracked instead of averaging
|
| 360 |
+
[gpue03] 2025-06-04 20:41:30,232 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bn1.num_batches_tracked instead of averaging
|
| 361 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.0.num_batches_tracked instead of averaging
|
| 362 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.1.num_batches_tracked instead of averaging
|
| 363 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.2.num_batches_tracked instead of averaging
|
| 364 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.3.num_batches_tracked instead of averaging
|
| 365 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.4.num_batches_tracked instead of averaging
|
| 366 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.5.num_batches_tracked instead of averaging
|
| 367 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bns.6.num_batches_tracked instead of averaging
|
| 368 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.bn3.num_batches_tracked instead of averaging
|
| 369 |
+
[gpue03] 2025-06-04 20:41:30,233 (average_nbest_models:96) INFO: Accumulating encoder.layer2.se.se.3.num_batches_tracked instead of averaging
|
| 370 |
+
[gpue03] 2025-06-04 20:41:30,234 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bn1.num_batches_tracked instead of averaging
|
| 371 |
+
[gpue03] 2025-06-04 20:41:30,234 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.0.num_batches_tracked instead of averaging
|
| 372 |
+
[gpue03] 2025-06-04 20:41:30,234 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.1.num_batches_tracked instead of averaging
|
| 373 |
+
[gpue03] 2025-06-04 20:41:30,234 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.2.num_batches_tracked instead of averaging
|
| 374 |
+
[gpue03] 2025-06-04 20:41:30,234 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.3.num_batches_tracked instead of averaging
|
| 375 |
+
[gpue03] 2025-06-04 20:41:30,234 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.4.num_batches_tracked instead of averaging
|
| 376 |
+
[gpue03] 2025-06-04 20:41:30,234 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.5.num_batches_tracked instead of averaging
|
| 377 |
+
[gpue03] 2025-06-04 20:41:30,234 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bns.6.num_batches_tracked instead of averaging
|
| 378 |
+
[gpue03] 2025-06-04 20:41:30,235 (average_nbest_models:96) INFO: Accumulating encoder.layer3.bn3.num_batches_tracked instead of averaging
|
| 379 |
+
[gpue03] 2025-06-04 20:41:30,235 (average_nbest_models:96) INFO: Accumulating encoder.layer3.se.se.3.num_batches_tracked instead of averaging
|
| 380 |
+
[gpue03] 2025-06-04 20:41:30,237 (average_nbest_models:96) INFO: Accumulating pooling.attention.2.num_batches_tracked instead of averaging
|
| 381 |
+
[gpue03] 2025-06-04 20:41:30,237 (average_nbest_models:96) INFO: Accumulating projector.bn.num_batches_tracked instead of averaging
|
| 382 |
+
wandb:
|
| 383 |
+
wandb: 🚀 View run mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch at: https://wandb.ai/qingzhew-carnegie-mellon-university/lid/runs/htm68ys8
|
| 384 |
+
wandb: ⭐️ View project at: https://wandb.ai/qingzhew-carnegie-mellon-university/lid
|
| 385 |
+
wandb: Synced 5 W&B file(s), 0 media file(s), 0 artifact file(s) and 0 other file(s)
|
| 386 |
+
wandb: Find logs at: exp_all_no_filter_raw/spk_mms_ecapa_upcon_32_44_it0.4_sharedCondProj_butUpdate_50k_lr1e-5_datasetup0.3_backup_33epoch_raw/wandb/run-20250604_204114-htm68ys8/logs
|
| 387 |
+
# Accounting: time=240 threads=1
|
| 388 |
+
# Ended (code 0) at Wed Jun 4 20:41:36 CDT 2025, elapsed time 240 seconds
|