Masik001 committed on
Commit
156c422
·
1 Parent(s): 1bc95bb

Create README.md

Files changed (1)
  1. README.md +156 -0
README.md ADDED
@@ -0,0 +1,156 @@
+ ===== Application Startup at 2023-07-10 13:56:11 =====
+
+ 2023-07-10 17:36:30 | INFO | faiss.loader | Loading faiss with AVX2 support.
+ 2023-07-10 17:36:30 | INFO | faiss.loader | Successfully loaded faiss with AVX2 support.
+ No supported NVIDIA GPU detected; using CPU for inference
+ 2023-07-10 17:36:31 | INFO | fairseq.tasks.hubert_pretraining | current directory is /home/user/app
+ 2023-07-10 17:36:31 | INFO | fairseq.tasks.hubert_pretraining | HubertPretrainingTask Config {'_name': 'hubert_pretraining', 'data': 'metadata', 'fine_tuning': False, 'labels': ['km'], 'label_dir': 'label', 'label_rate': 50.0, 'sample_rate': 16000, 'normalize': False, 'enable_padding': False, 'max_keep_size': None, 'max_sample_size': 250000, 'min_sample_size': 32000, 'single_target': False, 'random_crop': True, 'pad_audio': False}
+ 2023-07-10 17:36:31 | INFO | fairseq.models.hubert.hubert | HubertModel Config: {'_name': 'hubert', 'label_rate': 50.0, 'extractor_mode': default, 'encoder_layers': 12, 'encoder_embed_dim': 768, 'encoder_ffn_embed_dim': 3072, 'encoder_attention_heads': 12, 'activation_fn': gelu, 'layer_type': transformer, 'dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'encoder_layerdrop': 0.05, 'dropout_input': 0.1, 'dropout_features': 0.1, 'final_dim': 256, 'untie_final_proj': True, 'layer_norm_first': False, 'conv_feature_layers': '[(512,10,5)] + [(512,3,2)] * 4 + [(512,2,2)] * 2', 'conv_bias': False, 'logit_temp': 0.1, 'target_glu': False, 'feature_grad_mult': 0.1, 'mask_length': 10, 'mask_prob': 0.8, 'mask_selection': static, 'mask_other': 0.0, 'no_mask_overlap': False, 'mask_min_space': 1, 'mask_channel_length': 10, 'mask_channel_prob': 0.0, 'mask_channel_selection': static, 'mask_channel_other': 0.0, 'no_mask_channel_overlap': False, 'mask_channel_min_space': 1, 'conv_pos': 128, 'conv_pos_groups': 16, 'latent_temp': [2.0, 0.5, 0.999995], 'skip_masked': False, 'skip_nomask': False, 'checkpoint_activations': False, 'required_seq_len_multiple': 2, 'depthwise_conv_kernel_size': 31, 'attn_type': '', 'pos_enc_type': 'abs', 'fp16': False}
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: aether-jp / added_IVF865_Flat_nprobe_1_aether-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: albedo-jp / added_IVF641_Flat_nprobe_1_albedo-jp_v1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: alhaitham-jp / added_IVF519_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: ayaka-jp / added_IVF1018_Flat_nprobe_1_ayaka_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: ayato-jp / added_IVF1304_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: barbara-jp / added_IVF548_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: charlotte-jp / added_IVF1318_Flat_nprobe_1_charlotte-jp_v2_400.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: childe-jp / added_IVF684_Flat_nprobe_1_childe-v2_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: chongyun-jp / added_IVF545_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: cyno-jp / added_IVF380_Flat_nprobe_1_cyno-jp_v1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: diluc-jp / added_IVF1511_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: eula-jp / added_IVF2219_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: faruzan-jp / added_IVF256_Flat_nprobe_1_faruzan-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: fischl-jp / added_IVF1225_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: ganyu-jp / added_IVF1636_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: heizou-jp / added_IVF466_Flat_nprobe_1_heizou-jp_v1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: hutao-jp / added_IVF265_Flat_nprobe_5.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: itto-jp / added_IVF4454_Flat_nprobe_1_itto-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: kaeya-jp / added_IVF1655_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: kaveh-jp / added_IVF613_Flat_nprobe_1_kaveh_v2_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: kazuha-jp / added_IVF860_Flat_nprobe_1_kazuha_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: keqing-jp / added_IVF1634_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: kirara-jp / added_IVF672_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: klee-jp / added_IVF282_Flat_nprobe_5.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: kokomi-jp / added_IVF934_Flat_nprobe_1_kokomi_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: lumine-jp / added_IVF938_Flat_nprobe_1_lumine-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: mona-jp / added_IVF2165_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: nahida-jp / added_IVF1062_Flat_nprobe_1_nahida-v2_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: nilou-jp / added_IVF218_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: paimon-jp / added_IVF3904_Flat_nprobe_1_paimon-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: raiden-jp / added_IVF4256_Flat_nprobe_1_raiden-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: signora-jp / added_IVF478_Flat_nprobe_1_signora-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: sucrose-jp / added_IVF884_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: thoma-jp / added_IVF366_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: tighnari-jp / added_IVF446_Flat_nprobe_1_tignari-jp_v1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: venti-jp / added_IVF3591_Flat_nprobe_1_venti-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: wanderer-jp / added_IVF953_Flat_nprobe_1_wanderer-v2_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: xiao-jp / added_IVF3205_Flat_nprobe_1_xiao-jp_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: yae-jp / added_IVF1097_Flat_nprobe_1_yae-v2_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: yanfei-jp / added_IVF1271_Flat_nprobe_1_yanfei-v2_v2.index | (V2)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: yelan-jp / added_IVF2051_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: yoimiya-jp / added_IVF2034_Flat_nprobe_1.index | (V1)
+ gin_channels: 256 self.spk_embed_dim: 109
+ <All keys matched successfully>
+ Model loaded: zhongli-jp / added_IVF1672_Flat_nprobe_1.index | (V1)
+ Running on local URL: http://0.0.0.0:7860
+
+ To create a public link, set `share=True` in `launch()`.
+ [2023-07-10 17:37]: npy: 2.0945026874542236, f0: 0.05994224548339844s, infer: 17.599822521209717s
+ [2023-07-10 17:38]: npy: 3.1487624645233154, f0: 0.022048234939575195s, infer: 25.596487760543823s
+ [2023-07-10 17:39]: npy: 3.693798780441284, f0: 0.017490386962890625s, infer: 32.087180376052856s
+ [2023-07-10 17:39]: npy: 2.5506346225738525, f0: 0.013794660568237305s, infer: 26.60752511024475s
+ [2023-07-10 17:40]: npy: 2.6092371940612793, f0: 0.03858685493469238s, infer: 26.312453031539917s
+ [2023-07-10 17:41]: npy: 2.615102767944336, f0: 0.03931307792663574s, infer: 26.40330672264099s
+ [2023-07-10 17:43]: npy: 3.1028923988342285, f0: 0.05546903610229492s, infer: 32.91775321960449s
+ [2023-07-10 17:44]: npy: 2.839845657348633, f0: 0.046269893646240234s, infer: 27.98230767250061s
+ [2023-07-10 17:44]: npy: 3.3039710521698, f0: 0.020084142684936523s, infer: 29.59837293624878s
+ [2023-07-10 17:45]: npy: 3.30319881439209, f0: 0.03941464424133301s, infer: 32.42077875137329s
+ [2023-07-10 17:46]: npy: 2.90372371673584, f0: 0.0513463020324707s, infer: 28.517998695373535s
+ [2023-07-10 17:47]: npy: 3.4118876457214355, f0: 0.10508394241333008s, infer: 31.312357664108276s
+ [2023-07-10 17:47]: npy: 4.102552890777588, f0: 0.02527928352355957s, infer: 33.81402325630188s
+ [2023-07-10 17:48]: npy: 2.4004595279693604, f0: 0.09933662414550781s, infer: 29.89732074737549s
+ [2023-07-10 17:49]: npy: 3.2991466522216797, f0: 0.03225088119506836s, infer: 29.510783195495605s
+ [2023-07-10 17:49]: npy: 3.4149115085601807, f0: 0.04070758819580078s, infer: 30.8032488822937s
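The trailing `[timestamp]: npy: …, f0: …s, infer: …s` lines record per-request timings (feature extraction, pitch estimation, and inference). A minimal sketch of how such lines could be parsed to summarize CPU inference time — the regex and helper name are illustrative, not part of the app:

```python
import re

# Two timing lines copied from the log above, in the format the app emits.
log_lines = [
    "[2023-07-10 17:37]: npy: 2.0945026874542236, f0: 0.05994224548339844s, infer: 17.599822521209717s",
    "[2023-07-10 17:38]: npy: 3.1487624645233154, f0: 0.022048234939575195s, infer: 25.596487760543823s",
]

# Capture the three timing fields; only 'infer' is aggregated here.
pattern = re.compile(r"npy: (?P<npy>[\d.]+), f0: (?P<f0>[\d.]+)s, infer: (?P<infer>[\d.]+)s")

def mean_infer_seconds(lines):
    """Average the 'infer' duration across all matching timing lines."""
    times = [float(m.group("infer")) for line in lines if (m := pattern.search(line))]
    return sum(times) / len(times) if times else 0.0

print(f"mean infer: {mean_infer_seconds(log_lines):.2f}s")
```

Across the full sixteen lines the same helper would show inference averaging roughly 28–29 s per request, consistent with the CPU-only fallback reported at startup.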