bztxb committed on
Commit a5d9ade · 1 Parent(s): 0ff8b34

Update model checkpoint and configs

Files changed (7)
  1. .gitattributes +1 -0
  2. README.md +55 -3
  3. config.json +425 -0
  4. label_map.json +198 -0
  5. model.safetensors +3 -0
  6. tokenizer.json +0 -0
  7. 用法示例.jpg +3 -0
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+用法示例.jpg filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -1,3 +1,28 @@
----
-license: cc-by-nc-sa-4.0
----
+---
+license: cc-by-nc-sa-4.0
+---
+
+# Multi-label Classification Model for the Ming and Qing Veritable Records
+
+This model performs multi-label classification inference on texts from the Veritable Records of the Ming and Qing dynasties. It builds on [Jihuai/bert-ancient-chinese](https://huggingface.co/Jihuai/bert-ancient-chinese): continued pretraining on public corpora produced the Shilu-oriented pretrained model [shiluBERT](https://huggingface.co/bztxb/shiluBERT), which was then fine-tuned for this task.
+
+## Model and Data Sources
+
+- Training data source: [Veritable Records of the Joseon Dynasty](https://sillok.history.go.kr).
+- Task type: multi-label text classification.
+- Number of training samples: approximately 270,000.
+
+## Evaluation Metrics
+
+| Metric | Value |
+|---|---|
+| Sample F1 | 0.7246 |
+| Sample Precision | 0.7594 |
+| Sample Recall | 0.7321 |
+| LRAP | 0.8074 |
+| Hamming Loss | 0.0069 |
+
+## Example Usage
+
+- Try the online Space: [bztxb/shiluInfer](https://huggingface.co/spaces/bztxb/shiluInfer)
+![Space Usage Example](用法示例.jpg)
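Beyond the hosted Space, the decoding step for a multi-label head can be sketched in plain Python (a hypothetical helper, not the authors' pipeline): each of the 194 categories gets its own logit, and every label whose sigmoid score clears a threshold is returned.

```python
import math

# Hypothetical post-processing sketch: the checkpoint emits one logit per
# category; multi-label decoding keeps every label whose independent
# sigmoid probability clears a threshold (0.5 here, an assumed default).
def decode_multilabel(logits, labels, threshold=0.5):
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    return [(lab, p) for lab, p in zip(labels, probs) if p > threshold]

# Toy 4-label slice of the real 194-label space, with made-up logits:
labels = ["外交", "戰爭", "人事", "儀式"]
picked = decode_multilabel([2.0, -3.0, 0.7, -0.1], labels)
print([lab for lab, _ in picked])  # → ['外交', '人事']
```

In a real setup the logits would come from the fine-tuned `BertForSequenceClassification` checkpoint; the threshold is task-dependent and worth tuning against the sample-level F1 reported above.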
config.json ADDED
@@ -0,0 +1,425 @@
+{
+  "architectures": [
+    "BertForSequenceClassification"
+  ],
+  "attention_probs_dropout_prob": 0.1,
+  "classifier_dropout": null,
+  "directionality": "bidi",
+  "hidden_act": "gelu",
+  "hidden_dropout_prob": 0.1,
+  "hidden_size": 768,
+  "id2label": {
+    "0": "LABEL_0",
+    "1": "LABEL_1",
+    "2": "LABEL_2",
+    "3": "LABEL_3",
+    "4": "LABEL_4",
+    "5": "LABEL_5",
+    "6": "LABEL_6",
+    "7": "LABEL_7",
+    "8": "LABEL_8",
+    "9": "LABEL_9",
+    "10": "LABEL_10",
+    "11": "LABEL_11",
+    "12": "LABEL_12",
+    "13": "LABEL_13",
+    "14": "LABEL_14",
+    "15": "LABEL_15",
+    "16": "LABEL_16",
+    "17": "LABEL_17",
+    "18": "LABEL_18",
+    "19": "LABEL_19",
+    "20": "LABEL_20",
+    "21": "LABEL_21",
+    "22": "LABEL_22",
+    "23": "LABEL_23",
+    "24": "LABEL_24",
+    "25": "LABEL_25",
+    "26": "LABEL_26",
+    "27": "LABEL_27",
+    "28": "LABEL_28",
+    "29": "LABEL_29",
+    "30": "LABEL_30",
+    "31": "LABEL_31",
+    "32": "LABEL_32",
+    "33": "LABEL_33",
+    "34": "LABEL_34",
+    "35": "LABEL_35",
+    "36": "LABEL_36",
+    "37": "LABEL_37",
+    "38": "LABEL_38",
+    "39": "LABEL_39",
+    "40": "LABEL_40",
+    "41": "LABEL_41",
+    "42": "LABEL_42",
+    "43": "LABEL_43",
+    "44": "LABEL_44",
+    "45": "LABEL_45",
+    "46": "LABEL_46",
+    "47": "LABEL_47",
+    "48": "LABEL_48",
+    "49": "LABEL_49",
+    "50": "LABEL_50",
+    "51": "LABEL_51",
+    "52": "LABEL_52",
+    "53": "LABEL_53",
+    "54": "LABEL_54",
+    "55": "LABEL_55",
+    "56": "LABEL_56",
+    "57": "LABEL_57",
+    "58": "LABEL_58",
+    "59": "LABEL_59",
+    "60": "LABEL_60",
+    "61": "LABEL_61",
+    "62": "LABEL_62",
+    "63": "LABEL_63",
+    "64": "LABEL_64",
+    "65": "LABEL_65",
+    "66": "LABEL_66",
+    "67": "LABEL_67",
+    "68": "LABEL_68",
+    "69": "LABEL_69",
+    "70": "LABEL_70",
+    "71": "LABEL_71",
+    "72": "LABEL_72",
+    "73": "LABEL_73",
+    "74": "LABEL_74",
+    "75": "LABEL_75",
+    "76": "LABEL_76",
+    "77": "LABEL_77",
+    "78": "LABEL_78",
+    "79": "LABEL_79",
+    "80": "LABEL_80",
+    "81": "LABEL_81",
+    "82": "LABEL_82",
+    "83": "LABEL_83",
+    "84": "LABEL_84",
+    "85": "LABEL_85",
+    "86": "LABEL_86",
+    "87": "LABEL_87",
+    "88": "LABEL_88",
+    "89": "LABEL_89",
+    "90": "LABEL_90",
+    "91": "LABEL_91",
+    "92": "LABEL_92",
+    "93": "LABEL_93",
+    "94": "LABEL_94",
+    "95": "LABEL_95",
+    "96": "LABEL_96",
+    "97": "LABEL_97",
+    "98": "LABEL_98",
+    "99": "LABEL_99",
+    "100": "LABEL_100",
+    "101": "LABEL_101",
+    "102": "LABEL_102",
+    "103": "LABEL_103",
+    "104": "LABEL_104",
+    "105": "LABEL_105",
+    "106": "LABEL_106",
+    "107": "LABEL_107",
+    "108": "LABEL_108",
+    "109": "LABEL_109",
+    "110": "LABEL_110",
+    "111": "LABEL_111",
+    "112": "LABEL_112",
+    "113": "LABEL_113",
+    "114": "LABEL_114",
+    "115": "LABEL_115",
+    "116": "LABEL_116",
+    "117": "LABEL_117",
+    "118": "LABEL_118",
+    "119": "LABEL_119",
+    "120": "LABEL_120",
+    "121": "LABEL_121",
+    "122": "LABEL_122",
+    "123": "LABEL_123",
+    "124": "LABEL_124",
+    "125": "LABEL_125",
+    "126": "LABEL_126",
+    "127": "LABEL_127",
+    "128": "LABEL_128",
+    "129": "LABEL_129",
+    "130": "LABEL_130",
+    "131": "LABEL_131",
+    "132": "LABEL_132",
+    "133": "LABEL_133",
+    "134": "LABEL_134",
+    "135": "LABEL_135",
+    "136": "LABEL_136",
+    "137": "LABEL_137",
+    "138": "LABEL_138",
+    "139": "LABEL_139",
+    "140": "LABEL_140",
+    "141": "LABEL_141",
+    "142": "LABEL_142",
+    "143": "LABEL_143",
+    "144": "LABEL_144",
+    "145": "LABEL_145",
+    "146": "LABEL_146",
+    "147": "LABEL_147",
+    "148": "LABEL_148",
+    "149": "LABEL_149",
+    "150": "LABEL_150",
+    "151": "LABEL_151",
+    "152": "LABEL_152",
+    "153": "LABEL_153",
+    "154": "LABEL_154",
+    "155": "LABEL_155",
+    "156": "LABEL_156",
+    "157": "LABEL_157",
+    "158": "LABEL_158",
+    "159": "LABEL_159",
+    "160": "LABEL_160",
+    "161": "LABEL_161",
+    "162": "LABEL_162",
+    "163": "LABEL_163",
+    "164": "LABEL_164",
+    "165": "LABEL_165",
+    "166": "LABEL_166",
+    "167": "LABEL_167",
+    "168": "LABEL_168",
+    "169": "LABEL_169",
+    "170": "LABEL_170",
+    "171": "LABEL_171",
+    "172": "LABEL_172",
+    "173": "LABEL_173",
+    "174": "LABEL_174",
+    "175": "LABEL_175",
+    "176": "LABEL_176",
+    "177": "LABEL_177",
+    "178": "LABEL_178",
+    "179": "LABEL_179",
+    "180": "LABEL_180",
+    "181": "LABEL_181",
+    "182": "LABEL_182",
+    "183": "LABEL_183",
+    "184": "LABEL_184",
+    "185": "LABEL_185",
+    "186": "LABEL_186",
+    "187": "LABEL_187",
+    "188": "LABEL_188",
+    "189": "LABEL_189",
+    "190": "LABEL_190",
+    "191": "LABEL_191",
+    "192": "LABEL_192",
+    "193": "LABEL_193"
+  },
+  "initializer_range": 0.02,
+  "intermediate_size": 3072,
+  "label2id": {
+    "LABEL_0": 0,
+    "LABEL_1": 1,
+    "LABEL_10": 10,
+    "LABEL_100": 100,
+    "LABEL_101": 101,
+    "LABEL_102": 102,
+    "LABEL_103": 103,
+    "LABEL_104": 104,
+    "LABEL_105": 105,
+    "LABEL_106": 106,
+    "LABEL_107": 107,
+    "LABEL_108": 108,
+    "LABEL_109": 109,
+    "LABEL_11": 11,
+    "LABEL_110": 110,
+    "LABEL_111": 111,
+    "LABEL_112": 112,
+    "LABEL_113": 113,
+    "LABEL_114": 114,
+    "LABEL_115": 115,
+    "LABEL_116": 116,
+    "LABEL_117": 117,
+    "LABEL_118": 118,
+    "LABEL_119": 119,
+    "LABEL_12": 12,
+    "LABEL_120": 120,
+    "LABEL_121": 121,
+    "LABEL_122": 122,
+    "LABEL_123": 123,
+    "LABEL_124": 124,
+    "LABEL_125": 125,
+    "LABEL_126": 126,
+    "LABEL_127": 127,
+    "LABEL_128": 128,
+    "LABEL_129": 129,
+    "LABEL_13": 13,
+    "LABEL_130": 130,
+    "LABEL_131": 131,
+    "LABEL_132": 132,
+    "LABEL_133": 133,
+    "LABEL_134": 134,
+    "LABEL_135": 135,
+    "LABEL_136": 136,
+    "LABEL_137": 137,
+    "LABEL_138": 138,
+    "LABEL_139": 139,
+    "LABEL_14": 14,
+    "LABEL_140": 140,
+    "LABEL_141": 141,
+    "LABEL_142": 142,
+    "LABEL_143": 143,
+    "LABEL_144": 144,
+    "LABEL_145": 145,
+    "LABEL_146": 146,
+    "LABEL_147": 147,
+    "LABEL_148": 148,
+    "LABEL_149": 149,
+    "LABEL_15": 15,
+    "LABEL_150": 150,
+    "LABEL_151": 151,
+    "LABEL_152": 152,
+    "LABEL_153": 153,
+    "LABEL_154": 154,
+    "LABEL_155": 155,
+    "LABEL_156": 156,
+    "LABEL_157": 157,
+    "LABEL_158": 158,
+    "LABEL_159": 159,
+    "LABEL_16": 16,
+    "LABEL_160": 160,
+    "LABEL_161": 161,
+    "LABEL_162": 162,
+    "LABEL_163": 163,
+    "LABEL_164": 164,
+    "LABEL_165": 165,
+    "LABEL_166": 166,
+    "LABEL_167": 167,
+    "LABEL_168": 168,
+    "LABEL_169": 169,
+    "LABEL_17": 17,
+    "LABEL_170": 170,
+    "LABEL_171": 171,
+    "LABEL_172": 172,
+    "LABEL_173": 173,
+    "LABEL_174": 174,
+    "LABEL_175": 175,
+    "LABEL_176": 176,
+    "LABEL_177": 177,
+    "LABEL_178": 178,
+    "LABEL_179": 179,
+    "LABEL_18": 18,
+    "LABEL_180": 180,
+    "LABEL_181": 181,
+    "LABEL_182": 182,
+    "LABEL_183": 183,
+    "LABEL_184": 184,
+    "LABEL_185": 185,
+    "LABEL_186": 186,
+    "LABEL_187": 187,
+    "LABEL_188": 188,
+    "LABEL_189": 189,
+    "LABEL_19": 19,
+    "LABEL_190": 190,
+    "LABEL_191": 191,
+    "LABEL_192": 192,
+    "LABEL_193": 193,
+    "LABEL_2": 2,
+    "LABEL_20": 20,
+    "LABEL_21": 21,
+    "LABEL_22": 22,
+    "LABEL_23": 23,
+    "LABEL_24": 24,
+    "LABEL_25": 25,
+    "LABEL_26": 26,
+    "LABEL_27": 27,
+    "LABEL_28": 28,
+    "LABEL_29": 29,
+    "LABEL_3": 3,
+    "LABEL_30": 30,
+    "LABEL_31": 31,
+    "LABEL_32": 32,
+    "LABEL_33": 33,
+    "LABEL_34": 34,
+    "LABEL_35": 35,
+    "LABEL_36": 36,
+    "LABEL_37": 37,
+    "LABEL_38": 38,
+    "LABEL_39": 39,
+    "LABEL_4": 4,
+    "LABEL_40": 40,
+    "LABEL_41": 41,
+    "LABEL_42": 42,
+    "LABEL_43": 43,
+    "LABEL_44": 44,
+    "LABEL_45": 45,
+    "LABEL_46": 46,
+    "LABEL_47": 47,
+    "LABEL_48": 48,
+    "LABEL_49": 49,
+    "LABEL_5": 5,
+    "LABEL_50": 50,
+    "LABEL_51": 51,
+    "LABEL_52": 52,
+    "LABEL_53": 53,
+    "LABEL_54": 54,
+    "LABEL_55": 55,
+    "LABEL_56": 56,
+    "LABEL_57": 57,
+    "LABEL_58": 58,
+    "LABEL_59": 59,
+    "LABEL_6": 6,
+    "LABEL_60": 60,
+    "LABEL_61": 61,
+    "LABEL_62": 62,
+    "LABEL_63": 63,
+    "LABEL_64": 64,
+    "LABEL_65": 65,
+    "LABEL_66": 66,
+    "LABEL_67": 67,
+    "LABEL_68": 68,
+    "LABEL_69": 69,
+    "LABEL_7": 7,
+    "LABEL_70": 70,
+    "LABEL_71": 71,
+    "LABEL_72": 72,
+    "LABEL_73": 73,
+    "LABEL_74": 74,
+    "LABEL_75": 75,
+    "LABEL_76": 76,
+    "LABEL_77": 77,
+    "LABEL_78": 78,
+    "LABEL_79": 79,
+    "LABEL_8": 8,
+    "LABEL_80": 80,
+    "LABEL_81": 81,
+    "LABEL_82": 82,
+    "LABEL_83": 83,
+    "LABEL_84": 84,
+    "LABEL_85": 85,
+    "LABEL_86": 86,
+    "LABEL_87": 87,
+    "LABEL_88": 88,
+    "LABEL_89": 89,
+    "LABEL_9": 9,
+    "LABEL_90": 90,
+    "LABEL_91": 91,
+    "LABEL_92": 92,
+    "LABEL_93": 93,
+    "LABEL_94": 94,
+    "LABEL_95": 95,
+    "LABEL_96": 96,
+    "LABEL_97": 97,
+    "LABEL_98": 98,
+    "LABEL_99": 99
+  },
+  "layer_norm_eps": 1e-12,
+  "lstm_dropout_prob": 0.5,
+  "lstm_embedding_size": 768,
+  "max_position_embeddings": 512,
+  "model_type": "bert",
+  "num_attention_heads": 12,
+  "num_hidden_layers": 12,
+  "pad_token_id": 0,
+  "pooler_fc_size": 768,
+  "pooler_num_attention_heads": 12,
+  "pooler_num_fc_layers": 3,
+  "pooler_size_per_head": 128,
+  "pooler_type": "first_token_transform",
+  "position_embedding_type": "absolute",
+  "problem_type": "multi_label_classification",
+  "torch_dtype": "float32",
+  "transformers_version": "4.53.2",
+  "type_vocab_size": 2,
+  "use_cache": true,
+  "vocab_size": 38208
+}
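The `"problem_type": "multi_label_classification"` field in this config is what makes `BertForSequenceClassification` train with a per-label binary cross-entropy (`BCEWithLogitsLoss`) instead of softmax cross-entropy. A minimal pure-Python rendering of that loss, with illustrative numbers not taken from the model:

```python
import math

# Per-label binary cross-entropy with logits, averaged over labels --
# the loss transformers selects when problem_type is
# "multi_label_classification". Each label is an independent yes/no.
def bce_with_logits(logits, targets):
    total = 0.0
    for z, y in zip(logits, targets):
        p = 1.0 / (1.0 + math.exp(-z))
        total += -(y * math.log(p) + (1.0 - y) * math.log(1.0 - p))
    return total / len(logits)

# Two labels: one positive scored confidently, one negative scored weakly.
print(round(bce_with_logits([2.0, -1.0], [1.0, 0.0]), 4))  # → 0.2201
```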
label_map.json ADDED
@@ -0,0 +1,198 @@
+{
+  "labels": [
+    "上供",
+    "中人",
+    "中央亞",
+    "中央行政",
+    "中央軍",
+    "主副食",
+    "交通",
+    "人事",
+    "人文敎育",
+    "人物",
+    "任免",
+    "住生活",
+    "佛敎",
+    "保健",
+    "倉庫",
+    "倫理",
+    "倭",
+    "儀式",
+    "儒學",
+    "元",
+    "兩班",
+    "兵法",
+    "兵站",
+    "其他",
+    "出版",
+    "前史",
+    "勸農",
+    "化學",
+    "匠人",
+    "印刷",
+    "史學",
+    "司法",
+    "商人",
+    "商品",
+    "商業",
+    "嗜好食品",
+    "器皿祭物",
+    "國王",
+    "國用",
+    "土俗信仰",
+    "土地賣買",
+    "土木",
+    "地學",
+    "地方自治",
+    "地方行政",
+    "地方軍",
+    "外交",
+    "天氣",
+    "契",
+    "妃嬪",
+    "姓名",
+    "宅地",
+    "宗社",
+    "宗親",
+    "官廳手工",
+    "官服",
+    "宮官",
+    "宴會",
+    "家具",
+    "家屋",
+    "家族",
+    "家産",
+    "專賣",
+    "工業",
+    "市場",
+    "常服",
+    "常民",
+    "度量衡",
+    "建築",
+    "建設",
+    "彈劾",
+    "役",
+    "思想",
+    "恤兵",
+    "戰爭",
+    "戶口",
+    "戶籍",
+    "手工業品",
+    "手數料",
+    "技術敎育",
+    "採鑛",
+    "政論",
+    "政變",
+    "故事",
+    "敎育",
+    "救恤",
+    "數學",
+    "文學",
+    "明",
+    "曆法",
+    "書冊",
+    "東南亞",
+    "東學",
+    "林業",
+    "果樹園藝",
+    "歐美",
+    "歷史",
+    "殖利",
+    "民亂",
+    "水利",
+    "水産業",
+    "水運",
+    "治安",
+    "法制",
+    "漁業",
+    "演劇",
+    "物價",
+    "物理",
+    "特殊敎育",
+    "特殊軍",
+    "特用作物",
+    "獸醫學",
+    "王室",
+    "琉球",
+    "生物",
+    "田制",
+    "田稅",
+    "畜産",
+    "社會紀綱",
+    "禁火",
+    "禮俗",
+    "禮服",
+    "私營手工",
+    "科學",
+    "移動",
+    "管理",
+    "經營形態",
+    "經筵",
+    "綱常",
+    "綿作",
+    "編史",
+    "美術",
+    "聚落",
+    "舞踊",
+    "藝術",
+    "藥學",
+    "行刑",
+    "行幸",
+    "行政",
+    "衣生活",
+    "裁判",
+    "裝身具",
+    "製鍊",
+    "西學",
+    "親族",
+    "語學",
+    "語文學",
+    "諫諍",
+    "變亂",
+    "財政",
+    "貢物",
+    "貨幣",
+    "貿易",
+    "賃貸",
+    "賃金",
+    "賜給",
+    "賤人",
+    "赴防",
+    "身分",
+    "身分變動",
+    "身良役賤",
+    "軍事",
+    "軍器",
+    "軍役",
+    "軍政",
+    "軍資",
+    "農作",
+    "農村手工",
+    "農業",
+    "農業技術",
+    "通信",
+    "進上",
+    "運賃",
+    "道敎",
+    "選拔",
+    "鄕村",
+    "酒類",
+    "醫學",
+    "醫藥",
+    "野",
+    "量田",
+    "金融",
+    "鑛山",
+    "鑛業",
+    "開墾",
+    "關防",
+    "陸運",
+    "雜稅",
+    "音樂",
+    "風俗",
+    "食生活",
+    "養蠶",
+    "馬政",
+    "鹽業"
+  ]
+}
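The config's generic `LABEL_0`…`LABEL_193` names line up positionally with this list, so the human-readable name for class `i` is `labels[i]`. A small sketch with a truncated three-entry stand-in for the real 194-entry file:

```python
import json

# label_map.json has the shape {"labels": [...]}; position i in the list
# names the class the config calls LABEL_i. Truncated stand-in data here,
# not the full file:
label_map = json.loads('{"labels": ["上供", "中人", "中央亞"]}')
id2name = {f"LABEL_{i}": name for i, name in enumerate(label_map["labels"])}
print(id2name["LABEL_2"])  # → 中央亞
```

In practice the same dictionary, built from the full file, converts the model's predicted label ids into the category names above.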
model.safetensors ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f4ccff14c7aa1f762b804988d54ab7a2382853e1a96d91cc78286d734554618d
+size 462160648
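The pointer above stores only the blob's SHA-256 (`oid`) and byte size; Git LFS fetches the actual ~462 MB weights separately. A downloaded file can be sanity-checked by rehashing it and comparing against the `oid`, as in this sketch (toy bytes standing in for the real file):

```python
import hashlib

# A Git LFS pointer's "oid sha256:<hex>" is simply the SHA-256 digest of
# the blob, so a fetched file can be verified by hashing it again.
def lfs_oid(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

blob = b"toy bytes standing in for model.safetensors"
print(lfs_oid(blob))
```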
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
用法示例.jpg ADDED

Git LFS Details

  • SHA256: 19e3e2a382dbf6b8a2c01933b28a3eae9ad5c63b907f41f47da7c2d97e91338d
  • Pointer size: 131 Bytes
  • Size of remote file: 221 kB