---
license: cc-by-nc-sa-4.0
---
# Multi-label Classification Inference Model for the Ming and Qing Veritable Records

This model performs multi-label classification inference on texts of the *Veritable Records of the Ming Dynasty* and the *Veritable Records of the Qing Dynasty*. It builds on [Jihuai/bert-ancient-chinese](https://huggingface.co/Jihuai/bert-ancient-chinese): the base model was further pretrained on public corpora to obtain the Shilu-oriented pretrained model [shiluBERT](https://huggingface.co/bztxb/shiluBERT), which was then fine-tuned for this task.
### Model and Data Sources
- Training data source: [*Veritable Records of the Joseon Dynasty*](https://sillok.history.go.kr).
- Task type: multi-label text classification.
- Number of training samples: approximately 270,000.
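In a multi-label setup, each record is annotated with a *set* of category labels, which is encoded as a binary indicator vector for training. A minimal sketch of this encoding with scikit-learn's `MultiLabelBinarizer` (the label names below are hypothetical, not this model's actual label inventory):

```python
from sklearn.preprocessing import MultiLabelBinarizer

# Hypothetical label sets for three records; the real label inventory
# of this model is not listed in this card.
samples = [{"military", "ritual"}, {"ritual"}, {"astronomy", "military"}]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(samples)  # shape (n_samples, n_labels), 0/1 entries

print(list(mlb.classes_))  # label names, sorted alphabetically
print(Y)                   # one indicator row per record
```

Each column of `Y` then becomes one independent binary target for the classifier head.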
### Evaluation Metrics
| Metric | Value |
|---|---|
| Sample F1 | 0.7209 |
| Sample Precision | 0.7527 |
| Sample Recall | 0.7306 |
| LRAP | 0.8048 |
| Hamming Loss | 0.0070 |
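These metrics can be reproduced with scikit-learn. Sample-averaged precision/recall/F1 score each example's predicted label set against its true set and then average; Hamming loss is the fraction of wrongly predicted label bits; LRAP scores how well the predicted probabilities rank true labels above false ones. A minimal sketch on a toy matrix (the numbers below are illustrative, not this model's evaluation data):

```python
import numpy as np
from sklearn.metrics import (f1_score, hamming_loss,
                             label_ranking_average_precision_score)

# Toy ground truth and predictions over 3 labels (illustrative only).
y_true = np.array([[1, 0, 1],
                   [0, 1, 0]])
y_prob = np.array([[0.9, 0.2, 0.4],   # per-label sigmoid outputs
                   [0.1, 0.8, 0.3]])
y_pred = (y_prob >= 0.5).astype(int)  # threshold at an assumed 0.5

sample_f1 = f1_score(y_true, y_pred, average="samples")  # per-example F1, averaged
hl = hamming_loss(y_true, y_pred)                        # wrong bits / total bits
lrap = label_ranking_average_precision_score(y_true, y_prob)  # ranking quality
```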
### Example Usage
- Try the online Space: [bztxb/shiluInfer](https://huggingface.co/spaces/bztxb/shiluInfer)
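For programmatic use, a multi-label BERT classifier applies a sigmoid (not a softmax) to the logits and thresholds each label independently. A minimal sketch with the `transformers` library, assuming the fine-tuned checkpoint follows the standard `AutoModelForSequenceClassification` layout; the model id is a placeholder (this card does not state the repo id) and the 0.5 threshold is an assumed default:

```python
import torch


def logits_to_labels(logits: torch.Tensor, id2label: dict, threshold: float = 0.5):
    """Multi-label decision: sigmoid each logit, keep labels above the threshold."""
    probs = torch.sigmoid(logits)
    return [id2label[i] for i, p in enumerate(probs.tolist()) if p >= threshold]


def classify(text: str, model_id: str = "YOUR_FINETUNED_MODEL_ID",
             threshold: float = 0.5):
    # model_id is a placeholder -- substitute the actual fine-tuned repo id.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits.squeeze(0)
    return logits_to_labels(logits, model.config.id2label, threshold)
```

`logits_to_labels` is the multi-label-specific step: unlike single-label classification, several (or zero) labels can pass the threshold for one text.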
