|
---
tags:
- Pretrain_Model
- transformers
- TCM
- herberta
- text embedding
license: apache-2.0
inference: true
language:
- zh
- en
base_model:
- hfl/chinese-roberta-wwm-ext
library_name: transformers
---

### Introduction

Herberta is an experimental research model developed by the Angelpro Team, focused on building a pre-training model for herbal medicine. Starting from the chinese-roberta-wwm-ext-large model, we continued pre-training with the MLM task on a corpus of 675 ancient books and 32 Chinese medicine textbooks. We named the result Herberta, splicing together the words "herb" and "RoBERTa". We are committed to contributing to the TCM large language model industry.
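Since the model card tags Herberta for text embedding via `transformers`, the sketch below shows one common way to get sentence embeddings from a RoBERTa-style encoder: mean pooling over the last hidden states. The model id here is the base model `hfl/chinese-roberta-wwm-ext` as a stand-in; substitute the actual Herberta repository name, which is not given in this excerpt.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Stand-in model id: replace with the Herberta repo name on the Hugging Face Hub.
MODEL_NAME = "hfl/chinese-roberta-wwm-ext"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def embed(texts):
    """Encode a list of texts into fixed-size vectors via mean pooling."""
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Mean-pool the token embeddings, masking out padding positions.
    mask = inputs["attention_mask"].unsqueeze(-1).float()
    summed = (outputs.last_hidden_state * mask).sum(dim=1)
    return summed / mask.sum(dim=1)

vectors = embed(["人参补气", "当归补血"])
print(vectors.shape)
```

Mean pooling is only one pooling choice; the `[CLS]` vector (`outputs.last_hidden_state[:, 0]`) is a common alternative for RoBERTa-style encoders.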