XiaoEnn committed · verified
Commit fb3ff3e · 1 Parent(s): 696b48b

Update README.md

Files changed (1)
  1. README.md +9 -3
README.md CHANGED
@@ -1,3 +1,9 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ ---
+ ### Introduction
+ Herberta is an experimental research model developed by the Angelpro Team, focused on pre-training for the herbal medicine domain. Starting from chinese-roberta-wwm-ext-large, we continued pre-training with the MLM objective on a corpus of 675 ancient books and 32 Chinese medicine textbooks. The name "herberta" splices together "herb" and "RoBERTa". We are committed to contributing to large-model research for Traditional Chinese Medicine (TCM).
+ We hope it can be used as:
+ - An encoder for herbal formulas (an embedding model)
+ - A word embedding model for Chinese medicine domain data
+ - A backbone for a wide range of downstream TCM tasks, e.g., classification and sequence labeling
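Since the README describes using herberta as an encoder/embedding model, a minimal sentence-encoding sketch may help. This assumes the checkpoint is published on the Hub under a repository id such as `XiaoEnn/herberta` (hypothetical — substitute the actual model id), and uses standard mean pooling over the last hidden states, which is one common choice rather than the team's documented recipe.

```python
# Sketch: encoding TCM texts with herberta via Hugging Face Transformers.
# The model id below is an assumption -- replace it with the real repository.
import torch


def mean_pool(last_hidden_state: torch.Tensor,
              attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()          # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(dim=1)       # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)             # avoid divide-by-zero
    return summed / counts


def encode(texts, model_name="XiaoEnn/herberta"):
    """Return one embedding vector per input text."""
    from transformers import AutoModel, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    model.eval()
    batch = tokenizer(texts, padding=True, truncation=True,
                      return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    return mean_pool(out.last_hidden_state, batch["attention_mask"])
```

The resulting vectors can then feed downstream TCM tasks, e.g., cosine-similarity retrieval of herbal formulas or as features for a classifier.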