XiaoEnn committed on
Commit a0f25b3 · verified · 1 Parent(s): 8a67d41

Update README.md

Files changed (1):
  1. README.md +1 -14
README.md CHANGED
@@ -1,17 +1,4 @@
----
-tags:
-- pretrain_model
-- transformers
-- TCM
-- herberta
-license: apache-2.0
-inference: true
-language:
-- aa
-base_model:
-- hfl/chinese-roberta-wwm-ext
-library_name: transformers
----
+
 ### intrudcution
 Herberta Pretrain model experimental research model developed by the Angelpro Team, focused on Development of a pre-training model for herbal medicine.Based on the chinese-roberta-wwm-ext-large model, we do the MLM task to complete the pre-training model on the data of 675 ancient books and 32 Chinese medicine textbooks, which we named herberta, where we take the front and back words of herb and Roberta and splice them together. We are committed to make a contribution to the TCM big modeling industry.
 We hope it can be used:
### Introduction
Herberta is an experimental pretrained research model developed by the Angelpro Team, focused on building a pretraining model for herbal medicine. Starting from chinese-roberta-wwm-ext-large, we continued pretraining with the masked language modeling (MLM) objective on a corpus of 675 ancient books and 32 Chinese medicine textbooks. We named the result herberta, splicing together the words "herb" and "RoBERTa". We are committed to contributing to the TCM large-model industry.

We hope it can be used:
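Since the model card declares `library_name: transformers` and an MLM pretraining objective, the checkpoint should load like any RoBERTa-style encoder. A minimal sketch, assuming the Hub repo id is `XiaoEnn/herberta` (inferred from the commit author's namespace, not stated in the README) and that the pretrained MLM head is included in the checkpoint:

```python
# Usage sketch for the herberta checkpoint via Hugging Face transformers.
# Assumption: the repo id "XiaoEnn/herberta" is hypothetical; substitute the
# actual model page id if it differs.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "XiaoEnn/herberta"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill a masked token in a short TCM-style sentence.
text = "当归补血，黄[MASK]补气。"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Decode the highest-scoring token at the [MASK] position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted = tokenizer.decode(logits[0, mask_pos].argmax())
print(predicted)
```

The same checkpoint can also be used as a plain encoder (`AutoModel`) to produce sentence or token embeddings for downstream TCM classification and retrieval tasks.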