Commit 101dcd8
Parent(s): b203abe
Update README.md

README.md CHANGED
@@ -1,3 +1,8 @@
 ---
 license: afl-3.0
 ---
+The RoBERTa-base-ch model is the Chinese base version of RoBERTa-wwm-ext, open-sourced by the HIT and iFLYTEK Joint Lab (HFL).
+The RoBERTa-wwm-ext Chinese model is pre-trained from RoBERTa using the whole word masking (wwm) method, proposed by Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, and Ziqing Yang.
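Whole word masking, which distinguishes RoBERTa-wwm-ext from ordinary token-level masking, can be illustrated with a minimal sketch. The helper name, token lists, and masking ratio below are illustrative assumptions, not the authors' actual pre-training code:

```python
import random

# Sketch of whole word masking (wwm): when a word is selected for masking,
# every sub-token of that word is replaced with [MASK] together, instead of
# masking sub-tokens independently as in the original BERT objective.

def whole_word_mask(word_pieces, mask_ratio=0.15, rng=None):
    """word_pieces: list of words, each given as a list of sub-tokens."""
    rng = rng or random.Random(0)
    n_words = max(1, round(len(word_pieces) * mask_ratio))
    chosen = set(rng.sample(range(len(word_pieces)), n_words))
    out = []
    for i, pieces in enumerate(word_pieces):
        if i in chosen:
            out.extend(["[MASK]"] * len(pieces))  # mask the whole word
        else:
            out.extend(pieces)
    return out

# "哈尔滨" is segmented as one word of three characters, so under wwm its
# three sub-tokens are always masked (or kept) as a unit.
tokens = whole_word_mask([["我"], ["爱"], ["哈", "尔", "滨"]], mask_ratio=0.34)
```

For Chinese, word boundaries come from a segmenter rather than whitespace, which is why wwm matters: a multi-character word is never left partially masked.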