ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding
More detail: https://arxiv.org/abs/2010.12148
| Model Name | Language | Model Structure |
|---|---|---|
| ernie-gram-chinese | Chinese | Layers: 12, Hidden: 768, Heads: 12 |
This PyTorch model was converted from the officially released PaddlePaddle ERNIE-Gram model, and a series of experiments was conducted to verify the accuracy of the conversion.
```python
from transformers import AutoTokenizer, AutoModel

# Load the converted ERNIE-Gram checkpoint from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("swtx/ernie-gram-chinese")
model = AutoModel.from_pretrained("swtx/ernie-gram-chinese")
```
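As a quick sanity check, the sketch below runs a forward pass on an example sentence (the sentence itself is arbitrary, chosen only for illustration) and prints the shape of the resulting hidden states, which should match the model structure listed above:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("swtx/ernie-gram-chinese")
model = AutoModel.from_pretrained("swtx/ernie-gram-chinese")

# Encode an arbitrary Chinese sentence and run a forward pass.
inputs = tokenizer("百度是一家高科技公司", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size=768).
print(outputs.last_hidden_state.shape)
```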