---
language:
- zh
license: "apache-2.0"
---

## Chinese-MobileBERT
> The original [Chinese-MobileBERT](https://github.com/ymcui/Chinese-MobileBERT) repository does not provide PyTorch weights; the weights here were converted via the [model_convert](https://github.com/CycloneBoy/model_convert) repository.

This repository is developed based on https://github.com/ymcui/Chinese-MobileBERT.

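Since the converted weights target the Hugging Face Transformers format, a minimal loading sketch follows. The checkpoint path below is a placeholder (the actual Hub repository name is not stated here), and the vocabulary size of 21128 is assumed from the standard Chinese BERT vocabulary; both are illustrative, not confirmed by this card.

```python
# Sketch: working with MobileBERT in Hugging Face Transformers.
# The two-layer config is for a quick offline demo; the real
# MobileBERT architecture uses 24 layers.
from transformers import MobileBertConfig, MobileBertModel

# Offline illustration: instantiate the architecture directly.
# vocab_size=21128 is the standard Chinese BERT vocabulary (assumption).
config = MobileBertConfig(vocab_size=21128, num_hidden_layers=2)
model = MobileBertModel(config)
print(model.config.hidden_size)  # MobileBERT's default hidden size is 512

# With the converted weights (placeholder path — substitute the real
# local directory or Hub repository ID):
# model = MobileBertModel.from_pretrained("path/to/chinese-mobilebert")
```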
You may also be interested in:
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge distillation toolkit TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## Citation
If you find this technical report or resource useful, please cite the following technical report in your paper.

```
@misc{cui-2022-chinese-mobilebert,
  title={Chinese MobileBERT},
  author={Cui, Yiming},
  howpublished={\url{https://github.com/ymcui/Chinese-MobileBERT}},
  year={2022}
}
```