---
license: mit
arxiv: 2302.04026
pipeline_tag: fill-mask
tags:
- code
---
# C-BERT MLM

## Exploring Software Naturalness through Neural Language Models

## Overview
This is an unofficial HuggingFace port of "[C-BERT](http://arxiv.org/abs/2302.04026)" with only the masked language modeling head used for pretraining. The weights come from "[An Empirical Comparison of Pre-Trained Models of Source Code](http://arxiv.org/abs/2302.04026)". Please cite the original authors if you use this model in academic work.
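Since the model ships with a masked language modeling head, it can be queried through the standard `transformers` fill-mask pipeline. A minimal sketch, assuming a hypothetical hub id `user/c-bert-mlm` (substitute the actual repository path of this model) and a BERT-style `[MASK]` token:

```python
from transformers import pipeline

# "user/c-bert-mlm" is a hypothetical hub id -- replace it with the
# actual HuggingFace Hub path of this checkpoint.
fill_mask = pipeline("fill-mask", model="user/c-bert-mlm")

# C-BERT is pretrained on C source code, so mask a token in a C snippet.
# "[MASK]" assumes a BERT-style tokenizer; check tokenizer.mask_token.
predictions = fill_mask("int main() { [MASK] 0; }")
for p in predictions:
    print(p["token_str"], p["score"])
```

Each prediction is a dict with the candidate token (`token_str`), its probability (`score`), and the filled-in sequence.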