---
license: mit
arxiv: 2302.04026
pipeline_tag: fill-mask
tags:
- code
---

# C-BERT MLM
## Exploring Software Naturalness through Neural Language Models

## Overview
This model is an unofficial HuggingFace port of "[C-BERT](http://arxiv.org/abs/2006.12641)" with only the masked language modeling head used for pretraining. The weights come from "[An Empirical Comparison of Pre-Trained Models of Source Code](http://arxiv.org/abs/2302.04026)". Please cite the authors if you use this model in academic work.
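
## Usage

Since the model ships with a masked language modeling head and the `fill-mask` pipeline tag, it can be loaded with the standard `transformers` API. The repository id below is a placeholder, and the mask token and example snippet are illustrative only; substitute the actual repo name and check the tokenizer's mask token before running.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Placeholder repo id -- replace with the actual HuggingFace repository name.
repo_id = "user/c-bert-mlm"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMaskedLM.from_pretrained(repo_id)

# Fill-mask pipeline: predicts the most likely tokens for the mask position.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Example C snippet with one masked token (mask token may differ per tokenizer).
for prediction in fill(f"int main() {{ {tokenizer.mask_token} 0; }}"):
    print(prediction["token_str"], prediction["score"])
```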