# Bert Online Discussions (bert-web-discussions-en)

This model is a fine-tuned version of the [BERT base model](https://huggingface.co/bert-base-uncased). It was introduced in [this paper](https://aclanthology.org/2022.acl-long.379/).

## Model description

The BERT base language model was fine-tuned on the [Webis-CMV-20 corpus](https://zenodo.org/record/3778298#.YxB-HC223RZ) and the [args.me corpus](https://zenodo.org/record/3734893#.YxB-NC223RY). The model was trained on a sample of 2,469,026 sentences in total.
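
As a BERT-style masked language model, the checkpoint can be loaded with the standard `transformers` fill-mask pipeline. The card does not state the exact Hugging Face Hub ID for this model, so the sketch below uses the base model `bert-base-uncased` as a stand-in; substitute the actual repo path of the fine-tuned checkpoint when using it.

```python
from transformers import pipeline

# Stand-in model ID — replace with the hub path of the
# fine-tuned bert-web-discussions-en checkpoint.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The pipeline returns the top predictions for the [MASK] token,
# each with a token string and a softmax score.
preds = fill_mask("Online debates often change people's [MASK].")
for p in preds:
    print(p["token_str"], round(p["score"], 3))
```

By default the pipeline returns the five highest-scoring completions; pass `top_k` to change that.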