How to use GeneZC/bert-base-cola with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("GeneZC/bert-base-cola")
model = AutoModelForSequenceClassification.from_pretrained("GeneZC/bert-base-cola")
```
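Once the model is loaded, its logits need to be turned into a label. The helpers below are a minimal sketch of that step, assuming the checkpoint is a standard two-class sequence classifier where label 0 means "unacceptable" and label 1 means "acceptable" (this label mapping is an assumption, not stated by the model card). The `predict` function shows the full call pattern but is only a sketch and is not executed here.

```python
import math

def softmax(logits):
    """Convert a list of logits to probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits, labels=("unacceptable", "acceptable")):
    """Return (label, probability) for the highest-scoring class.

    The label names are an assumed mapping for CoLA, not confirmed
    by the model card.
    """
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

def predict(model, tokenizer, sentence):
    """Sketch of end-to-end inference (requires torch; not run here)."""
    import torch
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    return classify(logits)
```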
`bert-base-uncased` fine-tuned on CoLA (Corpus of Linguistic Acceptability).
Fine-tuning hyperparameters: batch size 32, learning rate 2e-5.
Matthews correlation (`matthews_corr`): 0.6295
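For reference, the Matthews correlation coefficient reported above is computed from the binary confusion counts. A pure-Python sketch (the function name `matthews_corr` mirrors the metric key; the zero-denominator convention of returning 0.0 matches common evaluation libraries):

```python
import math

def matthews_corr(y_true, y_pred):
    """Matthews correlation coefficient for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        return 0.0  # convention: undefined MCC treated as 0
    return (tp * tn - fp * fn) / denom
```

A score of 1.0 means perfect agreement, 0.0 is no better than chance, and -1.0 is total disagreement, so 0.6295 indicates a strong but imperfect correlation with the gold acceptability labels.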