How to use GeneZC/bert-large-sst2 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("GeneZC/bert-large-sst2")
model = AutoModelForSequenceClassification.from_pretrained("GeneZC/bert-large-sst2")
```
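Once loaded, the model can classify a sentence end to end. A minimal inference sketch follows; the example sentence is made up, and the 0 = negative / 1 = positive mapping is the usual SST-2 convention, assumed here rather than stated on this card (check `model.config.id2label` to confirm).

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("GeneZC/bert-large-sst2")
model = AutoModelForSequenceClassification.from_pretrained("GeneZC/bert-large-sst2")
model.eval()

# Tokenize a single sentence and run a forward pass without gradients
inputs = tokenizer("A touching and wonderfully acted film.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Index of the highest-scoring class
pred = logits.argmax(dim=-1).item()
# Assumed SST-2 label convention: 0 = negative, 1 = positive
print("positive" if pred == 1 else "negative")
```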
`bert-large-uncased` fine-tuned on SST-2.

- Base model: `bert-large-uncased`
- Dataset: SST-2
- Fine-tuning hyperparameters: batch size 16, learning rate 1e-5
- Accuracy: 0.9392