How to use GeneZC/bert-base-mrpc with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("GeneZC/bert-base-mrpc")
model = AutoModelForSequenceClassification.from_pretrained("GeneZC/bert-base-mrpc")
```
bert-base-uncased fine-tuned on MRPC (the Microsoft Research Paraphrase Corpus from the GLUE benchmark).
Training hyperparameters: batch size 16, learning rate 3e-5.

Evaluation results: accuracy 0.8775, F1 0.9120.
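Since MRPC is a sentence-pair paraphrase-detection task, inference means encoding two sentences together and reading off a binary label. The sketch below assumes the checkpoint loads with `AutoModelForSequenceClassification` and follows the GLUE MRPC label convention (index 1 = "paraphrase"); the helper name and example sentences are illustrative, not from the model card.

```python
# Sketch of paraphrase inference with this checkpoint (see assumptions above).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification


def label_from_logits(logits: torch.Tensor) -> str:
    # GLUE MRPC convention (assumption): label index 1 = "paraphrase".
    return "paraphrase" if int(logits.argmax(dim=-1)) == 1 else "not paraphrase"


if __name__ == "__main__":
    # Downloads the checkpoint from the Hugging Face Hub on first run.
    tokenizer = AutoTokenizer.from_pretrained("GeneZC/bert-base-mrpc")
    model = AutoModelForSequenceClassification.from_pretrained("GeneZC/bert-base-mrpc")
    model.eval()

    # MRPC is a sentence-pair task: pass both sentences so the tokenizer
    # joins them with [SEP] and sets token_type_ids accordingly.
    inputs = tokenizer(
        "The company said quarterly profits rose.",
        "Quarterly profits rose, the company said.",
        return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    print(label_from_logits(logits))
```

Passing the two sentences as separate arguments (rather than concatenating them yourself) is what produces the segment embeddings BERT was pre-trained with.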