---
license: apache-2.0
datasets:
  - glue
---

# bert-large-mrpc

## Model Details

`bert-large-uncased` fine-tuned on MRPC (the Microsoft Research Paraphrase Corpus, part of the GLUE benchmark).
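A minimal usage sketch with the `transformers` pipeline API. The model id `GeneZC/bert-large-mrpc` is inferred from this repository and the example sentences are invented for illustration:

```python
from transformers import pipeline

# Model id inferred from this repo; adjust if it differs.
clf = pipeline("text-classification", model="GeneZC/bert-large-mrpc")

# MRPC is a sentence-pair task: is sentence B a paraphrase of sentence A?
result = clf({
    "text": "The company said the deal was completed on Friday.",
    "text_pair": "The deal closed on Friday, the company said.",
})
print(result)
```

The pipeline returns a dict with a `label` and a confidence `score` for the pair.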

## Parameter settings

Batch size: 16, learning rate: 3e-5.

## Metrics

Accuracy: 0.8922, F1: 0.9225
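For reference, MRPC accuracy and F1 are computed over binary paraphrase labels (F1 on the positive "paraphrase" class). A self-contained sketch on toy predictions; the numbers below are illustrative, not this model's outputs:

```python
def accuracy(preds, labels):
    """Fraction of predictions that match the gold labels."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def f1(preds, labels, positive=1):
    """Harmonic mean of precision and recall for the positive class."""
    tp = sum(p == positive and y == positive for p, y in zip(preds, labels))
    fp = sum(p == positive and y != positive for p, y in zip(preds, labels))
    fn = sum(p != positive and y == positive for p, y in zip(preds, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Toy example: 1 = paraphrase, 0 = not a paraphrase.
preds = [1, 1, 0, 1]
labels = [1, 0, 0, 1]
print(accuracy(preds, labels))  # 0.75
print(f1(preds, labels))        # ≈ 0.8
```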