---
license: apache-2.0
datasets:
- glue
---
# Model Details
`bert-base-uncased` fine-tuned on `MRPC` (the Microsoft Research Paraphrase Corpus task from the GLUE benchmark), a binary sentence-pair classification task: predict whether two sentences are paraphrases.
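A minimal usage sketch with the `transformers` library. The model id `GeneZC/bert-base-mrpc` is assumed from the repository path and is not confirmed by this card; the example sentences are illustrative only.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed Hub id, inferred from the repo path; adjust if the checkpoint lives elsewhere.
model_id = "GeneZC/bert-base-mrpc"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# MRPC is a sentence-pair task: pass both sentences to the tokenizer together.
s1 = "The company said revenue rose 10% last quarter."
s2 = "Revenue at the firm increased by ten percent in the last quarter."
inputs = tokenizer(s1, s2, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits
probs = logits.softmax(dim=-1)  # two classes; index 1 is "paraphrase" in GLUE MRPC
print(probs)
```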
## Parameter settings
Fine-tuned with a batch size of 16 and a learning rate of 3e-5.
## Metrics
Accuracy: 0.8775, F1: 0.9120
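For reference, this is how the two reported metrics are defined for MRPC (accuracy over all pairs, F1 on the positive "paraphrase" class). A generic pure-Python sketch, not the exact evaluation script used for this card:

```python
def accuracy(preds, labels):
    # Fraction of predictions that match the gold labels.
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def f1(preds, labels, positive=1):
    # Harmonic mean of precision and recall for the positive (paraphrase) class.
    tp = sum(p == positive and l == positive for p, l in zip(preds, labels))
    fp = sum(p == positive and l != positive for p, l in zip(preds, labels))
    fn = sum(p != positive and l == positive for p, l in zip(preds, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Toy example with made-up predictions and labels:
preds = [1, 1, 0, 1]
labels = [1, 0, 0, 1]
print(accuracy(preds, labels))  # 0.75
print(f1(preds, labels))        # 0.8
```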