---
license: apache-2.0
datasets:
  - glue
---

## Model Details

`bert-base-uncased` fine-tuned on CoLA (the Corpus of Linguistic Acceptability, part of the GLUE benchmark).
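A minimal sketch of loading this checkpoint for acceptability classification with the `transformers` library. The repo id `GeneZC/bert-base-cola` is an assumption based on where this card is hosted; adjust it if the model lives elsewhere.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id assumed from this model card's location.
model_id = "GeneZC/bert-base-cola"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# CoLA is a binary task: the model scores whether a sentence is
# linguistically acceptable (label 1) or not (label 0).
inputs = tokenizer("The book was read by the student.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
```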

## Parameter settings

- batch size: 32
- learning rate: 2e-5

## Metrics

Matthews correlation (`matthews_corr`), the standard CoLA metric: 0.6295
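The Matthews correlation coefficient is computed from binary confusion-matrix counts; a self-contained sketch (the counts below are illustrative only, not this model's actual CoLA predictions):

```python
import math

def matthews_corr(tp: int, tn: int, fp: int, fn: int) -> float:
    """Matthews correlation coefficient from confusion-matrix counts."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Conventionally 0 when any marginal count is zero.
    return num / den if den else 0.0

# Perfect agreement gives +1, perfect disagreement gives -1.
print(matthews_corr(tp=1, tn=1, fp=0, fn=0))   # → 1.0
print(matthews_corr(tp=0, tn=0, fp=1, fn=1))   # → -1.0
```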