How to use GeneZC/bert-large-mrpc with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("GeneZC/bert-large-mrpc")
model = AutoModelForSequenceClassification.from_pretrained("GeneZC/bert-large-mrpc")
```
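Once loaded, the checkpoint can be run on a sentence pair, since MRPC is a paraphrase-detection task. A minimal inference sketch, assuming the checkpoint exposes a standard two-label sequence-classification head (the example sentences are made up for illustration):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "GeneZC/bert-large-mrpc"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# MRPC is a sentence-pair task: is sentence_b a paraphrase of sentence_a?
sentence_a = "The company said quarterly profit rose 10 percent."
sentence_b = "Quarterly profit at the company increased by 10 percent, it said."

inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# GLUE MRPC convention: label 0 = not a paraphrase, label 1 = paraphrase
probs = torch.softmax(logits, dim=-1)
print(probs)
```

The tokenizer handles the pair encoding (inserting `[SEP]` and setting token type ids), so the two sentences are passed as separate positional arguments rather than concatenated by hand.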
license: apache-2.0
datasets:
- glue

# Model Details

`bert-large-uncased` finetuned on `MRPC`.
## Parameter settings

Batch size is 16; learning rate is 3e-5.
## Metrics

Accuracy: 0.8922, F1: 0.9225
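For reference, the two reported numbers are the standard GLUE MRPC metrics: accuracy over all pairs and F1 on the paraphrase class. A self-contained sketch of how they are computed from predicted and gold labels (the toy data below is illustrative, not the actual MRPC results):

```python
def accuracy(preds, labels):
    # Fraction of examples where the predicted label matches the gold label.
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def f1_score(preds, labels, positive=1):
    # Harmonic mean of precision and recall for the positive (paraphrase) class.
    tp = sum(p == positive and l == positive for p, l in zip(preds, labels))
    fp = sum(p == positive and l != positive for p, l in zip(preds, labels))
    fn = sum(p != positive and l == positive for p, l in zip(preds, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

preds = [1, 1, 0, 1, 0, 1]
labels = [1, 0, 0, 1, 1, 1]
print(accuracy(preds, labels))   # 4 of 6 correct
print(f1_score(preds, labels))   # → 0.75
```

F1 exceeding accuracy, as on this card, is typical for MRPC because roughly two-thirds of its pairs are positive, so the positive class dominates both precision and recall.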