How to use msintaha/bert-base-uncased-copa-kb-27 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMultipleChoice

tokenizer = AutoTokenizer.from_pretrained("msintaha/bert-base-uncased-copa-kb-27")
model = AutoModelForMultipleChoice.from_pretrained("msintaha/bert-base-uncased-copa-kb-27")
```

This model is a fine-tuned version of bert-base-uncased on the super_glue dataset. It achieves the following results on the evaluation set:

- Validation Loss: 0.6114
- Accuracy: 0.7100
Model description: More information needed.
Intended uses & limitations: More information needed.
Training and evaluation data: More information needed.
Training results per epoch:
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| No log | 1.0 | 40 | 0.6534 | 0.7400 |
| No log | 2.0 | 80 | 0.6114 | 0.7100 |
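For context on how a COPA multiple-choice model like this one is queried: each COPA example provides a premise, a question type ("cause" or "effect"), and two candidate choices, and the multiple-choice head scores each (premise, choice) pair. Below is a minimal sketch of the input pairing, assuming the common COPA connective convention ("cause" → "because", "effect" → "so"); the helper name and example sentences are illustrative, not taken from this model's training script.

```python
# Sketch of COPA-style input pairing for a multiple-choice model.
# The connective mapping ("cause" -> "because", "effect" -> "so") is the
# common COPA preprocessing convention, assumed here rather than confirmed
# by this model card.

def build_choice_inputs(premise, choices, question):
    """Return one (premise-with-connective, choice) text pair per candidate."""
    connective = "because" if question == "cause" else "so"
    stem = premise.rstrip(".")
    return [(f"{stem} {connective}", choice) for choice in choices]

pairs = build_choice_inputs(
    "The man turned on the faucet.",
    ["The toilet filled with water.", "Water flowed from the spout."],
    question="effect",
)
for pair in pairs:
    print(pair)
```

Each pair would then be tokenized jointly (e.g. `tokenizer([p for p, _ in pairs], [c for _, c in pairs], padding=True, return_tensors="pt")`), reshaped to `(1, num_choices, seq_len)` before calling the model, and the argmax over the two logits selects the answer.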