---
license: mit
datasets:
- PolyAI/banking77
language:
- en
base_model:
- google-bert/bert-base-uncased
metrics:
- f1
---
# BERT fine-tuned on the Banking77 Open Intent Classification Dataset
This is a BERT model fine-tuned on the [PolyAI/banking77](https://huggingface.co/datasets/PolyAI/banking77) dataset.
## Training Configuration
* PRETRAINED_MODEL_NAME = "bert-base-uncased"
* BATCH_SIZE = 128
* LR_PRETRAIN = 2e-5
* EPOCHS_PRETRAIN = 20
* DEVICE = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
* MAX_LEN = 128: sequences longer than 128 tokens are truncated, and shorter sequences are padded up to 128 tokens
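The MAX_LEN = 128 behavior above can be sketched as a simple truncate-or-pad step. This is only an illustration of the effect; in an actual training pipeline the Hugging Face tokenizer would typically handle it via `truncation=True, padding="max_length"`:

```python
# Minimal sketch of the MAX_LEN truncate-or-pad behavior (illustration only;
# a Hugging Face tokenizer normally does this via truncation/padding options).
def truncate_or_pad(token_ids, max_len=128, pad_id=0):
    """Truncate token_ids to max_len, or right-pad with pad_id up to max_len."""
    clipped = token_ids[:max_len]
    return clipped + [pad_id] * (max_len - len(clipped))

short_seq = truncate_or_pad([101, 2023, 102], max_len=8)   # padded to length 8
long_seq = truncate_or_pad(list(range(300)), max_len=8)    # truncated to length 8
```

Every batch element therefore has the same fixed length, which is what allows the inputs to be stacked into a single tensor.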