---
license: mit
datasets:
- PolyAI/banking77
language:
- en
base_model:
- google-bert/bert-base-uncased
metrics:
- f1
---
# BERT Fine-tuned on the Banking77 Intent Classification Dataset

This is a BERT model (`bert-base-uncased`) fine-tuned for intent classification on the [PolyAI/banking77](https://huggingface.co/datasets/PolyAI/banking77) dataset, which covers 77 fine-grained intents from online banking queries.
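A minimal inference sketch with `transformers`. The repo id below is a placeholder (this card does not state the published checkpoint id); loading the base model with a fresh 77-way classification head is shown only to illustrate the call pattern.

```python
# Sketch: loading a checkpoint for 77-way intent classification.
# MODEL_ID is a placeholder -- substitute the fine-tuned checkpoint id.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "bert-base-uncased"  # placeholder, not the fine-tuned weights
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=77)
model.eval()

# Tokenize with the same truncation/padding policy used in training (MAX_LEN = 128).
inputs = tokenizer(
    "I want to report a lost card",
    truncation=True,
    max_length=128,
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, 77)
predicted_id = logits.argmax(dim=-1).item()
```

With the actual fine-tuned weights, `predicted_id` maps to one of the 77 Banking77 intent labels via `model.config.id2label`.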
## Training Configuration
- PRETRAINED_MODEL_NAME = "bert-base-uncased"
- BATCH_SIZE = 128
- LR_PRETRAIN = 2e-5
- EPOCHS_PRETRAIN = 20
- DEVICE = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
- MAX_LEN = 128 (sequences longer than 128 tokens are truncated; shorter ones are padded to 128)
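The `MAX_LEN` rule above can be sketched in plain Python (in practice the tokenizer handles this via `truncation=True, padding="max_length"`; pad id 0 is assumed here, matching BERT's `[PAD]` token):

```python
# Illustration of the MAX_LEN = 128 policy: truncate long sequences,
# pad short ones so every example has exactly 128 token ids.
MAX_LEN = 128
PAD_ID = 0  # assumption: BERT's [PAD] token id

def pad_or_truncate(token_ids, max_len=MAX_LEN, pad_id=PAD_ID):
    if len(token_ids) >= max_len:
        return token_ids[:max_len]          # truncate down to max_len
    return token_ids + [pad_id] * (max_len - len(token_ids))  # pad up

short = pad_or_truncate([101, 2054, 2003, 102])  # a short query
long = pad_or_truncate(list(range(300)))         # an over-long sequence
```

Both results have length 128, so batches of size `BATCH_SIZE = 128` stack into fixed-shape tensors.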