# BERT-based-MRC

This model is a fine-tuned version of bert-base-chinese on the BD-11 dataset for Chinese machine reading comprehension (MRC).

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 18
  • optimizer: Adam with epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3
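The `lr_scheduler_type: linear` setting above means the learning rate decays linearly from its initial value to zero over the course of training. A minimal sketch of that schedule in plain Python (assuming no warmup steps, which the card does not list):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    remaining = max(0.0, 1.0 - step / total_steps)
    return base_lr * remaining
```

With `num_epochs: 3` and `train_batch_size: 4`, `total_steps` would be `3 * ceil(len(train_set) / 4)`.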

### Training results

Evaluated on the BD-11 test set:

```json
{
  "exact": 87.70,
  "f1": 89.32,
  "total": 1528,
  "HasAns_exact": 83.94,
  "HasAns_f1": 86.24,
  "HasAns_total": 1077,
  "NoAns_exact": 96.67,
  "NoAns_f1": 96.67,
  "NoAns_total": 451
}
```

### Framework versions

  • Transformers 4.25.1
  • Pytorch 1.6.0
  • Datasets 2.7.1
  • Tokenizers 0.11.6