
BERT Mean Pooling Classifier for RTE

This model is a fine-tuned BERT model that uses mean pooling instead of the CLS token for the Recognizing Textual Entailment (RTE) task.

Model description

Instead of the conventional approach of taking the CLS token representation, this model computes the mean of all token representations in the sequence, using the attention mask to exclude padding positions from the average.
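The masked mean pooling described above can be sketched as follows in PyTorch. This is a minimal illustration of the technique, not the model's exact code; the function name `mean_pool` is an assumption.

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    # Expand the mask to (batch, seq_len, 1) so it broadcasts over hidden dims
    mask = attention_mask.unsqueeze(-1).float()
    # Zero out padded positions, then sum over the sequence dimension
    summed = (last_hidden_state * mask).sum(dim=1)
    # Divide by the number of real (non-padding) tokens per example
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts

# Toy example: batch of 2, seq_len 4, hidden size 8
hidden = torch.randn(2, 4, 8)
mask = torch.tensor([[1, 1, 1, 0],
                     [1, 1, 0, 0]])   # 1 = real token, 0 = padding
pooled = mean_pool(hidden, mask)
print(pooled.shape)  # torch.Size([2, 8])
```

Because padded positions are zeroed before the sum and the divisor counts only real tokens, the result equals the plain mean over the unpadded tokens of each sequence.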

Hyperparameters

  • Learning rate: 1.1056980156506841e-05
  • Number of epochs: 4
  • Max sequence length: 64
  • Dropout rate: 0.2
  • Batch size: 32
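For reference, the hyperparameters above might map onto a Hugging Face `transformers` fine-tuning setup roughly as sketched below. This is a hedged illustration, not the actual training script: the output directory name is an assumption, and the max sequence length and dropout are applied elsewhere (at tokenization time and in the model config, respectively), as noted in the comments.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-mean-pool-rte",      # hypothetical name
    learning_rate=1.1056980156506841e-05,
    num_train_epochs=4,
    per_device_train_batch_size=32,
)
# Max sequence length (64) would be applied at tokenization time, e.g.
#   tokenizer(premise, hypothesis, truncation=True, max_length=64)
# and the 0.2 dropout via the model config before loading the model, e.g.
#   config.classifier_dropout = 0.2
```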

Performance

  • Accuracy: 0.5596
  • Macro F1: 0.5516
  • Precision: 0.5564
  • Recall: 0.5540