Regression_electra_2

This model is a fine-tuned version of google/electra-small-generator on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 5.2032
  • MSE: 5.2032
  • MAE: 1.8032
  • R2: -1.6847
  • Accuracy: 0.0
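Note that the reported Loss and MSE are identical, which suggests a mean-squared-error training objective, and the negative R2 indicates the model performs worse on the evaluation set than a constant predictor that always outputs the mean target. As a minimal sketch (using NumPy and hypothetical values, not the actual evaluation data, which is not published with this card), these metrics can be computed as:

```python
import numpy as np

# Hypothetical targets and predictions, for illustration only.
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([3.0, 3.0, 3.0, 3.0])

mse = np.mean((y_true - y_pred) ** 2)      # mean squared error
mae = np.mean(np.abs(y_true - y_pred))     # mean absolute error
# R2 = 1 - SS_res / SS_tot; it goes negative when the model's squared
# error exceeds that of simply predicting the mean of the targets.
r2 = 1 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - y_true.mean()) ** 2)
```

Here `mse` is 1.5, `mae` is 1.0, and `r2` is -0.2 because the constant prediction of 3.0 does slightly worse than predicting the target mean of 2.5.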

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20

Training results

| Training Loss | Epoch | Step | Validation Loss | MSE    | MAE    | R2      | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:-------:|:--------:|
| No log        | 1.0   | 1    | 4.3426          | 4.3426 | 1.8114 | -3.0746 | 0.1429   |
| No log        | 2.0   | 2    | 4.2398          | 4.2398 | 1.7830 | -2.9781 | 0.1429   |
| No log        | 3.0   | 3    | 4.1436          | 4.1436 | 1.7561 | -2.8878 | 0.1429   |
| No log        | 4.0   | 4    | 4.0529          | 4.0529 | 1.7305 | -2.8028 | 0.2857   |
| No log        | 5.0   | 5    | 3.9672          | 3.9672 | 1.7059 | -2.7223 | 0.2857   |
| No log        | 6.0   | 6    | 3.8864          | 3.8864 | 1.6825 | -2.6465 | 0.2857   |
| No log        | 7.0   | 7    | 3.8097          | 3.8097 | 1.6600 | -2.5746 | 0.2857   |
| No log        | 8.0   | 8    | 3.7376          | 3.7376 | 1.6386 | -2.5070 | 0.2857   |
| No log        | 9.0   | 9    | 3.6706          | 3.6706 | 1.6184 | -2.4441 | 0.2857   |
| No log        | 10.0  | 10   | 3.6084          | 3.6084 | 1.5995 | -2.3857 | 0.2857   |
| No log        | 11.0  | 11   | 3.5517          | 3.5517 | 1.5820 | -2.3325 | 0.2857   |
| No log        | 12.0  | 12   | 3.5000          | 3.5000 | 1.5659 | -2.2840 | 0.2857   |
| No log        | 13.0  | 13   | 3.4536          | 3.4536 | 1.5513 | -2.2405 | 0.2857   |
| No log        | 14.0  | 14   | 3.4127          | 3.4127 | 1.5382 | -2.2020 | 0.2857   |
| No log        | 15.0  | 15   | 3.3771          | 3.3771 | 1.5268 | -2.1687 | 0.2857   |
| No log        | 16.0  | 16   | 3.3474          | 3.3474 | 1.5171 | -2.1408 | 0.2857   |
| No log        | 17.0  | 17   | 3.3234          | 3.3234 | 1.5093 | -2.1183 | 0.2857   |
| No log        | 18.0  | 18   | 3.3052          | 3.3052 | 1.5034 | -2.1012 | 0.2857   |
| No log        | 19.0  | 19   | 3.2930          | 3.2930 | 1.4994 | -2.0898 | 0.2857   |
| No log        | 20.0  | 20   | 3.2869          | 3.2869 | 1.4974 | -2.0840 | 0.2857   |

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1+cu116
  • Datasets 2.9.0
  • Tokenizers 0.13.2