Test-Model

This model is a fine-tuned version of meta-llama/Llama-3.1-8B specialized for document classification. It achieves the following results on the evaluation set:

  • Accuracy: 0.924647
  • Hit rate: 0.977464

Training Parameters

  • Epochs: 5
  • Batch Size: 20
  • Threshold: 0.975
  • Token Length: 512
  • Data Set Size: 10000
  • Evenly Distributed: True
  • Municipalities: Heroey, Kongsberg, Boemlo, Luster, Maalselv
  • Learning Rate: 0.0001
  • Weight Decay: 0.01
  • Eval Accumulation Steps: 4
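The dataset size, batch size, and epoch count above together determine the length of the training run. A minimal sketch of that arithmetic (function and variable names are illustrative, not taken from the training code):

```python
# Hedged sketch: derive the optimizer step count implied by the
# hyperparameters listed above (10,000 samples, batch size 20, 5 epochs).
def total_steps(dataset_size: int, batch_size: int, epochs: int) -> int:
    # Ceiling division: a partial final batch still costs one step.
    steps_per_epoch = -(-dataset_size // batch_size)
    return steps_per_epoch * epochs

print(total_steps(10_000, 20, 5))  # 500 steps/epoch * 5 epochs = 2500
```

With these settings the model sees 500 batches per epoch, or 2,500 optimizer steps over the full run.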

Evaluation Results

Label  Accuracy  Accuracy Accumulation  Hit Rate  Hit Rate Accumulation
0      0.93      0.93                   0.98      0.98
1      0.95      0.95                   0.97      0.97
2      0.91      0.91                   0.98      0.98
3      0.40      0.40                   0.57      0.57
4      0.25      0.25                   0.63      0.63
5      0.86      0.86                   0.94      0.94
6      0.99      0.99                   1.04      1.04
7      0.94      0.94                   0.98      0.98
8      0.67      0.67                   0.86      0.86
9      0.99      0.99                   0.99      0.99
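The card does not define how "hit rate" differs from accuracy. One plausible reading, assumed here purely for illustration, is that accuracy counts exact label matches while hit rate counts predictions whose confidence clears the 0.975 threshold listed above; the sketch below implements that assumed definition, with all names being hypothetical:

```python
# Hedged sketch of one possible reading of the two metrics (these are NOT
# the card's confirmed definitions):
#   accuracy  = fraction of samples where the predicted label is correct
#   hit rate  = fraction of samples whose top-class confidence >= threshold
def metrics(preds, labels, confidences, threshold=0.975):
    n = len(labels)
    accuracy = sum(p == y for p, y in zip(preds, labels)) / n
    hit_rate = sum(c >= threshold for c in confidences) / n
    return accuracy, hit_rate

# Toy usage with 4 samples: 3 correct predictions, 3 confident ones.
acc, hit = metrics([0, 1, 1, 2], [0, 1, 2, 2], [0.99, 0.98, 0.50, 0.999])
print(acc, hit)  # 0.75 0.75
```

Under this reading, a per-label hit rate above 1.0 (as reported for label 6) would not be possible, which suggests the card uses a different definition or contains a reporting error for that row.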