# ARC-Challenge_Llama-3.2-1B-2k4vxbc8
This model is a fine-tuned version of meta-llama/Llama-3.2-1B on an unknown dataset (presumably ARC-Challenge, given the model name). It achieves the following results on the evaluation set:
- Loss: 2.9242
- Model Preparation Time: 0.0058
- Mdl: 1261.4202
- Accumulated Loss: 874.3499
- Correct Preds: 110.0
- Total Preds: 299.0
- Accuracy: 0.3679
- Correct Gen Preds: 106.0
- Gen Accuracy: 0.3545
- Correct Gen Preds 32: 15.0
- Correct Preds 32: 18.0
- Total Labels 32: 64.0
- Accuracy 32: 0.2812
- Gen Accuracy 32: 0.2344
- Correct Gen Preds 33: 28.0
- Correct Preds 33: 28.0
- Total Labels 33: 73.0
- Accuracy 33: 0.3836
- Gen Accuracy 33: 0.3836
- Correct Gen Preds 34: 39.0
- Correct Preds 34: 40.0
- Total Labels 34: 78.0
- Accuracy 34: 0.5128
- Gen Accuracy 34: 0.5
- Correct Gen Preds 35: 24.0
- Correct Preds 35: 24.0
- Total Labels 35: 83.0
- Accuracy 35: 0.2892
- Gen Accuracy 35: 0.2892
- Correct Gen Preds 36: 0.0
- Correct Preds 36: 0.0
- Total Labels 36: 1.0
- Accuracy 36: 0.0
- Gen Accuracy 36: 0.0
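The headline numbers above are internally consistent: accuracy is correct predictions over total predictions, the per-label correct counts (labels 32–36) sum to the overall correct count, and the reported MDL matches the accumulated loss converted from nats to bits. A quick sanity check, using only the values reported above:

```python
import math

# Values copied from the evaluation summary above.
correct_preds = 110
total_preds = 299
accumulated_loss_nats = 874.3499  # summed cross-entropy, in nats

# Accuracy = correct / total
accuracy = correct_preds / total_preds
assert abs(accuracy - 0.3679) < 1e-4

# MDL (in bits) = accumulated loss (in nats) / ln(2)
mdl_bits = accumulated_loss_nats / math.log(2)
assert abs(mdl_bits - 1261.4202) < 1e-3

# Per-label correct counts (labels 32, 33, 34, 35, 36) sum to the total.
per_label_correct = [18, 28, 40, 24, 0]
assert sum(per_label_correct) == correct_preds
```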
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 112
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 100
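With a cosine schedule and a warmup ratio of 0.01, the learning rate ramps linearly over the first 1% of training steps and then decays along a half-cosine toward zero. A minimal sketch of that schedule (a stand-in for the behavior of Transformers' `get_cosine_schedule_with_warmup`; the step counts below are illustrative, not taken from this run):

```python
import math

def lr_at_step(step, total_steps, base_lr=2e-05, warmup_ratio=0.01):
    """Linear warmup followed by cosine decay to zero."""
    warmup_steps = max(1, int(total_steps * warmup_ratio))
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # linear ramp
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 1000  # illustrative total step count (warmup = 10 steps)
assert lr_at_step(0, total) == 0.0                      # start of warmup
assert abs(lr_at_step(10, total) - 2e-05) < 1e-12       # peak LR after warmup
assert abs(lr_at_step(total, total)) < 1e-12            # decayed to ~0
```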
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Mdl | Accumulated Loss | Correct Preds | Total Preds | Accuracy | Correct Gen Preds | Gen Accuracy | Correct Gen Preds 32 | Correct Preds 32 | Total Labels 32 | Accuracy 32 | Gen Accuracy 32 | Correct Gen Preds 33 | Correct Preds 33 | Total Labels 33 | Accuracy 33 | Gen Accuracy 33 | Correct Gen Preds 34 | Correct Preds 34 | Total Labels 34 | Accuracy 34 | Gen Accuracy 34 | Correct Gen Preds 35 | Correct Preds 35 | Total Labels 35 | Accuracy 35 | Gen Accuracy 35 | Correct Gen Preds 36 | Correct Preds 36 | Total Labels 36 | Accuracy 36 | Gen Accuracy 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 1.6389 | 0.0058 | 706.9523 | 490.0220 | 66.0 | 299.0 | 0.2207 | 66.0 | 0.2207 | 62.0 | 62.0 | 64.0 | 0.9688 | 0.9688 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 4.0 | 4.0 | 78.0 | 0.0513 | 0.0513 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.9641 | 1.0 | 2 | 1.9883 | 0.0058 | 857.7052 | 594.5160 | 64.0 | 299.0 | 0.2140 | 64.0 | 0.2140 | 64.0 | 64.0 | 64.0 | 1.0 | 1.0 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.414 | 2.0 | 4 | 1.6542 | 0.0058 | 713.5692 | 494.6085 | 76.0 | 299.0 | 0.2542 | 76.0 | 0.2542 | 0.0 | 0.0 | 64.0 | 0.0 | 0.0 | 73.0 | 73.0 | 73.0 | 1.0 | 1.0 | 3.0 | 3.0 | 78.0 | 0.0385 | 0.0385 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.2629 | 3.0 | 6 | 1.7538 | 0.0058 | 756.5379 | 524.3921 | 64.0 | 299.0 | 0.2140 | 64.0 | 0.2140 | 64.0 | 64.0 | 64.0 | 1.0 | 1.0 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 0.0 | 0.0 | 78.0 | 0.0 | 0.0 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1.123 | 4.0 | 8 | 1.5366 | 0.0058 | 662.8170 | 459.4297 | 67.0 | 299.0 | 0.2241 | 67.0 | 0.2241 | 54.0 | 54.0 | 64.0 | 0.8438 | 0.8438 | 0.0 | 0.0 | 73.0 | 0.0 | 0.0 | 13.0 | 13.0 | 78.0 | 0.1667 | 0.1667 | 0.0 | 0.0 | 83.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.837 | 5.0 | 10 | 1.4879 | 0.0058 | 641.8253 | 444.8794 | 86.0 | 299.0 | 0.2876 | 86.0 | 0.2876 | 28.0 | 28.0 | 64.0 | 0.4375 | 0.4375 | 1.0 | 1.0 | 73.0 | 0.0137 | 0.0137 | 45.0 | 45.0 | 78.0 | 0.5769 | 0.5769 | 12.0 | 12.0 | 83.0 | 0.1446 | 0.1446 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.2588 | 6.0 | 12 | 2.4117 | 0.0058 | 1040.3208 | 721.0954 | 84.0 | 299.0 | 0.2809 | 78.0 | 0.2609 | 34.0 | 38.0 | 64.0 | 0.5938 | 0.5312 | 7.0 | 7.0 | 73.0 | 0.0959 | 0.0959 | 22.0 | 23.0 | 78.0 | 0.2949 | 0.2821 | 15.0 | 16.0 | 83.0 | 0.1928 | 0.1807 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.2513 | 7.0 | 14 | 2.9242 | 0.0058 | 1261.4202 | 874.3499 | 110.0 | 299.0 | 0.3679 | 106.0 | 0.3545 | 15.0 | 18.0 | 64.0 | 0.2812 | 0.2344 | 28.0 | 28.0 | 73.0 | 0.3836 | 0.3836 | 39.0 | 40.0 | 78.0 | 0.5128 | 0.5 | 24.0 | 24.0 | 83.0 | 0.2892 | 0.2892 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.6519 | 8.0 | 16 | 3.6514 | 0.0058 | 1575.0833 | 1091.7645 | 108.0 | 299.0 | 0.3612 | 103.0 | 0.3445 | 12.0 | 15.0 | 64.0 | 0.2344 | 0.1875 | 46.0 | 47.0 | 73.0 | 0.6438 | 0.6301 | 34.0 | 35.0 | 78.0 | 0.4487 | 0.4359 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.001 | 9.0 | 18 | 4.4460 | 0.0058 | 1917.8702 | 1329.3663 | 106.0 | 299.0 | 0.3545 | 100.0 | 0.3344 | 16.0 | 20.0 | 64.0 | 0.3125 | 0.25 | 41.0 | 42.0 | 73.0 | 0.5753 | 0.5616 | 33.0 | 34.0 | 78.0 | 0.4359 | 0.4231 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0001 | 10.0 | 20 | 5.6543 | 0.0058 | 2439.0822 | 1690.6429 | 106.0 | 299.0 | 0.3545 | 101.0 | 0.3378 | 23.0 | 27.0 | 64.0 | 0.4219 | 0.3594 | 37.0 | 37.0 | 73.0 | 0.5068 | 0.5068 | 31.0 | 32.0 | 78.0 | 0.4103 | 0.3974 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0001 | 11.0 | 22 | 6.6275 | 0.0058 | 2858.8977 | 1981.6369 | 105.0 | 299.0 | 0.3512 | 104.0 | 0.3478 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 35.0 | 35.0 | 73.0 | 0.4795 | 0.4795 | 29.0 | 30.0 | 78.0 | 0.3846 | 0.3718 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 12.0 | 24 | 7.3259 | 0.0058 | 3160.1247 | 2190.4315 | 102.0 | 299.0 | 0.3411 | 101.0 | 0.3378 | 29.0 | 29.0 | 64.0 | 0.4531 | 0.4531 | 32.0 | 33.0 | 73.0 | 0.4521 | 0.4384 | 29.0 | 29.0 | 78.0 | 0.3718 | 0.3718 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 13.0 | 26 | 7.8958 | 0.0058 | 3405.9778 | 2360.8439 | 100.0 | 299.0 | 0.3344 | 99.0 | 0.3311 | 30.0 | 30.0 | 64.0 | 0.4688 | 0.4688 | 29.0 | 29.0 | 73.0 | 0.3973 | 0.3973 | 28.0 | 29.0 | 78.0 | 0.3718 | 0.3590 | 12.0 | 12.0 | 83.0 | 0.1446 | 0.1446 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 14.0 | 28 | 8.2528 | 0.0058 | 3559.9558 | 2467.5733 | 97.0 | 299.0 | 0.3244 | 97.0 | 0.3244 | 31.0 | 31.0 | 64.0 | 0.4844 | 0.4844 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 12.0 | 12.0 | 83.0 | 0.1446 | 0.1446 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 15.0 | 30 | 8.4984 | 0.0058 | 3665.9234 | 2541.0245 | 96.0 | 299.0 | 0.3211 | 96.0 | 0.3211 | 32.0 | 32.0 | 64.0 | 0.5 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 16.0 | 32 | 8.7216 | 0.0058 | 3762.2008 | 2607.7589 | 97.0 | 299.0 | 0.3244 | 97.0 | 0.3244 | 32.0 | 32.0 | 64.0 | 0.5 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 17.0 | 34 | 8.8293 | 0.0058 | 3808.6478 | 2639.9535 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 18.0 | 36 | 8.9085 | 0.0058 | 3842.8136 | 2663.6354 | 98.0 | 299.0 | 0.3278 | 97.0 | 0.3244 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 19.0 | 38 | 8.9689 | 0.0058 | 3868.8669 | 2681.6942 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 20.0 | 40 | 8.9820 | 0.0058 | 3874.5370 | 2685.6244 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 21.0 | 42 | 9.0110 | 0.0058 | 3887.0514 | 2694.2987 | 95.0 | 299.0 | 0.3177 | 94.0 | 0.3144 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 26.0 | 26.0 | 73.0 | 0.3562 | 0.3562 | 26.0 | 26.0 | 78.0 | 0.3333 | 0.3333 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 22.0 | 44 | 9.0653 | 0.0058 | 3910.4803 | 2710.5384 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 23.0 | 46 | 9.0546 | 0.0058 | 3905.8529 | 2707.3309 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 24.0 | 48 | 9.0612 | 0.0058 | 3908.7044 | 2709.3074 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 25.0 | 50 | 9.0434 | 0.0058 | 3901.0103 | 2703.9743 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 26.0 | 52 | 9.0563 | 0.0058 | 3906.5620 | 2707.8224 | 98.0 | 299.0 | 0.3278 | 97.0 | 0.3244 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 27.0 | 54 | 9.0366 | 0.0058 | 3898.0915 | 2701.9511 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 28.0 | 56 | 9.0691 | 0.0058 | 3912.0940 | 2711.6569 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 29.0 | 58 | 9.0650 | 0.0058 | 3910.3459 | 2710.4453 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 30.0 | 60 | 9.0651 | 0.0058 | 3910.3806 | 2710.4693 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 31.0 | 62 | 9.0466 | 0.0058 | 3902.3980 | 2704.9362 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 26.0 | 26.0 | 73.0 | 0.3562 | 0.3562 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 32.0 | 64 | 9.0608 | 0.0058 | 3908.5396 | 2709.1932 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 26.0 | 26.0 | 73.0 | 0.3562 | 0.3562 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 33.0 | 66 | 9.0827 | 0.0058 | 3917.9843 | 2715.7397 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 34.0 | 68 | 9.0744 | 0.0058 | 3914.3706 | 2713.2349 | 98.0 | 299.0 | 0.3278 | 97.0 | 0.3244 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 11.0 | 11.0 | 83.0 | 0.1325 | 0.1325 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 35.0 | 70 | 9.0704 | 0.0058 | 3912.6593 | 2712.0488 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 36.0 | 72 | 9.0978 | 0.0058 | 3924.4855 | 2720.2460 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 0.0 | 37.0 | 74 | 9.0858 | 0.0058 | 3919.3165 | 2716.6632 | 97.0 | 299.0 | 0.3244 | 96.0 | 0.3211 | 32.0 | 33.0 | 64.0 | 0.5156 | 0.5 | 27.0 | 27.0 | 73.0 | 0.3699 | 0.3699 | 27.0 | 27.0 | 78.0 | 0.3462 | 0.3462 | 10.0 | 10.0 | 83.0 | 0.1205 | 0.1205 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
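The table tracks two accuracy flavors: `Accuracy` (presumably scoring each answer option by model likelihood and picking the best, counted in `Correct Preds`) and `Gen Accuracy` (presumably checking the freely generated answer, counted in `Correct Gen Preds`). The card does not include the evaluation code, so as an illustrative assumption, likelihood-based multiple-choice scoring looks roughly like this (dummy scores, not outputs from the real model):

```python
# Hypothetical sketch of likelihood-based multiple-choice scoring.
# In a real evaluation the scores would be per-option log-likelihoods
# from the fine-tuned model; here they are made-up values.

def pick_answer(option_logprobs):
    """Return the index of the highest-scoring answer option."""
    return max(range(len(option_logprobs)), key=lambda i: option_logprobs[i])

# Four ARC-style options with made-up log-likelihoods.
scores = [-4.2, -1.3, -3.8, -2.9]
assert pick_answer(scores) == 1  # option at index 1 is most likely

# Accuracy over a toy batch of (scores, gold_index) pairs.
batch = [([-1.0, -2.0], 0), ([-3.0, -0.5], 1), ([-0.2, -0.1], 0)]
correct = sum(pick_answer(s) == gold for s, gold in batch)
assert correct == 2  # 2 of 3 toy items match the gold index
```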
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
## Base model

meta-llama/Llama-3.2-1B (fine-tuned as donoway/ARC-Challenge_Llama-3.2-1B-2k4vxbc8)