This is a LoRA adapter for test-time finetuning on the demonstration input/output pairs of the public evaluation set of ARC-AGI 2024. It is meant to be applied on top of the finetuned base model https://huggingface.co/paranke/Mistral-NeMo-Minitron-8B-arc-training.
Model tree for paranke/Mistral-NeMo-Minitron-8B-arc-eval-adapter:
- Base model: nvidia/Mistral-NeMo-Minitron-8B-Base