---
license: apache-2.0
base_model: mistralai/Ministral-8B-Instruct-2410
tags:
  - lora
  - peft
  - energy
  - stem
---

# Ministral-8B STEM Energy LoRA

A LoRA adapter fine-tuned from Ministral-8B-Instruct for STEM energy tasks.

## Model Details

- **Base model:** mistralai/Ministral-8B-Instruct-2410
- **Dataset:** EnergyAI/stem_energy
- **Training method:** LoRA (Low-Rank Adaptation)

## LoRA Configuration

- **r (rank):** 64
- **alpha:** 128
- **dropout:** 0.05
- **Checkpoint:** training/sft/logs/stem-min8b
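The hyperparameters above would correspond to a `peft.LoraConfig` along these lines. Note that `target_modules` is an assumption (typical attention projections), since the card does not list which modules were adapted:

```python
from peft import LoraConfig

# Sketch of the adapter's hyperparameters as a peft LoraConfig.
# target_modules is an assumption, not taken from the card.
lora_config = LoraConfig(
    r=64,
    lora_alpha=128,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
```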

## Usage

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model, then attach the LoRA adapter weights on top of it.
base_model = AutoModelForCausalLM.from_pretrained("mistralai/Ministral-8B-Instruct-2410")
model = PeftModel.from_pretrained(base_model, "EnergyAI/stem-energy-ministral8")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Ministral-8B-Instruct-2410")
```
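A minimal end-to-end generation sketch. The prompt, greedy decoding, token budget, and `device_map="auto"` (which requires `accelerate`) are illustrative assumptions, not part of the card:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load base model plus adapter; device_map="auto" needs accelerate installed.
base_model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Ministral-8B-Instruct-2410", device_map="auto"
)
model = PeftModel.from_pretrained(base_model, "EnergyAI/stem-energy-ministral8")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Ministral-8B-Instruct-2410")

# Illustrative prompt; the instruct model expects its chat template.
messages = [{"role": "user", "content": "Explain how a heat pump moves thermal energy."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Greedy decoding with a 256-token budget (both arbitrary choices here).
output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```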