Paper: LoRA: Low-Rank Adaptation of Large Language Models (arXiv:2106.09685)
This model generates Multiple Choice Questions (MCQs) from academic-style paragraphs. It was fine-tuned with LoRA on top of mistralai/Mistral-7B-Instruct-v0.1, using a custom dataset of educational instructions and responses.
Requirements: transformers, peft, bitsandbytes.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig, pipeline
from peft import PeftModel

base = "mistralai/Mistral-7B-Instruct-v0.1"
adapter = "Lingesh-S/mcq-mistral-lora"

tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token

# 8-bit quantization via bitsandbytes, with FP32 CPU offload
# for layers that do not fit on the GPU
model = AutoModelForCausalLM.from_pretrained(
    base,
    device_map="auto",
    quantization_config=BitsAndBytesConfig(
        load_in_8bit=True,
        llm_int8_enable_fp32_cpu_offload=True,
    ),
    offload_folder="./offload",
)

# Attach the LoRA adapter to the quantized base model
model = PeftModel.from_pretrained(model, adapter)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
```
```python
prompt = """
# Instruction:
Generate a multiple choice question with 4 options and one correct answer based on the paragraph below.
Paragraph: The Lok Sabha is the House of the People in India. It is one of the two houses of Parliament.
# Response:
"""

output = pipe(prompt, max_new_tokens=150, do_sample=True, temperature=0.5)
print(output[0]["generated_text"])
```
Example output:

```text
What is the name of the House of the People in India?
a) The Rajya Sabha
b) The Lok Sabha
c) The Supreme Court
d) The President's House
Correct answer: b) The Lok Sabha
```
Training details:
- Dataset: Custom JSONL of 500 examples (Paragraph → MCQ)
- Epochs: 3
- Batch size: 1
- Final loss: ~0.23
- Adapter size: 13.6 MB
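A Paragraph → MCQ record in JSONL form might look like the sketch below. The field names (`instruction`, `paragraph`, `response`) are hypothetical, since the schema of the custom dataset is not published; JSONL itself is simply one JSON object per line.

```python
import json

# Hypothetical record layout for one Paragraph -> MCQ training example
record = {
    "instruction": "Generate a multiple choice question with 4 options "
                   "and one correct answer based on the paragraph below.",
    "paragraph": "The Lok Sabha is the House of the People in India.",
    "response": "What is the House of the People in India?\n"
                "a) The Rajya Sabha\nb) The Lok Sabha\n"
                "c) The Supreme Court\nd) The President's House\n"
                "Correct answer: b) The Lok Sabha",
}

# Serialize to a single JSONL line and round-trip it back
line = json.dumps(record)
parsed = json.loads(line)
```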
LoRA Config:
- r=8
- lora_alpha=32
- lora_dropout=0.05
- target_modules=['q_proj', 'v_proj']

| Use Case | Status |
|---|---|
| MCQ generation for education | ✅ Intended |
| Chat-style assistants | ✅ Possible |
| Factual question generation | ⚠️ Needs review |
| Medical/legal MCQs | ❌ Not recommended |
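The reported 13.6 MB adapter size is consistent with this LoRA configuration. Assuming Mistral-7B's published dimensions (32 layers, hidden size 4096, 8 KV heads of dimension 128, so `q_proj` maps 4096→4096 and `v_proj` maps 4096→1024), r=8 on `q_proj` and `v_proj` gives roughly 3.4 M trainable parameters, which is ~13.6 MB in FP32. A back-of-the-envelope check:

```python
r = 8
hidden = 4096          # Mistral-7B hidden size
kv_dim = 8 * 128       # 8 KV heads x head_dim 128 (grouped-query attention)
layers = 32

# LoRA adds two small matrices per target: A (r x d_in) and B (d_out x r)
q_proj = r * (hidden + hidden)   # q_proj: 4096 -> 4096
v_proj = r * (hidden + kv_dim)   # v_proj: 4096 -> 1024
params = layers * (q_proj + v_proj)

size_mb = params * 4 / 1e6       # FP32: 4 bytes per parameter
print(params, round(size_mb, 1))  # -> 3407872 13.6
```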
Model developed and shared by Lingesh S. Contact via Hugging Face or LinkedIn.
```bibtex
@misc{lingesh2024mcq,
  title  = {MCQ Generator Fine-Tuned on Mistral-7B via LoRA},
  author = {Lingesh S},
  year   = {2024},
  url    = {https://huggingface.co/Lingesh-S/mcq-mistral-lora}
}
```