Training dataset: FoundryAILabs/k12-indian-curriculum-4.9m
A QLoRA adapter for Mistral-7B-Instruct-v0.3, fine-tuned on CBSE/NCERT K-12 curriculum data in Marathi (मराठी).
Part of the BharatLLM project: 13 LoRA adapters (12 K-12 languages + 1 BTech Engineering).
| Property | Value |
|---|---|
| Base Model | mistralai/Mistral-7B-Instruct-v0.3 |
| Method | QLoRA (4-bit quantization + LoRA, r=64) |
| Trainable Parameters | 167,772,160 (2.26% of 7.4B) |
| Training Library | Unsloth |
| Language | Marathi (मराठी) |
| Domain | K-12 Education (CBSE/NCERT, Grades 6-12) |
| Training Data | ~408K curriculum-aligned Q&A pairs |
| License | Apache 2.0 |
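As a sanity check on the table, the trainable-parameter count can be reproduced from the published Mistral-7B dimensions, assuming (as is typical for Unsloth QLoRA runs, though not stated in this card) that rank-64 LoRA is applied to all seven linear projections:

```python
# Reproduce the trainable-parameter figure from the table above.
# Assumption: LoRA r=64 on q/k/v/o/gate/up/down projections, the usual
# Unsloth default; dimensions are the published Mistral-7B values.
HIDDEN = 4096   # hidden size
KV_DIM = 1024   # 8 KV heads x 128 head_dim (grouped-query attention)
FFN = 14336     # MLP intermediate size
LAYERS = 32
R = 64          # LoRA rank

def lora_params(d_in, d_out, r=R):
    # A LoRA pair (A: d_in->r, B: r->d_out) adds r * (d_in + d_out) params.
    return r * (d_in + d_out)

per_layer = (
    lora_params(HIDDEN, HIDDEN)    # q_proj
    + lora_params(HIDDEN, KV_DIM)  # k_proj
    + lora_params(HIDDEN, KV_DIM)  # v_proj
    + lora_params(HIDDEN, HIDDEN)  # o_proj
    + lora_params(HIDDEN, FFN)     # gate_proj
    + lora_params(HIDDEN, FFN)     # up_proj
    + lora_params(FFN, HIDDEN)     # down_proj
)
total = per_layer * LAYERS
print(total)  # 167772160, matching the table
```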
Inference with Unsloth:

```python
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="FoundryAILabs/bharat-marathi-7b-lora",
    max_seq_length=2048,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # enable inference-optimized kernels

inputs = tokenizer("[INST] What is photosynthesis? [/INST]", return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
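The `[INST] … [/INST]` string above follows the Mistral-Instruct chat format. In practice prefer `tokenizer.apply_chat_template`; purely as an illustration of the format, a multi-turn prompt can be assembled by hand (`build_prompt` below is a hypothetical helper, not part of the released code):

```python
def build_prompt(turns):
    """Build a Mistral-Instruct prompt from (user, assistant) pairs,
    ending with an open user turn (assistant=None) awaiting generation.
    The tokenizer prepends <s> itself, so it is omitted here."""
    parts = []
    for user, assistant in turns:
        if assistant is None:
            parts.append(f"[INST] {user} [/INST]")
        else:
            parts.append(f"[INST] {user} [/INST] {assistant}</s>")
    return "".join(parts)

print(build_prompt([("What is photosynthesis?", None)]))
# [INST] What is photosynthesis? [/INST]
```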
Inference with standard PEFT + Transformers:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Load the base model in 4-bit; passing load_in_4bit directly to
# from_pretrained is deprecated in favor of BitsAndBytesConfig.
base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.3",
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")

# Attach the LoRA adapter on top of the quantized base.
model = PeftModel.from_pretrained(base, "FoundryAILabs/bharat-marathi-7b-lora")
```
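A back-of-envelope sketch of the weight memory this setup needs (assumed sizes, not measured; ignores activations, KV cache, and quantization block overhead):

```python
# Rough weight-memory estimate for 4-bit base + fp16 LoRA adapter.
# BASE_PARAMS is an approximation of Mistral-7B-v0.3's size;
# ADAPTER_PARAMS is the trainable count from the table above.
BASE_PARAMS = 7_250_000_000
ADAPTER_PARAMS = 167_772_160

base_gb = BASE_PARAMS * 0.5 / 1024**3      # 4-bit ~= 0.5 bytes/param
adapter_mb = ADAPTER_PARAMS * 2 / 1024**2  # fp16 ~= 2 bytes/param

print(f"base ~{base_gb:.1f} GiB, adapter ~{adapter_mb:.0f} MiB")
# → base ~3.4 GiB, adapter ~320 MiB
```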
All 13 adapters in the BharatLLM family:

| Model | Language | Type |
|---|---|---|
| FoundryAILabs/bharat-english-7b-lora | English | K-12 |
| FoundryAILabs/bharat-hindi-7b-lora | Hindi | K-12 |
| FoundryAILabs/bharat-bengali-7b-lora | Bengali | K-12 |
| FoundryAILabs/bharat-telugu-7b-lora | Telugu | K-12 |
| FoundryAILabs/bharat-tamil-7b-lora | Tamil | K-12 |
| FoundryAILabs/bharat-kannada-7b-lora | Kannada | K-12 |
| FoundryAILabs/bharat-malayalam-7b-lora | Malayalam | K-12 |
| FoundryAILabs/bharat-marathi-7b-lora | Marathi | K-12 |
| FoundryAILabs/bharat-gujarati-7b-lora | Gujarati | K-12 |
| FoundryAILabs/bharat-odia-7b-lora | Odia | K-12 |
| FoundryAILabs/bharat-punjabi-7b-lora | Punjabi | K-12 |
| FoundryAILabs/bharat-urdu-7b-lora | Urdu | K-12 |
| FoundryAILabs/bharat-btech-7b-lora | English | BTech Engineering |
Website: foundryailabs.io | GitHub: github.com/foundryailabs/BharatLLM