Danfe AI – Nepal Language Model

Nepal's first open-source AI model for 17+ languages.

Model Details

  • Base Model: HuggingFaceTB/SmolLM2-1.7B-Instruct
  • Method: QLoRA (4-bit quantization + LoRA rank 16)
  • Training Data: Nepali Wikipedia + curated Nepal knowledge
  • Languages: Nepali, English, Maithili, Doteli, Bajhangi, Achhami, +11 more

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model, then attach the Danfe LoRA adapter on top
base = AutoModelForCausalLM.from_pretrained("HuggingFaceTB/SmolLM2-1.7B-Instruct")
model = PeftModel.from_pretrained(base, "danfe-ai/danfe-nepali-2b")

# The adapter repo also ships the tokenizer
tokenizer = AutoTokenizer.from_pretrained("danfe-ai/danfe-nepali-2b")
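
Since the base is an instruct model, prompts should be formatted with the tokenizer's chat template before generation. A minimal sketch (the Nepali question is an illustrative example, and the generation call is shown for use with the model loaded above):

```python
from transformers import AutoTokenizer

# Tokenizer from the adapter repo; its chat template follows the base model
tokenizer = AutoTokenizer.from_pretrained("danfe-ai/danfe-nepali-2b")

# Format a single-turn Nepali question with the instruct chat template
messages = [{"role": "user", "content": "नेपालको राजधानी के हो?"}]  # "What is Nepal's capital?"
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

inputs = tokenizer(prompt, return_tensors="pt")
# With `model` loaded as in the snippet above:
# output = model.generate(**inputs, max_new_tokens=128)
# print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```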

Part of Danfe AI Ecosystem

  • Chat UI, Translation, Speech, OCR
  • Crowdsource training platform
  • Focus on Sudurpashchim languages (Doteli, Bajhangi, Achhami)

Built with love for Nepal.
