Long-context Non-factoid Question Answering in Indic Languages
Paper: [arXiv:2504.13615](https://arxiv.org/abs/2504.13615)
How to use `ritwikm/IndicGenQA-B-gemma-2b-it` with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the finetuned PEFT adapter on top of it
base_model = AutoModelForCausalLM.from_pretrained("google/gemma-2b-it")
model = PeftModel.from_pretrained(base_model, "ritwikm/IndicGenQA-B-gemma-2b-it")
```

This is the finetuned adapter of the baseline (B) from the IndicGenQA paper. It is finetuned on top of `google/gemma-2b-it`.