ValueError: Please specify `target_modules` in `peft_config`

#34
by Tapendra - opened

While training with SFTTrainer, I hit this error:

```python
trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,
    peft_config=peft_config,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    tokenizer=tokenizer,
    args=training_arguments,
    packing=packing,
)
```
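The error in the title means the `LoraConfig` passed as `peft_config` does not say which layers LoRA should attach to, and PEFT could not auto-detect them for this model architecture. A minimal sketch of an explicit config follows; the module names (`q_proj`, `k_proj`, `v_proj`, `o_proj`) are typical for Llama-style models and are an assumption here, as are the hyperparameter values — check your own model's layer names first:

```python
from peft import LoraConfig

# Hypothetical values for illustration; target_modules must match the
# actual submodule names in your model (print(model) to inspect them).
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    # Explicitly naming the attention projections avoids the
    # "Please specify `target_modules`" ValueError when PEFT
    # cannot infer them for your architecture.
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)
```

You can list candidate module names with `print(model)` or by iterating `model.named_modules()` before choosing `target_modules`.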

```python
# Train model
trainer.train()
```

@harpercarroll What are the minimum system RAM and GPU RAM requirements to run inference with a bitsandbytes nf4 (`use_4bit = True`) fine-tuned model? I trained with `bnb_4bit_quant_type = "nf4"`, but after training the saved model is still about 15GB. How can we reduce the model size?
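The ~15GB checkpoint is expected: 4-bit quantization with bitsandbytes applies at load/compute time, and when a merged model is saved it is written out as full fp16 weights (~2 bytes per parameter, so roughly 13-15GB for a 7B model). To keep disk and GPU usage small, save only the LoRA adapter and reload the base model with `load_in_4bit=True` at inference time. A rough back-of-envelope sketch of the memory math (the 7B parameter count is an assumption for illustration):

```python
# Rough rule of thumb, not an exact figure:
# fp16 weights take ~2 bytes/param; nf4-quantized weights take ~0.5 bytes/param
# (plus some overhead for quantization constants and non-quantized layers).

def estimate_weight_gb(n_params: int, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB for a given precision."""
    return n_params * bytes_per_param / 1024**3

n_params = 7_000_000_000  # hypothetical 7B-parameter model

fp16_gb = estimate_weight_gb(n_params, 2.0)  # matches the ~15GB checkpoint on disk
nf4_gb = estimate_weight_gb(n_params, 0.5)   # what you hold in GPU RAM when loaded in 4-bit

print(f"fp16: ~{fp16_gb:.1f} GiB, nf4: ~{nf4_gb:.1f} GiB")
```

So the saved fp16 file staying large is normal; the 4-bit saving shows up in GPU memory when you reload with a `BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_quant_type="nf4")` quantization config.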

Hey @Tapendra, you might find Impulse AI (https://www.impulselabs.ai/) useful — we make it super easy to fine-tune and deploy open-source models. I know it's not directly relevant to your problem above, but it might be easier to fine-tune and deploy with us.

docs: https://docs.impulselabs.ai/introduction
python sdk: https://pypi.org/project/impulse-api-sdk-python/
