Add llama_finetune_wsc_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=500_seed=123 LoRA model e70745c verified mciccone committed on Jun 10, 2025
Add llama_finetune_wsc_r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123 LoRA model 79793e7 verified mciccone committed on Jun 10, 2025
Add llama_finetune_wsc_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=100_seed=123 LoRA model 07467f8 verified mciccone committed on Jun 10, 2025
Add llama_finetune_wsc_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=100_seed=123 LoRA model 1f610f1 verified mciccone committed on Jun 10, 2025
Add llama_finetune_wsc_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=100_seed=123 LoRA model 33c4803 verified mciccone committed on Jun 10, 2025
Add llama_finetune_wsc_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=500_seed=123 LoRA model 4b00ad7 verified mciccone committed on Jun 10, 2025
Add llama_finetune_wsc_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=500_seed=123 LoRA model 8d99171 verified mciccone committed on Jun 10, 2025
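Each adapter name above encodes its fine-tuning configuration: the task (`wsc`), LoRA rank (`r16`), scaling factor (`alpha=32`), dropout, learning rate, training-set size, max optimizer steps, and random seed. A minimal sketch of parsing these names back into a config dict, assuming the naming scheme is exactly as it appears in the commit list (the `parse_adapter_name` helper and its field names are illustrative, not part of any library):

```python
import re

# Pattern inferred from the adapter names in the commit list above.
# Note the inconsistent separators: some fields use "=", others don't.
PATTERN = re.compile(
    r"llama_finetune_(?P<task>\w+?)"
    r"_r(?P<r>\d+)"
    r"_alpha=(?P<alpha>\d+)"
    r"_dropout=(?P<dropout>[\d.]+)"
    r"_lr(?P<lr>[\d.e-]+)"        # handles both "0.0001" and "5e-05"
    r"_data_size(?P<data_size>\d+)"
    r"_max_steps=(?P<max_steps>\d+)"
    r"_seed=(?P<seed>\d+)"
)

def parse_adapter_name(name: str) -> dict:
    """Extract the LoRA hyperparameters encoded in an adapter name."""
    m = PATTERN.fullmatch(name)
    if m is None:
        raise ValueError(f"unrecognized adapter name: {name}")
    d = m.groupdict()
    return {
        "task": d["task"],
        "r": int(d["r"]),
        "alpha": int(d["alpha"]),
        "dropout": float(d["dropout"]),
        "lr": float(d["lr"]),
        "data_size": int(d["data_size"]),
        "max_steps": int(d["max_steps"]),
        "seed": int(d["seed"]),
    }

cfg = parse_adapter_name(
    "llama_finetune_wsc_r16_alpha=32_dropout=0.05"
    "_lr5e-05_data_size1000_max_steps=500_seed=123"
)
```

This makes it easy to enumerate the sweep: the seven adapters here vary only the learning rate (5e-05 to 3e-04) and step count (100 vs 500), while rank, alpha, dropout, data size, and seed are held fixed.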