Add llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=500_seed=123 LoRA model 8fda9f9 verified mciccone committed on Jun 10, 2025
Add llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr0.0003_data_size1000_max_steps=100_seed=123 LoRA model 808315d verified mciccone committed on Jun 10, 2025
Add llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123 LoRA model 3eb83ae verified mciccone committed on Jun 10, 2025
Add llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=500_seed=123 LoRA model c0ee010 verified mciccone committed on Jun 10, 2025
Add llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=500_seed=123 LoRA model bdf51d5 verified mciccone committed on Jun 10, 2025
Add llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr0.0001_data_size1000_max_steps=100_seed=123 LoRA model 4e4e6b5 verified mciccone committed on Jun 10, 2025
Add llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr0.0002_data_size1000_max_steps=100_seed=123 LoRA model f25d649 verified mciccone committed on Jun 10, 2025
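The run names above follow a consistent convention that encodes the LoRA fine-tuning hyperparameters (rank `r`, `alpha`, `dropout`, learning rate `lr`, training set size, `max_steps`, and random `seed`). A minimal sketch of a parser that recovers these values from a run name, assuming the naming scheme shown in the commits (the function and regex names here are illustrative, not part of the repo):

```python
import re

# Matches the hyperparameter suffix used by run names such as
# llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123
NAME_RE = re.compile(
    r"r(?P<r>\d+)"
    r"_alpha=(?P<alpha>\d+)"
    r"_dropout=(?P<dropout>[\d.]+)"
    r"_lr(?P<lr>[\d.eE+-]+)"          # handles both 0.0003 and 5e-05
    r"_data_size(?P<data_size>\d+)"
    r"_max_steps=(?P<max_steps>\d+)"
    r"_seed=(?P<seed>\d+)"
)

def parse_run_name(name: str) -> dict:
    """Extract the LoRA hyperparameters encoded in a run name."""
    m = NAME_RE.search(name)
    if m is None:
        raise ValueError(f"unrecognised run name: {name}")
    g = m.groupdict()
    return {
        "r": int(g["r"]),
        "alpha": int(g["alpha"]),
        "dropout": float(g["dropout"]),
        "lr": float(g["lr"]),
        "data_size": int(g["data_size"]),
        "max_steps": int(g["max_steps"]),
        "seed": int(g["seed"]),
    }

cfg = parse_run_name(
    "llama_finetune_sst2_r16_alpha=32_dropout=0.05_lr5e-05_data_size1000_max_steps=500_seed=123"
)
print(cfg)
# → {'r': 16, 'alpha': 32, 'dropout': 0.05, 'lr': 5e-05,
#    'data_size': 1000, 'max_steps': 500, 'seed': 123}
```

Parsing the names this way makes it easy to, for example, group the adapters in this repo by learning rate or step count when sweeping over checkpoints.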