---
library_name: peft
license: bigcode-openrail-m
base_model: bigcode/starcoderbase-1b
tags:
- generated_from_trainer
model-index:
- name: starcoder-peft-airscript
  results: []
---

# starcoder-peft-airscript

This model is a fine-tuned version of [bigcode/starcoderbase-1b](https://huggingface.co/bigcode/starcoderbase-1b) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7155

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 20
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 30
- training_steps: 1700

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.2691        | 0.0588 | 100  | 1.1636          |
| 0.9655        | 0.1176 | 200  | 0.9384          |
| 0.8138        | 0.1765 | 300  | 0.8387          |
| 0.719         | 0.2353 | 400  | 0.7847          |
| 0.6408        | 0.2941 | 500  | 0.7503          |
| 0.5788        | 0.3529 | 600  | 0.7314          |
| 0.5386        | 0.4118 | 700  | 0.7168          |
| 0.4894        | 0.4706 | 800  | 0.7156          |
| 0.4583        | 0.5294 | 900  | 0.7101          |
| 0.4271        | 0.5882 | 1000 | 0.7070          |
| 0.4053        | 0.6471 | 1100 | 0.7117          |
| 0.3934        | 0.7059 | 1200 | 0.7123          |
| 0.379         | 0.7647 | 1300 | 0.7143          |
| 0.3666        | 0.8235 | 1400 | 0.7171          |
| 0.363         | 0.8824 | 1500 | 0.7171          |
| 0.3588        | 0.9412 | 1600 | 0.7163          |
| 0.357         | 1.0    | 1700 | 0.7155          |

### Framework versions

- PEFT 0.13.2
- Transformers 4.45.2
- Pytorch 2.5.0
- Datasets 3.0.1
- Tokenizers 0.20.1
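
### How to use

The card itself does not include a usage snippet. The following is a minimal sketch, assuming the adapter is published under a repository id of the form `<user>/starcoder-peft-airscript` (a placeholder, not taken from this card), of how a PEFT adapter for `bigcode/starcoderbase-1b` is typically loaded for inference with the framework versions listed above; the AirScript prompt is likewise only illustrative.

```python
# Minimal sketch: load the base StarCoderBase-1B model, attach the PEFT adapter,
# and run a short code-completion generation. Adapter repo id and prompt are
# placeholders/assumptions, not values from the original model card.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "bigcode/starcoderbase-1b"
adapter_id = "<user>/starcoder-peft-airscript"  # placeholder: replace with the actual adapter repo

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)

# Wrap the base model with the trained adapter weights.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

prompt = "// AirScript example\n"  # illustrative prompt only
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the base checkpoint is gated behind the BigCode OpenRAIL-M license, so downloading it may require accepting the license and authenticating with a Hugging Face token.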