ds-lora / stats.json
Uploaded by demonwizard0 in commit 68418bb (verified): "Upload folder using huggingface_hub"
{
  "world_size": 8,
  "epochs": 1,
  "steps": 11,
  "seqs": 248,
  "tokens": 29976,
  "last_epoch_steps": 0,
  "last_epoch_seqs": 0,
  "last_epoch_tokens": 0,
  "total_seqs": 248,
  "nan_in_loss_seqs": 0,
  "experiment_tracking_run_id": null,
  "loss_ema": 1.045212610201402,
  "loss_sum": 11.497338712215424,
  "mtp_loss_ema": 0,
  "mtp_loss_sum": 0,
  "eval_losses_avg": [0.6676790714263916]
}
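The stats are compact enough to read by hand, but a short script makes the derived numbers explicit. A minimal sketch in Python; the field interpretations in the comments are assumptions based on the key names, as the repo does not document them:

```python
import json

# Raw contents of stats.json, copied verbatim from the file above.
raw = (
    '{"world_size": 8, "epochs": 1, "steps": 11, "seqs": 248, '
    '"tokens": 29976, "last_epoch_steps": 0, "last_epoch_seqs": 0, '
    '"last_epoch_tokens": 0, "total_seqs": 248, "nan_in_loss_seqs": 0, '
    '"experiment_tracking_run_id": null, "loss_ema": 1.045212610201402, '
    '"loss_sum": 11.497338712215424, "mtp_loss_ema": 0, "mtp_loss_sum": 0, '
    '"eval_losses_avg": [0.6676790714263916]}'
)

stats = json.loads(raw)

# Derived quantities, assuming "steps" counts optimizer steps and
# "loss_sum" is the running sum of per-step training losses.
tokens_per_step = stats["tokens"] / stats["steps"]
mean_train_loss = stats["loss_sum"] / stats["steps"]
final_eval_loss = stats["eval_losses_avg"][-1]

print(f"tokens/step:     {tokens_per_step:.1f}")
print(f"mean train loss: {mean_train_loss:.4f}")
print(f"eval loss:       {final_eval_loss:.4f}")
```

Under these assumptions, the run processed roughly 2,725 tokens per step across 8 workers, and the averaged eval loss (0.668) sits well below the mean training loss (1.045).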