{
  "timesteps": 414000,
  "model_class": "PPO",
  "eval_idx": 23
}
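The JSON above looks like training-checkpoint metadata (the `model_class` value `"PPO"` suggests an RL setup, though the source gives no further context). A minimal sketch of parsing such a blob in Python, assuming the field meanings — `timesteps` as training steps at save time, `eval_idx` as an evaluation counter — which are not stated in the source:

```python
import json

# The metadata blob from the source; the original filename is unknown,
# so it is embedded here as a string rather than read from disk.
metadata_text = """
{
  "timesteps": 414000,
  "model_class": "PPO",
  "eval_idx": 23
}
"""

metadata = json.loads(metadata_text)

# Assumed interpretations of the fields (not confirmed by the source):
algo = metadata["model_class"]      # algorithm that produced the checkpoint
steps = metadata["timesteps"]       # training timesteps at save time
eval_idx = metadata["eval_idx"]     # index of the evaluation run

print(f"{algo} checkpoint at {steps:,} steps (eval #{eval_idx})")
```

Reading the values through `json.loads` rather than ad-hoc string parsing keeps the snippet robust to whitespace and key-order changes in the file.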