my-ppo-LunarLander / results.json
moghis · first commit · 7e3c90c
{"mean_reward": 235.66295435690836, "std_reward": 26.092543565714863, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-05-11T04:34:26.063578"}