Ppo-lunar-lander / results.json
markberry2010 · Commit 20b41a8 (verified)
{
  "mean_reward": 218.8599409,
  "std_reward": 21.791553656541716,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2024-01-22T15:34:07.852175"
}
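For context, a results file in this shape is typically produced by aggregating per-episode returns from an evaluation run. Below is a minimal, self-contained sketch of how such a record could be generated; the episode reward values are purely illustrative (not the actual data behind this file), and the use of the population standard deviation mirrors NumPy's default `np.std` behavior, which common evaluation helpers rely on.

```python
import json
import statistics
from datetime import datetime

# Hypothetical per-episode returns from 10 deterministic evaluation
# episodes (illustrative values only, not the data behind this file).
episode_rewards = [231.2, 198.5, 245.1, 210.3, 189.7,
                   240.8, 225.6, 201.9, 250.2, 195.0]

results = {
    "mean_reward": statistics.mean(episode_rewards),
    # Population standard deviation (ddof=0), matching np.std's default.
    "std_reward": statistics.pstdev(episode_rewards),
    "is_deterministic": True,
    "n_eval_episodes": len(episode_rewards),
    "eval_datetime": datetime.now().isoformat(),
}

# Serialize in the same flat-JSON layout as results.json.
with open("results.json", "w") as f:
    json.dump(results, f)
```

A mean reward of ~218 with the standard deviation well clear of the 200-point threshold is what qualifies a LunarLander-v2 agent as "solved" on the leaderboard this file feeds.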