ppo-LunarLander-v2 / results.json
LunarLander v2 using PPO (commit 88f5933 by steffel)
{
  "mean_reward": 282.37028892545044,
  "std_reward": 19.15943078899025,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-12-09T17:47:48.085488"
}
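
The file records the mean and standard deviation of episode rewards over 10 deterministic evaluation episodes. A minimal sketch of how such a `results.json` could be assembled (the per-episode rewards below are illustrative placeholders, not the actual data behind this file; Stable-Baselines3's `evaluate_policy` reports the mean and population standard deviation of episode returns):

```python
import json
import statistics
from datetime import datetime

# Hypothetical returns from 10 deterministic evaluation episodes
# (illustrative values only, not the real episode data).
episode_rewards = [281.2, 305.4, 262.9, 290.1, 275.8,
                   299.3, 268.4, 284.7, 310.2, 255.6]

results = {
    "mean_reward": statistics.mean(episode_rewards),
    # pstdev = population standard deviation, matching numpy's default np.std
    "std_reward": statistics.pstdev(episode_rewards),
    "is_deterministic": True,
    "n_eval_episodes": len(episode_rewards),
    "eval_datetime": datetime.now().isoformat(),
}

# Serialize in the same flat shape as the results.json above
print(json.dumps(results))
```

A reward above 200 counts as solving LunarLander-v2, so a mean of ~282 with a std of ~19 indicates the agent solves the task consistently across evaluation episodes.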