ppo-LunarLander-v2 / results.json
Uploaded by ivi137 (commit 988d112, "Upload PPO LunarLander-v2")
{
  "mean_reward": 268.90152354514396,
  "std_reward": 12.661240527098506,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-12-30T19:59:31.885894"
}
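For context, a file with this schema is typically produced by aggregating per-episode returns from an evaluation run (e.g. the mean/std pair that `stable_baselines3`'s `evaluate_policy` returns over `n_eval_episodes`). The sketch below, using only the standard library, shows how the fields map onto such episode returns; the reward values and the `summarize_eval` helper are hypothetical, not taken from this repository.

```python
import json
import statistics
from datetime import datetime, timezone

def summarize_eval(episode_rewards, deterministic=True):
    """Aggregate per-episode returns into the results.json schema above.

    Uses the population standard deviation, matching the usual
    mean/std summary of an evaluation run.
    """
    return {
        "mean_reward": statistics.mean(episode_rewards),
        "std_reward": statistics.pstdev(episode_rewards),
        "is_deterministic": deterministic,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical returns from 10 deterministic evaluation episodes.
rewards = [260.0, 275.5, 270.1, 255.3, 280.2,
           265.7, 272.4, 268.9, 262.0, 278.6]
print(json.dumps(summarize_eval(rewards), indent=2))
```

A mean reward well above 200 over the evaluation episodes is the usual threshold for considering LunarLander-v2 solved, which the recorded 268.9 ± 12.7 clears comfortably.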