ppo-LunarLander-v2 / results.json
{"mean_reward": 170.42348065398036, "std_reward": 60.4287102897718, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-09-27T10:46:19.958904"}