ppo-LunarLander-v2 / results.json
Commit 4c44cb7: First attempt
{
  "mean_reward": 245.38552481913104,
  "std_reward": 21.769315216746246,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-01-02T14:01:42.172403"
}
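A record like this is typically produced by evaluating the trained policy for `n_eval_episodes` episodes and aggregating the per-episode returns (in Stable-Baselines3, `evaluate_policy` returns these statistics). The sketch below shows one plausible way such a record could be assembled; the episode rewards and the `summarize_eval` helper are hypothetical, not taken from this repository.

```python
import json
import statistics
from datetime import datetime

def summarize_eval(episode_rewards, deterministic=True):
    """Aggregate per-episode returns into a results.json-style record.

    Hypothetical helper: mirrors the fields in the record above.
    Uses the population standard deviation, matching numpy's default np.std.
    """
    return {
        "mean_reward": statistics.mean(episode_rewards),
        "std_reward": statistics.pstdev(episode_rewards),
        "is_deterministic": deterministic,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now().isoformat(),
    }

# Hypothetical returns from 10 deterministic evaluation episodes
rewards = [252.1, 231.7, 268.4, 240.9, 255.3,
           219.8, 247.6, 260.2, 238.5, 244.0]
record = summarize_eval(rewards)
print(json.dumps(record, indent=2))
```

Note that `mean_reward` alone can be misleading for LunarLander-v2, where returns vary between episodes; reporting `std_reward` alongside it, as this file does, gives a sense of that spread.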