ppo-LunarLander-v2 / results.json
Maxiew · test commit (81422b7)
{
  "mean_reward": 44.818117058232126,
  "std_reward": 141.65438990503193,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-01-04T15:40:49.758913"
}
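As a rough sketch of how a results file with these fields could be produced: the mean and standard deviation are aggregates over the per-episode returns collected during evaluation. The field names below come from the file above; the aggregation itself (arithmetic mean plus population standard deviation over `n_eval_episodes` returns) is an assumption about how the numbers were computed, and the `summarize_eval` helper and the example reward values are hypothetical.

```python
import json
import statistics
from datetime import datetime, timezone

def summarize_eval(episode_rewards, deterministic=True):
    """Aggregate per-episode returns into the results.json schema shown above.

    Assumption: std_reward is the population standard deviation
    (the default behaviour of numpy's np.std), not the sample one.
    """
    return {
        "mean_reward": statistics.fmean(episode_rewards),
        "std_reward": statistics.pstdev(episode_rewards),
        "is_deterministic": deterministic,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical per-episode returns from 10 evaluation rollouts.
rewards = [120.0, -50.0, 64.0, 210.5, -310.0, 88.2, 15.0, 142.7, -95.3, 263.1]
print(json.dumps(summarize_eval(rewards)))
```

The large `std_reward` relative to `mean_reward` in the file above simply reflects high variance across the 10 evaluation episodes, which is common for LunarLander-v2 agents that occasionally crash.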