ppo-LunarLander-v2 / results.json
First commit for unit1 (LunarLander with PPO)
f0be1c6
{
  "mean_reward": 246.62468388840793,
  "std_reward": 20.824449319638195,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-01-20T20:54:14.039770"
}
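The fields above are evaluation statistics over 10 deterministic episodes. A minimal sketch of how such a summary can be built from per-episode returns (hypothetical `summarize_eval` helper and example rewards; the actual file was produced by the course's evaluation script, e.g. stable-baselines3's `evaluate_policy`):

```python
import statistics
from datetime import datetime, timezone

def summarize_eval(episode_rewards):
    """Build a results dict shaped like results.json from per-episode returns."""
    return {
        "mean_reward": statistics.mean(episode_rewards),
        # population std (ddof=0), matching numpy's default np.std
        "std_reward": statistics.pstdev(episode_rewards),
        "is_deterministic": True,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical per-episode returns for illustration only.
rewards = [250.0, 240.0, 260.0]
summary = summarize_eval(rewards)
print(summary["mean_reward"])  # 250.0
```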