ppo-LunarLander-v2 / results.json
First ppo agent upload (commit 04e6894)
{
  "mean_reward": 279.8227473254311,
  "std_reward": 13.287478258147042,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-12-16T14:16:59.918516"
}
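These metrics can be parsed and sanity-checked programmatically. A minimal sketch, assuming the commonly cited 200-point "solved" threshold for LunarLander-v2 (the threshold is a convention, not something stated in this file):

```python
import json

# Contents of results.json as shown above
results_json = (
    '{"mean_reward": 279.8227473254311, "std_reward": 13.287478258147042, '
    '"is_deterministic": true, "n_eval_episodes": 10, '
    '"eval_datetime": "2022-12-16T14:16:59.918516"}'
)

results = json.loads(results_json)

# LunarLander-v2 is conventionally considered solved at a mean reward of 200+
mean, std = results["mean_reward"], results["std_reward"]
print(f"mean_reward = {mean:.2f} +/- {std:.2f} "
      f"over {results['n_eval_episodes']} episodes")
print("above solved threshold even at mean - std:", mean - std > 200)
```

Even the pessimistic `mean - std` estimate (about 266.5) clears the 200-point threshold here.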