a2c-PandaReachDense-v2 / results.json
dmenini · Initial commit (d0ba5e7)
{
  "mean_reward": -0.9586081984918564,
  "std_reward": 0.5167007926031151,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-04-02T11:27:33.675246"
}
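A minimal sketch of how a `results.json` with this schema can be produced. This assumes the convention used by stable-baselines3-style evaluation (mean and population standard deviation of per-episode returns over `n_eval_episodes`); the actual file was presumably written by an evaluation script around SB3's `evaluate_policy`, and the `make_results` helper and the sample rewards below are hypothetical.

```python
import json
from datetime import datetime
from statistics import mean, pstdev

def make_results(episode_rewards, deterministic=True):
    # Aggregate per-episode returns into the results.json schema shown above.
    # Assumption: mean / population std over episodes, as SB3's
    # evaluate_policy reports (np.mean / np.std).
    return {
        "mean_reward": mean(episode_rewards),
        "std_reward": pstdev(episode_rewards),
        "is_deterministic": deterministic,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now().isoformat(),
    }

# Hypothetical per-episode returns from 10 evaluation rollouts.
rewards = [-1.2, -0.8, -0.5, -1.0, -0.9, -1.1, -0.7, -1.3, -0.6, -1.0]
print(json.dumps(make_results(rewards), indent=2))
```

In the dense-reward PandaReach task, returns are negative distances accumulated per step, so a mean reward near zero (here about −0.96) indicates the policy reaches the goal quickly.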