PPO / results.json
Uploaded by Phani0404 via huggingface_hub (commit f6023a0, verified)
{
  "env_id": "LunarLander-v2",
  "mean_reward": 79.36196285043475,
  "std_reward": 96.13923650939194,
  "n_evaluation_episodes": 10,
  "eval_datetime": "2023-02-19T16:21:03.451435"
}
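A file like this is typically produced by rolling out the trained PPO policy for `n_evaluation_episodes` episodes and recording the mean and (population) standard deviation of the episode returns. The sketch below shows how such a record could be assembled; the per-episode rewards are illustrative placeholders, not the actual episodes behind this file.

```python
import json
import statistics
from datetime import datetime, timezone

# Hypothetical returns from 10 evaluation episodes (illustrative values only).
episode_rewards = [120.5, -30.2, 85.1, 200.3, 15.7, -60.4, 140.9, 95.2, 50.0, 175.8]

results = {
    "env_id": "LunarLander-v2",
    "mean_reward": statistics.fmean(episode_rewards),
    # Population standard deviation (ddof=0), matching numpy's default,
    # which evaluation utilities commonly use for std_reward.
    "std_reward": statistics.pstdev(episode_rewards),
    "n_evaluation_episodes": len(episode_rewards),
    "eval_datetime": datetime.now(timezone.utc).isoformat(),
}

# Serialize in the same flat-JSON shape as results.json.
print(json.dumps(results))
```

The same fields map directly onto the record above: only the reward statistics and timestamp vary between evaluation runs.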