ppo-LunarLander-v2 / README.md

Commit History

Uploading my first RL agent, trained with PPO on the LunarLander-v2 environment
fa91bc1
verified

Sam017 committed on