ppo-SnowballTarget / configuration.yaml

Commit History

First training of PPO agent on SnowballTarget environment.
90e9ddb

atorre committed on