ppo-Huggy / config.json

Commit History

First PPO training on Huggy env
2df507e
verified

Xharos committed on