ppo-ReachCube-v0 / config.yml
qgallouedec — Initial commit (04924ec)
!!python/object/apply:collections.OrderedDict
- - - n_envs
- 16
- - n_timesteps
- 1000000.0
- - policy
- MultiInputPolicy
- - use_sde
- true
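The `!!python/object/apply:collections.OrderedDict` tag above means the YAML loader calls `collections.OrderedDict(...)` on the nested list of `[key, value]` pairs, so the file deserializes to an ordered mapping of PPO hyperparameters (16 parallel environments, 1e6 training timesteps, a `MultiInputPolicy` for dict observations, and state-dependent exploration enabled). A minimal sketch of that reconstruction in plain Python, with the pairs copied from the file (loading the YAML itself would require PyYAML's `yaml.unsafe_load`, since safe loading rejects `python/object/apply` tags):

```python
from collections import OrderedDict

# [key, value] pairs exactly as they appear in config.yml.
# The YAML tag !!python/object/apply:collections.OrderedDict
# instructs the loader to pass this list to OrderedDict(...).
pairs = [
    ["n_envs", 16],             # number of parallel environments
    ["n_timesteps", 1000000.0], # total training timesteps (stored as a float)
    ["policy", "MultiInputPolicy"],  # policy class for dict observation spaces
    ["use_sde", True],          # generalized State-Dependent Exploration
]

config = OrderedDict(pairs)
print(config)
```

In an RL-Baselines3-Zoo-style training script, these keys would typically be forwarded to the algorithm constructor, e.g. `PPO(config["policy"], env, use_sde=config["use_sde"])`, with `n_envs` and `n_timesteps` consumed by the environment-vectorization and training loop instead.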