ppo-lunar-lander / config.json
{
"model_type": "PPO",
"state_dim": 8,
"action_dim": 4,
"actor_hidden_size": 256,
"critic_hidden_size": 256,
"environment": "LunarLander-v3",
"framework": "pytorch",
"architecture": "separate_actor_critic"
}
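A minimal sketch of how this config could map onto a separate actor-critic pair in PyTorch, matching `"architecture": "separate_actor_critic"`. The layer count and Tanh activations are assumptions (the config only fixes input, output, and hidden sizes); the actor outputs a categorical distribution over the 4 discrete LunarLander actions, and the critic outputs a scalar state value.

```python
import torch
import torch.nn as nn

# Values taken from config.json; hidden depth and activations are assumptions.
CONFIG = {
    "state_dim": 8,            # LunarLander-v3 observation size
    "action_dim": 4,           # discrete actions: noop, left, main, right engine
    "actor_hidden_size": 256,
    "critic_hidden_size": 256,
}


class Actor(nn.Module):
    """Policy network: state -> categorical distribution over actions."""

    def __init__(self, state_dim: int, action_dim: int, hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, action_dim),  # logits, one per action
        )

    def forward(self, state: torch.Tensor) -> torch.distributions.Categorical:
        return torch.distributions.Categorical(logits=self.net(state))


class Critic(nn.Module):
    """Value network: state -> scalar value estimate (separate weights)."""

    def __init__(self, state_dim: int, hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),  # V(s)
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)


actor = Actor(CONFIG["state_dim"], CONFIG["action_dim"], CONFIG["actor_hidden_size"])
critic = Critic(CONFIG["state_dim"], CONFIG["critic_hidden_size"])
```

Because the two networks share no parameters, PPO's policy and value losses can be optimized independently (or summed under one optimizer), unlike a shared-trunk design where the value loss must be weighted against the policy loss.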