{
  "algorithm": "PPO",
  "total_timesteps": 150000,
  "learning_rate": 0.0003,
  "gamma": 0.99,
  "training_candles": 100000
}
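
For context, a minimal sketch of how a training script might load and sanity-check this PPO config before use. The file name `ppo_config.json` and the `load_config` helper are assumptions for illustration, not part of the source:

```python
import json

# Keys the PPO training config is expected to provide, with expected types.
REQUIRED_KEYS = {
    "algorithm": str,
    "total_timesteps": int,
    "learning_rate": float,
    "gamma": float,
    "training_candles": int,
}

def load_config(text):
    """Parse the JSON config and verify every required key is present
    with the expected type; raise on any mismatch."""
    cfg = json.loads(text)
    for key, expected in REQUIRED_KEYS.items():
        if key not in cfg:
            raise KeyError(f"missing config key: {key}")
        if not isinstance(cfg[key], expected):
            raise TypeError(f"{key} should be {expected.__name__}")
    return cfg

# Inlined here for a self-contained example; in practice this would be
# read from a file such as ppo_config.json (hypothetical name).
config_text = """{
  "algorithm": "PPO",
  "total_timesteps": 150000,
  "learning_rate": 0.0003,
  "gamma": 0.99,
  "training_candles": 100000
}"""

cfg = load_config(config_text)
```

The validated dictionary can then be unpacked into whatever RL library the project uses (e.g. passing `learning_rate` and `gamma` to the agent constructor and `total_timesteps` to its training loop).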