# ppo-BipedalWalker-v3/config.yml
!!python/object/apply:collections.OrderedDict
- - - batch_size
    - 64
  - - clip_range
    - 0.18
  - - ent_coef
    - 0.0
  - - gae_lambda
    - 0.95
  - - gamma
    - 0.999
  - - learning_rate
    - 0.0003
  - - n_envs
    - 32
  - - n_epochs
    - 10
  - - n_steps
    - 2048
  - - n_timesteps
    - 5000000.0
  - - normalize
    - true
  - - policy
    - MlpPolicy
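
The `!!python/object/apply:collections.OrderedDict` tag means PyYAML rebuilds this mapping as a Python OrderedDict, so the file must be read with an unsafe/full loader rather than `yaml.safe_load`. Below is a minimal sketch, assuming stable-baselines3 and gymnasium with the Box2D extras are installed, of how these hyperparameters would map onto a PPO run; the wiring is illustrative, not the exact rl-baselines3-zoo training pipeline (in particular, `normalize: true` is approximated here with a default `VecNormalize` wrapper).

from stable_baselines3 import PPO
from stable_baselines3.common.env_util import make_vec_env
from stable_baselines3.common.vec_env import VecNormalize

# n_envs: 32 -- collect rollouts from 32 parallel environments
vec_env = make_vec_env("BipedalWalker-v3", n_envs=32)
# normalize: true -- approximated with VecNormalize defaults (obs + reward)
vec_env = VecNormalize(vec_env)

model = PPO(
    "MlpPolicy",          # policy
    vec_env,
    batch_size=64,        # batch_size
    clip_range=0.18,      # clip_range
    ent_coef=0.0,         # ent_coef
    gae_lambda=0.95,      # gae_lambda
    gamma=0.999,          # gamma
    learning_rate=0.0003, # learning_rate
    n_epochs=10,          # n_epochs
    n_steps=2048,         # n_steps collected per environment per update
    verbose=1,
)
model.learn(total_timesteps=5_000_000)  # n_timesteps

Note that `n_steps=2048` with 32 environments yields 65,536 transitions per update, sliced into minibatches of 64 for 10 optimization epochs.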