ppo-Pendulum-v1 / config.yml
araffin · Initial Commit · 7e1fcbf (verified)
!!python/object/apply:collections.OrderedDict
- - - clip_range
    - 0.2
  - - ent_coef
    - 0.0
  - - gae_lambda
    - 0.95
  - - gamma
    - 0.9
  - - learning_rate
    - 0.001
  - - n_envs
    - 4
  - - n_epochs
    - 10
  - - n_steps
    - 1024
  - - n_timesteps
    - 100000.0
  - - policy
    - MlpPolicy
  - - sde_sample_freq
    - 4
  - - use_sde
    - true
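The file is a PyYAML dump of an `OrderedDict`. A minimal sketch of how these values split between the trainer and the algorithm, assuming the rl-baselines3-zoo convention that `n_envs`, `n_timesteps`, and `policy` are consumed by the training script while the remaining keys are passed as keyword arguments to the `PPO` constructor (the key split here is an assumption, not taken from the file itself):

```python
from collections import OrderedDict

# Hyperparameters transcribed from config.yml above.
hyperparams = OrderedDict([
    ("clip_range", 0.2),
    ("ent_coef", 0.0),
    ("gae_lambda", 0.95),
    ("gamma", 0.9),
    ("learning_rate", 0.001),
    ("n_envs", 4),
    ("n_epochs", 10),
    ("n_steps", 1024),
    ("n_timesteps", 100000.0),
    ("policy", "MlpPolicy"),
    ("sde_sample_freq", 4),
    ("use_sde", True),
])

# Keys handled by the training script rather than the PPO constructor
# (assumed split, following rl-baselines3-zoo conventions).
script_keys = {"n_envs", "n_timesteps", "policy"}
ppo_kwargs = {k: v for k, v in hyperparams.items() if k not in script_keys}

print(ppo_kwargs)
```

With Stable-Baselines3 installed, these kwargs would then be used roughly as `PPO(hyperparams["policy"], env, **ppo_kwargs)`, with the environment vectorized over `n_envs` copies and trained for `n_timesteps` steps.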