# ppo-MountainCar-v0

## Commit History

- `47c9a67` Upload PPO MountainCar agent trained for 10M steps with default hyperparameters (agercas committed on)