ppo-MountainCar-v0 / saved_models / ppo-MountainCar-v0-S6-best

Commit History

PPO playing MountainCar-v0 from https://github.com/sgoodfriend/rl-algo-impls/tree/5598ebc4b03054f16eebe76792486ba7bcacfc5c
68e589c · sgoodfriend committed on
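
For reference, a minimal sketch of fetching this checkpoint locally with the huggingface_hub client, pinned to the commit listed above. The repo id sgoodfriend/ppo-MountainCar-v0 is an assumption inferred from the page path, not confirmed by the source.

```python
# Sketch: download the saved model directory from the Hugging Face Hub.
# Assumptions: repo id "sgoodfriend/ppo-MountainCar-v0" (inferred from the
# page path above) and the huggingface_hub package installed.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="sgoodfriend/ppo-MountainCar-v0",  # assumed repo id
    revision="68e589c",                        # commit hash from the history above
    allow_patterns=["saved_models/ppo-MountainCar-v0-S6-best/*"],
)
print(local_path)  # root directory containing the downloaded files
```

The `revision` pin makes the download reproducible against this exact commit rather than whatever `main` later points to.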