ppo-MountainCar-v0 / selfplay_enjoy.py

Commit History

PPO agent playing MountainCar-v0, from https://github.com/sgoodfriend/rl-algo-impls/tree/983cb75e43e51cf4ef57f177194ab9a4a1a8808b
3cc5c1d

sgoodfriend committed on