new-PPO-LunarLander-v2 / config.json

Commit History

PPO trained on 500,000 steps.
e2eaf0e

EvanMath committed on
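
The commit above records a PPO agent trained for 500,000 timesteps, presumably on the LunarLander-v2 environment named in the repo. A minimal sketch of how such a run could be reproduced is below; the repo does not state which library was used, so stable-baselines3 and Gymnasium are assumptions, and the real hyperparameters would live in config.json.

```python
# Minimal sketch (assumed setup): PPO on LunarLander-v2 for 500,000 steps.
# Assumes stable-baselines3 and gymnasium[box2d] are installed; the actual
# training library and hyperparameters for this repo are not documented here.
import gymnasium as gym
from stable_baselines3 import PPO

env = gym.make("LunarLander-v2")

# Default MlpPolicy with default hyperparameters (illustrative only).
model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=500_000)

# Save the trained policy locally; pushing it to the Hub would yield
# a repo similar to this one.
model.save("new-PPO-LunarLander-v2")
```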