ppo-LunarLander-v2 / README.md

Commit History

f0d6908 · First PPO model on LunarLander-v2 env · committed by santiviquez
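
For context, a minimal training sketch of the kind of model this commit refers to, assuming it was trained with stable-baselines3's PPO on the Gymnasium `LunarLander-v2` environment (with `box2d` installed); the hyperparameters, timestep count, and save path below are illustrative assumptions, not taken from the actual commit:

```python
# Assumed setup: gymnasium + box2d-py + stable-baselines3 installed.
import gymnasium as gym
from stable_baselines3 import PPO

# Create the LunarLander-v2 environment.
env = gym.make("LunarLander-v2")

# Train a PPO agent with a simple MLP policy.
# (Hyperparameters here are defaults / placeholders, not the repo's.)
model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=100_000)

# Save the trained model; the filename mirrors the repo name but is an assumption.
model.save("ppo-LunarLander-v2")
```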