ppo-LunarLander-v2 / README.md

Commit History

`1d892d1` — Upload of improved PPO model on LunarLander-v2 (committed by NielsV)

`8a1d0c5` — Upload of trained PPO policy on LunarLander-v2 (committed by NielsV)