LunarLander-v2-ppo / ppo-LunarLander-v2 / pytorch_variables.pth

Commit History

first try of PPO applied to LunarLander-v2
07ddc30

giobin committed on