rl_course / ppo-LunarLander-v2 / policy.optimizer.pth

Commit History

Adding my solution using PPO
fb5fc45

dvaleriani committed on

Adding my solution using PPO
e7c2916

dvaleriani committed on