ppo-LunarLander-v2-2m / ppo-LunarLander-2m-v1 / _stable_baselines3_version
Commit a935c69 by Thanis33: "First attempt with PPO 1.5M"
File contents: 1.6.2