ppo-LunarLander-v2-1m

Author: Thanis33
Commit: 9717323 ("First attempt with PPO 1M")
stable_baselines3 version: 1.6.2