ppo-LunarLander-v2

Commit History

Upload of PPO agent trained with stable_baselines3
534b932 (verified), committed by shazeghi
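
The checkpoint uploaded in this commit is a stable_baselines3 PPO agent for LunarLander-v2. A minimal usage sketch follows, assuming the model was published to the Hugging Face Hub under the repo id `shazeghi/ppo-LunarLander-v2` with a zip file named `ppo-LunarLander-v2.zip` (both names are assumptions inferred from the repo, not confirmed by this README):

```python
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

# Assumed repo id and filename; adjust to the actual artifact names in this repo.
checkpoint = load_from_hub(
    repo_id="shazeghi/ppo-LunarLander-v2",
    filename="ppo-LunarLander-v2.zip",
)

# Restore the trained PPO policy from the downloaded zip archive.
model = PPO.load(checkpoint)

# Evaluate the policy over a few episodes (requires gymnasium[box2d] for LunarLander).
eval_env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```

This sketch relies on `huggingface_sb3` for downloading the checkpoint and on stable_baselines3's `evaluate_policy` helper; if the agent was saved with a different stable_baselines3 version or a custom environment wrapper, loading may need the matching setup.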