ppo-LunarLander-v2 / config.json

Commit History

Upload learned LunarLander-v2 model trained with PPO
b5143cd
verified

xXrobroXx committed on
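
For context, the commit above adds the configuration for a PPO agent trained on LunarLander-v2. Below is a minimal sketch of how such a model is typically produced, assuming stable-baselines3 and gymnasium[box2d] (neither is confirmed by this repo), with placeholder hyperparameters rather than the values recorded in config.json:

```python
# Sketch only: assumes stable-baselines3 and gymnasium[box2d] are installed.
# Hyperparameters are placeholders, not the ones stored in this repo's config.json.
from stable_baselines3 import PPO

# SB3 accepts an environment id string and builds the env internally.
model = PPO("MlpPolicy", "LunarLander-v2", verbose=1)

# Train the agent; the timestep budget here is illustrative.
model.learn(total_timesteps=1_000_000)

# Save the trained model as a .zip archive; an archive like this is what
# gets pushed to the Hub alongside config.json (e.g. via huggingface_sb3).
model.save("ppo-LunarLander-v2")
```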