PPO-LunarLander-v2 / README.md

Commit History

Push agent to the Hub
d060e1d

stinoco committed on

Adding PPO model for solving LunarLander-v2
64c188b

stinoco committed on