ppo-LunarLander-v2-2m / results.json
First attempt with PPO 1.5M
a935c69
{
  "mean_reward": 273.71884860392777,
  "std_reward": 19.84927247134027,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-12-12T12:32:12.981502"
}
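For context, `mean_reward` and `std_reward` in files like this are the mean and standard deviation of the total return over `n_eval_episodes` evaluation episodes. A minimal stdlib-only sketch of that aggregation (the per-episode returns below are made-up illustrative numbers, not the actual evaluation data behind this file):

```python
import json
import statistics
from datetime import datetime, timezone

# Hypothetical per-episode returns for n_eval_episodes = 10 (illustrative only).
episode_rewards = [275.0, 260.1, 301.4, 249.9, 288.2,
                   270.6, 295.3, 255.8, 281.0, 264.7]

results = {
    "mean_reward": statistics.mean(episode_rewards),
    # Population standard deviation (ddof=0); note some evaluation tools use
    # this convention while others use the sample standard deviation.
    "std_reward": statistics.pstdev(episode_rewards),
    "is_deterministic": True,
    "n_eval_episodes": len(episode_rewards),
    "eval_datetime": datetime.now(timezone.utc).isoformat(),
}
print(json.dumps(results, indent=2))
```

A mean of roughly 274 with a standard deviation near 20 would, like the real figures above, comfortably clear the LunarLander-v2 "solved" threshold of 200 average reward.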