ppo-LunarLander-v2 / results.json
Second LunarLander-v2 PPO model: mean_reward=285.66 +/- 22.56 (commit 4e348eb)
{"mean_reward": 284.3679751531863, "std_reward": 21.798279846577735, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-12-06T18:03:16.324154"}